
Yudkowsky contra Christiano on AI takeoff speeds

Scott Alexander

Astral Codex Ten, April 4, 2022

Abstract

The article discusses the likelihood of a sudden and catastrophic AI “takeoff” compared to a gradual, more predictable evolution of AI. It presents a debate between Eliezer Yudkowsky, who advocates for a fast takeoff, and Paul Christiano, who argues for a slower trajectory. Yudkowsky emphasizes the potential for discontinuous leaps in AI capability, drawing analogies to past technological breakthroughs like the nuclear bomb. Christiano focuses on historical patterns of gradual progress in technology and economics, suggesting that AI advancement will follow similar trends. The debaters examine the role of recursive self-improvement, government regulation, and the limitations of analogy in predicting future AI development. They also consider the concept of intelligence as a scalar quantity and discuss potential bottlenecks to AI progress beyond mere computational power. – AI-generated abstract.