
Biological anchors: a trick that might or might not work

Scott Alexander

Astral Codex Ten, February 23, 2022

Abstract

The article explores the biological anchors method, a way of estimating the arrival of artificial general intelligence (AGI) by looking at how much computational power would be needed to replicate human-level intelligence. It contrasts this method with the more skeptical view of Eliezer Yudkowsky, who argues that relying on biological anchors is flawed because it assumes AGI will be created through a process analogous to evolution, which is not necessarily true. Yudkowsky holds that paradigm shifts in AI research are more likely to lead to AGI, and that such shifts are not predictable. The article discusses the debate between proponents of both approaches and ultimately argues that while the biological anchors method is flawed, it can serve as a useful bound on the time until the arrival of AGI. – AI-generated abstract.
