
Refining the Evolutionary Analogy to AI

berglund

LessWrong, August 7, 2020

Abstract

The argument that AI development will mirror the sharp discontinuity in human evolutionary capabilities relies on the premise that a marginal increase in intelligence can trigger an exponential leap in real-world effectiveness. However, evidence from evolutionary anthropology suggests that the human transition from the savanna to advanced civilization was driven by a shift in the mechanism of improvement—moving from biological evolution to cultural evolution—rather than a mere increase in individual cognitive capacity. This transition enabled the high-fidelity transmission of cumulative knowledge across generations, which fundamentally accelerated the rate of progress. For the evolutionary analogy to hold for artificial intelligence, a similar shift in the mechanism of capability acquisition must occur. Potential candidates for such a shift include recursive self-improvement, where an agent iteratively optimizes its own source code, or the emergence of high-speed social and technical coordination between AI systems. If the projected discontinuity depends on these specific mechanistic transitions rather than an inherent threshold of general intelligence, the predictive power of the evolutionary analogy is diminished. – AI-generated abstract.
