quotes
Wei Dai – Some thoughts on singularity strategies

My own feeling is that the chance of successfully building FAI, assuming the current human intelligence distribution, is low (even given unlimited financial resources), while the risk of unintentionally building or contributing to UFAI is high. I think I can explicate part of my intuition this way: there must be a minimum level of intelligence below which the chance of successfully building an FAI is negligible. We humans seem at best just barely smart enough to build a superintelligent UFAI. Wouldn't it be surprising if the intelligence thresholds for building UFAI and FAI turned out to be the same?

Wei Dai, Some thoughts on singularity strategies, LessWrong, July 12, 2011