Announcing the Founders Pledge Global Catastrophic Risks Fund
Effective Altruism Forum, October 26, 2022
Abstract
The GCR Fund will build on Founders Pledge's recent research into great power conflict and risks from frontier military and civilian technologies, with a special focus on international stability, a pathway that we believe shapes a number of the biggest risks facing humanity. The Fund will work on:

- War between great powers, such as a U.S.-China clash over Taiwan or a U.S.-Russia war;
- Nuclear war, especially emerging threats to nuclear stability, such as vulnerabilities of nuclear command, control, and communications;
- Risks from artificial intelligence (AI), including risks both from machine learning applications (like autonomous weapon systems) and from transformative AI;
- Catastrophic biological risks, such as naturally arising pandemics, engineered pathogens, laboratory accidents, and the misuse of new advances in synthetic biology; and
- Emerging threats from new technologies and in new domains.

Moreover, the Fund will support field-building activities around the study and mitigation of global catastrophic risks, as well as methodological interventions, including new ways of studying these risks, such as probabilistic forecasting and experimental wargaming. The focus on international security reflects our current specialty, and we expect the Fund's areas of expertise to expand as we build capacity.
