Cause prioritization for downside-focused value systems
Effective Altruism Forum, January 31, 2018
Abstract
Cause prioritization for downside-focused value systems presents distinct challenges because such systems place primary weight on reducing disvalue. This piece analyzes the implications of these systems for the long-term future, distinguishing downside-focused from upside-focused views by how they assess the likelihood and severity of catastrophic outcomes. It examines various extinction and existential risks, including biorisk, nuclear war, and superintelligence misalignment, considering each as a potential cause of, or remedy for, s-risks (scenarios that bring about vast amounts of disvalue). The piece argues that while reducing s-risks is robustly valuable from downside-focused views, working to prevent extinction is not a promising intervention for them, particularly where it increases the likelihood of space colonization and transformative technologies that could themselves generate s-risks. Instead, it recommends interventions that directly target s-risk prevention, such as efforts to reduce AI alignment failure modes. Cooperation and attention to moral uncertainty are also emphasized as important considerations for effective cause prioritization. – AI-generated abstract.
