Introducing the Fund for Alignment Research (We're hiring!)
Effective Altruism Forum, July 6, 2022
Abstract
The Fund for Alignment Research (FAR) is hiring research engineers and communication specialists to work closely with AI safety researchers. We believe these roles are high-impact, contributing to some of the most interesting research agendas in AI safety. We also think they offer an excellent opportunity to build skills and connections through mentorship and close collaboration with researchers at a variety of labs.