n. A danger that could lead to human extinction.
EA Global was dominated by talk of existential risks, or X-risks. The idea is that human extinction is far, far worse than anything that could happen to real, living humans today.
"Once I identified that this is the most important thing I could do with my life, I started consciously looking for opportunities to push this X-risk ecosystem forward," says Tallinn. He has helped start the Centre for the Study of Existential Risk and the Future of Life Institute, and contributes about $500,000 each year to research.
—Angela Chen, “Is Artificial Intelligence a Threat?,” The Chronicle of Higher Education, September 11, 2014
Suppose you think that reducing the risk of human extinction is the highest-value thing you can do. Or maybe you want to reduce "x-risk" because you're already a comfortable First-Worlder like me and so you might as well do something epic and cool, or because you like the community of people who are doing it already, or whatever.
—lukeprog, “How can I reduce existential risk from AI?,” LessWrong, November 13, 2012
2007 (earliest)
—“Bostrom talk on X-Risk and AI transcribed,” Institute for Ethics and Emerging Technologies, September 08, 2007