For longtermists, there is nothing worse than succumbing to an existential risk: that would be the ultimate tragedy, since it would keep us from plundering our "cosmic endowment" (resources like stars, planets, asteroids and energy), which many longtermists see as integral to fulfilling our "longterm potential" in the universe.
What sorts of catastrophes would instantiate an existential risk? The obvious ones are nuclear war, global pandemics and runaway climate change. But Bostrom also takes seriously the idea that we already live in a giant computer simulation that could get shut down at any moment (yet another idea that Musk seems to have gotten from Bostrom).
Bostrom further lists "dysgenic pressures" as an existential risk, whereby less "intellectually talented" people (those with "lower IQs") outbreed people with superior intellects.