Typology of Human Extinction Risks
Possible timeline: 100,000 BC, 20th century, 2030, 2050, 3000
Main adverse factor of the catastrophe
Natural risks
Natural risks activated by human activity
Based on known technologies
Based on new technologies
Superintelligence as final technology
Remote and hypothetical risks
Explosions (energy)
Nuclear weapons
Nuclear war
Artificial explosion of supervolcanoes
Nuclear winter
Artificial nuclear winter triggered by nuclear explosions in forests
Artificial runaway global warming from nuclear explosions on the Arctic shelf and methane release
War with AI
Different types of AI friendliness
Several AIs fight each other for world dominance
Nuclear war against AI or against attempts to create it
Military drones rebel
Unfriendly AI destroys humanity
Intelligence
Disjunction
Impairment of intelligence
As a risk to this AI (humanity seen as a threat)
To get resources for its goals (paperclip maximizer)
Realizes incorrectly formulated friendliness (smile maximizer)
Fatal error in a late-stage AI (AI halting)
Synthetic biology
Organisms
Replication
Pandemic
Superpest
Dangerous predator
Dangerous carrier (mosquitoes)
Atmosphere poisoning
Combined scenarios
Probability
Biological weapons
Overpopulation followed by collapse
Genetic degradation
Ecology
The accumulation of new toxins in the biosphere and its collapse
Resource depletion
Chain of natural disasters
Epidemic followed by degradation and extinction
Degradation of the biosphere's ability to regenerate, then loss of technology and starvation
Death of crops from a superpest, then hunger and war
Global contamination
Deliberate chemical contamination (dioxin build-up and release)
Deliberate destruction of all nuclear reactors using nuclear weapons
Autocatalytic reaction like Ice-9 or artificial prions
War as a trigger
Nuclear war leads to the use or release of biological weapons
World war leads to an arms race and the creation of Doomsday Machines
System crisis caused by many factors
The roadmap is based on the book Risks of Human Extinction in the 21st Century (Alexey Turchin and Michael Anissimov).
The most important risks are in bold; the most hypothetical are in italics. This roadmap is accompanied by another one about ways of preventing existential risks.
Alexey Turchin, 2015, GNU-like license. Free copying and design editing; discuss major updates with Alexey Turchin.
The last updated version is available at http://immortality-roadmap.com/
Proofreading and advice: Michael Anissimov
0.1-1% a year
Nanotech
10-30% total
Extraterrestrial robots
ET nanorobot replicators
ET berserker robots, killing civilizations after a certain threshold
Whimpers (values contamination)
The value system of posthumanity moves away from ours
Evil AI whose goal is to maximize the suffering of as many humans as possible (worse than "I Have No Mouth, and I Must Scream")
50% total
Complexity crisis
Unpredictability, chaos, and black swans lead to catastrophe
Small