This document contains a lecture by Professor Nick Bostrom on existential risks and the future of humanity. It discusses potential catastrophic risks facing humanity, both natural and anthropogenic. It also examines possibilities for revolutionary technologies like artificial intelligence and nanotechnology, as well as scenarios for recurrent collapse or stagnation. The document places human history and technological development in a very long-term context, and speculates on what a "posthuman" civilization may look like if humanity survives existential risks.
Professor Bostrom on Existential Risks and the Future of Humanity
1. Professor Nick Bostrom, Faculty of Philosophy; Director, Future of Humanity Institute, James Martin 21st Century School, Oxford University. "The Big Picture"
4. The future of humanity? [Chart: technological development vs. time, marking 2007 and the pre-human, human, and posthuman conditions]
5. Extinction [Chart: technological development vs. time, with a trajectory that terminates before reaching the posthuman condition]
6. Past catastrophes killing >10 million

   Deaths (millions)   Century    Catastrophe
   15                  20C        First World War
   17                  19C        British India (mainly famine)
   17                  14C-15C    Timur Lenk
   18                  15C-19C    Atlantic Slave Trade
   19                  7C-19C     Mideast Slave Trade
   20                  20C        Stalin (famines and purges)
   20                  15C-19C    Decimation of the American Indians
   20                  19C        Taiping Rebellion (1851-1864)
   24                  20C        Chinese Famine of 1907
   25                  17C        Fall of the Ming Dynasty
   27                  20C-21C    HIV/AIDS
   36                  8C         An Shi Rebellion (756-763)
   40                  13C        Mongol conquests
   40                  20C        Spanish flu pandemic (1918-1919)
   40                  20C        Great Leap Forward in China (famine)
   55                  20C        Second World War
   75                  14C        Black Death (1347-1350)
   100                 6C         Plague of Justinian
   400                 20C        Smallpox
7. Scope vs. intensity of risks

   Scope \ Intensity    Imperceptible                       Endurable                        Terminal
   Personal             Loss of one hair                    Car is stolen                    Fatal car crash
   Local                Congestion from one extra vehicle   Recession in a country           Genocide
   Global               0.01 degree global warming          Destruction of the ozone layer   Ageing?
   Trans-generational   Loss of one species of beetle       Drastic loss of biodiversity     Human extinction

   The scope axis may extend further to (Cosmic?) and the intensity axis to (Hellish?). Global Catastrophic Risks occupy the region of global-or-wider scope and endurable-or-worse intensity; Existential Risks are those that are both trans-generational in scope and terminal in intensity.
10. Some opinions on net existential risk

   Source                Claim                                                                                                                 Probability
   Richard Posner        Human extinction this century                                                                                         "significant"
   Richard Gott          Probability of humanity extinct <5,100 yrs                                                                            0.25%
   Early Bostrom (2002)  Cumulative existential risk (no time limit)                                                                           >= 25%
   Martin Rees           End of civilization by 2100 (note: this need not entail extinction or existential catastrophe)                        50%
   John Leslie (1996)    Human extinction by 2496 (based partly on the doomsday argument and Leslie's view of how quantum indeterminacy affects this argument)   30%
   Stern Report          Extinction risk per year                                                                                              0.1%
   GCR conference poll   Median answer to "Overall risk of human extinction prior to 2100"                                                     19%
12. Human extinction risks?
   - Other
   - Human infertility
   - Space radiation
   - Other climate change
   - Back-contamination
   - Kinetic impact
   - Non-anthropogenic vacuum decay
   - Extraterrestrial intelligence
   - Physics experiment
   - Supervolcanic eruption
   - Self-destroying superintelligent AI
   - Nonspecific conflict
   - Emissions-caused global warming
   - Natural pandemic
   - Non-weapons nanotech accident
   - Genetic engineering / synthetic biology
   - Simulation shutdown
   - Nuclear holocaust
   - Nanotech weapons systems
13. Stagnation or Plateau [Chart: technological development vs. time, with the trajectory leveling off in the human condition rather than reaching the posthuman condition]
19. Recurrent collapse [Chart: technological development vs. time, with the trajectory repeatedly rising and collapsing within the human condition]
20. The longer term [Chart: technological development vs. time on a much longer timescale, from the pre-human condition to the posthuman condition]