 FHI TECHNICAL REPORT 

Global Catastrophic Risks Survey
Anders Sandberg
Nick Bostrom
Technical Report #2008-1

Cite as: Sandberg, A. & Bostrom, N. (2008): “Global Catastrophic Risks Survey”, Technical Report #2008-1, Future of Humanity Institute, Oxford University: pp. 1-5.

The views expressed herein are those of the author(s) and do not necessarily reflect the views of the Future of Humanity Institute.

GLOBAL CATASTROPHIC RISKS SURVEY (2008)
Technical Report 2008/1
Published by Future of Humanity Institute, Oxford University
Anders Sandberg and Nick Bostrom

At the Global Catastrophic Risk Conference in Oxford (17-20 July, 2008) an informal survey was circulated among participants, asking them to make their best guess at the chance that there will be disasters of different types before 2100. This report summarizes the main results.

The median extinction risk estimates were:

| Risk | At least 1 million dead | At least 1 billion dead | Human extinction |
|---|---|---|---|
| Number killed by molecular nanotech weapons | 25% | 10% | 5% |
| Total killed by superintelligent AI | 10% | 5% | 5% |
| Total killed in all wars (including civil wars) | 98% | 30% | 4% |
| Number killed in the single biggest engineered pandemic | 30% | 10% | 2% |
| Total killed in all nuclear wars | 30% | 10% | 1% |
| Number killed in the single biggest nanotech accident | 5% | 1% | 0.5% |
| Number killed in the single biggest natural pandemic | 60% | 5% | 0.05% |
| Total killed in all acts of nuclear terrorism | 15% | 1% | 0.03% |
| Overall risk of extinction prior to 2100 | n/a | n/a | 19% |

These results should be taken with a grain of salt. Non-responses have been omitted, although some might represent a statement of zero probability rather than no opinion.
There are likely to be many cognitive biases that affect the results, such as unpacking bias and the availability heuristic, as well as old-fashioned optimism and pessimism.

In Appendix A the results are plotted with the individual response distributions visible.
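Since non-responses were omitted rather than treated as zeros, the reported medians are medians over the answered estimates only. The following minimal sketch (our illustration, not part of the original analysis; the numbers are hypothetical) shows that convention:

```python
# Minimal sketch (illustrative only, not the survey's analysis code):
# compute a median estimate while omitting non-responses, rather than
# treating a missing answer as a 0% probability.
import statistics

# Hypothetical responses (in %) for one risk/severity cell; None = no response.
responses = [5, 10, None, 25, 50, None, 30]

answered = [r for r in responses if r is not None]
print(statistics.median(answered))  # -> 25
```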

Other Risks

The list of risks was not intended to be inclusive of all the biggest risks. Respondents were invited to contribute their own global catastrophic risks, listing risks they considered significant. Several suggested totalitarian world government, climate-induced disasters, ecological/resource crunches and "other risks": unspecified or unknowable threats. Other suggestions were asteroid/comet impacts, bad crisis management, high-tech asymmetric war attacking brittle IT-based societies, back-contamination from space probes, electromagnetic pulses, genocides/democides, risks from physics research and degradation of quality assurance.

Suggestions

Respondents were also asked to suggest what they would recommend to policymakers. Several argued for nuclear disarmament, or at least lowering the number of weapons below the threshold for existential catastrophe, as well as reducing stocks of highly enriched uranium and making nuclear arsenals harder to launch accidentally.

One option discussed was the formation of global biotech-related governance, legislation and enforcement, or even a global body like the IPCC or UNFCCC to study and act on catastrophic risk. At the very least there was much interest in developing defenses against misuses of biotechnology, and a recognition of the need for unbiased early-detection systems for a variety of risks, be they near-Earth objects or actors with WMD capabilities.

Views on emerging technologies such as nanotech, AI, and cognition enhancement were mixed: some proposed avoiding funding them; others proposed deliberate crash programs to ensure they would be in the right hands, the risks understood, and the technologies able to be used against other catastrophic risks.

Other suggestions included raising awareness of the problem, more research on cyber security issues, building societal resiliency in depth, preparing for categories of disasters rather than individual types, building refuges, and changing energy consumption patterns.

Appendix A

Below are the individual results, shown as grey dots (jittered for distinguishability), with the median shown as a bar.
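As a rough illustration of the presentation just described (grey jittered dots with a median bar), the following sketch uses hypothetical data and assumed tooling (NumPy and matplotlib); it is not the authors' original plotting code:

```python
# A minimal sketch (assumed, not the authors' actual plotting code) of an
# Appendix A-style panel: individual estimates drawn as jittered grey dots,
# with the median shown as a horizontal bar. The data below are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
estimates = np.array([1, 5, 10, 10, 20, 30, 50, 60, 75, 98])  # illustrative responses, in %

jitter = rng.uniform(-0.1, 0.1, size=estimates.size)          # spread dots horizontally
plt.scatter(jitter, estimates, color="grey", alpha=0.7)

median = np.median(estimates)
plt.plot([-0.2, 0.2], [median, median], color="black", linewidth=3)  # median bar

plt.xlim(-0.5, 0.5)
plt.xticks([])
plt.ylabel("Estimated probability (%)")
plt.title("Illustrative panel (hypothetical data)")
plt.show()
```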

 

[Figure: Total killed in all acts of nuclear terrorism. Medians: >1 million dead 15%; >1 billion dead 1%; extinction 0.03%.]

[Figure: Total killed in all nuclear wars. Medians: >1 million dead 30%; >1 billion dead 10%; extinction 1%.]

[Figure: Number killed in the single biggest natural pandemic. Medians: >1 million dead 60%; >1 billion dead 5%; extinction 0.05%.]

[Figure: Number killed in the single biggest engineered pandemic. Medians: >1 million dead 30%; >1 billion dead 10%; extinction 2%.]

[Figure: Total killed by superintelligent AI. Medians: >1 million dead 10%; >1 billion dead 5%; extinction 5%.]

[Figure: Number killed in the single biggest nanotech accident. Medians: >1 million dead 5%; >1 billion dead 1%; extinction 0.5%.]

[Figure: Number killed by molecular nanotech weapons. Medians: >1 million dead 25%; >1 billion dead 10%; extinction 5%.]

[Figure: Total killed in all wars (including civil wars). Medians: >1 million dead 98%; >1 billion dead 30%; extinction 4%.]

[Figure: Total risk of extinction. Median: 19%.]
