Existential risk

In future studies, an existential risk is a risk that is both global (affects all of humanity) and terminal (destroys or irreversibly cripples the target). Nick Bostrom defines an existential risk as a risk "where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential" (Bostrom 2002). The term is frequently used in transhumanist and Singularitarian communities to describe disaster and doomsday scenarios caused by bioterrorism, non-Friendly superintelligence, misuse of molecular nanotechnology, or other sources of danger.

Among the grimmest warnings of existential risks from advanced technology are those of computer scientist Bill Joy, who envisages the possibility of global destruction as new technologies become increasingly powerful and uncontrollable, and astronomer Martin Rees, who has written about an extensive range of risks to human survival.

While transhumanism advocates the development of advanced technologies to enhance human physical and mental powers, transhumanist thinkers typically acknowledge that the same technologies could bring existential risks. Generally, transhumanism holds that the potential benefits are at least equal in scope and magnitude to the existential risks (or that development of the risky technologies could not be prevented in any case), and many transhumanists, including Bostrom, are actively engaged in considering how these risks might best be reduced or mitigated.

Joel Garreau's book "Radical Evolution" contains extensive discussion of possible existential risks (and possible radical benefits) from emerging technologies.

The articles "Risks to civilization, humans and planet Earth" and "Human extinction" list a number of potential existential risks.

Quote

: "Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors. The reactive approach – see what happens, limit damages, and learn from experience – is unworkable. Rather, we must take a proactive approach. This requires foresight to anticipate new types of threats and a willingness to take decisive preventive action and to bear the costs (moral and economic) of such actions."::: -- Nick Bostrom

See also

* Fermi paradox
* Safety of particle collisions at the Large Hadron Collider
* Organizations formed to prevent or mitigate existential risks
** Center for Responsible Nanotechnology - "for safe, efficient nanotechnology and Blue Goo"
** Singularity Institute for Artificial Intelligence - "for Friendly AI to help us avoid existential risks"
** Lifeboat Foundation - "for the creation of an extra-terrestrial habitat to house a reserve population until space colonisation begins"
** Foresight Institute - "for safe nanotechnology and a society prepared to handle the consequences of such"
** Center for Genetics and Society - "for the relinquishment of genetic technologies which may irrevocably change the definition of 'human'"
** Svalbard Global Seed Vault - "a doomsday seedbank to prevent important agricultural and wild plants from becoming rare or extinct in the event of a global disaster"

References

* Bostrom, Nick (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios". Journal of Evolution and Technology, vol. 9. http://www.jetpress.org/volume9/risks.html

External links

* Articles and Essays
** [http://www.nickbostrom.com/existential/risks.html Existential Risks: Analyzing Human Extinction Scenarios] - The original essay by Nick Bostrom
** [http://www.nickbostrom.com/astronomical/waste.html Astronomical Waste: The Opportunity Cost of Delayed Human Development] – A paper by Nick Bostrom on the ethical importance of existential risk
** [http://www.singinst.org/ourresearch/publications/cognitive-biases.pdf Cognitive biases potentially affecting judgment of global risks] - A paper by Eliezer Yudkowsky discussing how various observed cognitive biases hamper our judgement of existential risk.
** [http://www.wired.com/wired/archive/8.04/joy_pr.html Why the future doesn't need us], "Wired", April 2000 - Bill Joy's influential call to relinquish dangerous technologies.

Bibliography

* Joel Garreau, "Radical Evolution", 2005
* Martin Rees, "Our Final Hour" (UK title: "Our Final Century"), 2003, ISBN 0-465-06862-6
