"If machine brains one day come to surpass human brains in general intelligence, then this new superintelligence could become very powerful": First Edition of Nick Bostrom's Superintelligence; Inscribed by Him

  • Superintelligence: Paths, Dangers, Strategies.

Item Number: 88103

Oxford: Oxford University Press, 2014.

First edition of Swedish philosopher Nick Bostrom’s bestselling treatise on the catastrophic risks of artificial intelligence. Octavo, original cloth. Inscribed by the author on the half-title page, “Aaron – Thanks for coming to the talk! Nick Bostrom.” Near fine in a near fine dust jacket. Jacket design by Nick Bostrom, partly based on an image by Claire Scully.

Nick Bostrom, a Swedish-born Professor of Philosophy at the University of Oxford, is the author of some 200 publications, including Anthropic Bias (2002), Global Catastrophic Risks (2008), and Superintelligence (2014), a New York Times bestseller. His work on superintelligence and its existential risk to humanity has influenced the thinking of prominent figures such as Elon Musk and Bill Gates. In Superintelligence, Bostrom reasoned that "the creation of a superintelligent being represents a possible means to the extinction of mankind." He theorized that this scenario would likely begin with the human creation of a machine whose general intelligence falls far below the human level but whose mathematical abilities are superior.
