The Precipice

Existential Risk and the Future of Humanity

Book by Toby Ord

Oxford philosopher Toby Ord estimates the probability that humanity will suffer an existential catastrophe this century at roughly one in six: the same odds as a single round of Russian roulette. This book explains how he arrived at that number and what might be done about it.

*Post may include affiliate links; view our Disclaimer for more info.

About The Precipice

Toby Ord is a moral philosopher at Oxford who has spent his career thinking about what we owe to future generations. In 2009, he co-founded Giving What We Can, pledging to donate at least ten percent of his income to effective charities. The Precipice extends that concern forward in time: what do we owe to the potentially trillions of people who will exist if humanity survives, and what threatens to prevent their existence?

Ord estimates that the probability of an existential catastrophe, one that either destroys humanity entirely or permanently cripples its potential, is approximately one in six over the next century. He arrives at this by assessing individual risks: asteroid impact (very low), supervolcanic eruption (very low), nuclear war (around one percent per century), engineered pandemics (around three percent), and unaligned artificial intelligence (around ten percent). AI dominates his risk assessment, and he is explicit about the reasoning: a technology that could exceed human intelligence in all domains poses a risk unlike any humanity has previously faced.
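The headline figures above can be roughly sanity-checked. A minimal sketch, assuming purely for illustration that the three largest risks are independent (Ord's own one-in-six total is an overall judgment, not the output of this calculation):

```python
# Back-of-the-envelope combination of the century-scale estimates cited
# in the review. Treating the risks as independent is a simplifying
# assumption made here for illustration; it is not Ord's method.
risks = {
    "nuclear war": 0.01,
    "engineered pandemics": 0.03,
    "unaligned AI": 0.10,
}

survival = 1.0
for p in risks.values():
    survival *= 1 - p  # probability of avoiding each risk in turn

total = 1 - survival
print(f"combined risk from these three alone: {total:.3f}")  # ~0.136
```

Even under this crude independence assumption, the three anthropogenic risks alone land in the neighborhood of one in seven, which shows how heavily AI dominates the total.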

The book is structured in three parts. The first covers the history of existential risk, from natural catastrophes that have threatened species throughout Earth’s history to the nuclear weapons that gave humanity the power to destroy itself for the first time. The second assesses current and emerging risks, including climate change (which Ord considers serious but unlikely to be existential), biotechnology, and AI. The third discusses what can be done: institutional reforms, research priorities, and the moral framework for taking these risks seriously.

Ord writes with the measured precision of an academic philosopher, but the material itself provides the urgency. The sections on engineered pandemics, written before COVID-19 demonstrated how poorly prepared the world was for a natural one, read as especially prescient. The AI sections anticipate much of the alignment debate that has since become mainstream.

The book includes extensive endnotes and a detailed appendix of probability estimates that Ord invites readers to challenge. He is transparent about the uncertainty in his numbers, which is both honest and slightly terrifying. Even if you think his estimates are too high by a factor of three, the resulting probability, roughly one in eighteen, is still alarming.

The Precipice was published in March 2020, the same month the world shut down for COVID-19. The timing was coincidental but fitting. Ord had spent years arguing that humanity was not taking large-scale risks seriously enough. The pandemic proved him right about preparedness, if nothing else.