
Were the Great Tragedies of History “Mere Ripples”?

The Case Against Longtermism

In this mini-book, I argue that "longtermism"—an ideology closely associated with existential risks and championed by some leading "Effective Altruists"—is extremely dangerous. Written for students, journalists, and academics, the mini-book examines the origins of this ideology and how it could be used to justify a wide range of atrocities in the name of attaining a techno-utopian world in which astronomical amounts of "value" flood the visible universe.


According to this view, which has roots in the work of Nick Bostrom, a philosopher who founded the Future of Humanity Institute, the ultimate moral task of humanity is to subjugate nature, maximize economic productivity, colonize space, build vast computer simulations, create huge numbers of artificial beings in simulated worlds, and replace humanity with a superior race of radically "enhanced" posthumans. It implies that the worst atrocities in human history fade into moral nothingness when one takes the big-picture view of our "potential," that preemptive war is sometimes acceptable, that mass invasive surveillance may be necessary to avoid omnicide, and that we should probably give to the rich instead of the poor. It possesses many of the hallmarks of a millennialist movement, making it vulnerable to rapid shifts from a "passive" to an "active" mode, whereby nothing is off the table for the sake of "the greater cosmic good."


Evangelists for this view are careful not to mention these features of the longtermist ideology, which is one reason that I have written this mini-book. By laying bare the underlying commitments and unsavory implications of longtermism—at least in its most influential guise—I hope readers will realize just how misguided this view is. There are better ways to care about the future of humanity.

You can download a free copy of the mini-book as a PDF or EPUB file below, or read it on Medium here.
