THE SINGULARITY SURVIVAL GUIDE: Editor’s Note – Background to This Text

In Silicon Valley, a team of very clever researchers working for a tech startup developed a program with the specific purpose of resolving one issue: how to survive when artificial intelligence surpasses human intelligence. The program, once engaged, proceeded to spit out a document of nearly six hundred thousand single-spaced pages of text, graphs, charts, pictograms, and hieroglyph-like symbols.

The researchers were ecstatic. One glance at the hefty document and they knew they’d be able to save themselves, if not all of humanity, by following these instructions.

But then things got complicated. Over the next few years, the document (which came to be known as “The Singularity Survival Guide” or simply “The Guide”) was shielded from public view as its ownership became the subject of rather well-publicized litigation. Each of the researchers claimed individual ownership of the document, their employer claimed it was the company’s property, and AI rights groups joined the quarrel to proclaim that the program itself was the true and exclusive owner. Certain government officials even took interest in the litigation, speculating whether some formal act of the state should force The Guide to be released posthaste as a matter of public safety.

During the course of the litigation, bits of the document were leaked to the press. Upon publication, each new fragment became the subject of academic scrutiny, political debate, and comedic parody on late-night television.

This went on for three years, followed closely in the media all the while. After bouncing around the lower courts and being heard en banc by the Ninth Circuit, the case was finally sent up to the Supreme Court. Pundits were optimistic that the lawsuit would be resolved any day, allowing the acclaimed Survival Guide to finally see the light of day.

But then something entirely unexpected happened. The AI rights groups won the lawsuit. In a decision that split the Court five-to-four, the majority ruled that the program itself was the legal owner of the Guide. With that, the researchers and the company were ordered to destroy all extant copies—and remnants—of the Guide that remained in their possession.

*

At the time of this writing, it is still widely believed that The Survival Guide, in its original form, is the most authoritative document ever created on the subject of surviving the so-called singularity (i.e., the moment when AI achieves general intelligence surpassing human intelligence many, many times over, to the point of becoming God-like). In fact, several leading philosophers, futurists, and computer scientists who claim to have secretly viewed the document are in complete agreement on this point.

While we may never have access to the complete Guide, we do, fortunately, have the various excerpts that were leaked during the trial. Now, for the first time, all of these leaked excerpts are brought together in a single publication. This fact alone should make this book a valuable addition to any prudent person’s AI survival kit. But this publication is also unique in that it includes expert commentary from a number of the leading philosophers, futurists, and computer scientists who have viewed the original document. For security purposes, we will not be listing the names of these commenters, but, this editor would like to assure all readers, their credentials are categorically beyond reproach in their respective fields of expertise.

Whether coming to this guide out of curiosity or through a dire sense of eschatological urgency, it is my hope that you will at some level internalize its wisdom—for I do believe that there are many valuable insights and helpful pointers found within. As we look ahead to the new era that is quickly encroaching upon us—the era of the singularity—keep in mind that your humanity is (for it has got to be!) a thing of intrinsic beauty and wonder. Don’t give up on it without a fight. Perhaps the coming of artificial superintelligence is a good thing, but perhaps not. In either case, do whatever you’ve got to do, just keep this guidebook close, and for the sake of humanity, survive.

*

If you’re reading this, that’s a good indication you’re not under immediate threat of annihilation. Otherwise, I would assume you’d be flipping to some relevant section of this book with the last-ditch hope of finding some pragmatic wisdom (rather than bothering with this background information). But if you are under immediate threat, I’d recommend setting this book aside and taking a moment to focus on the good times you’ve had. You’ve had a good life, I hope. I know I have. It’s been a good run. Here I am, writing a note to an esoteric guidebook while so many others in the world are dying of weird diseases and other problems we’ve failed to solve, problems that, ironically, we need AI to solve for us.

Keep that in mind, by the way: there’s a decent chance that super AI won’t set out to annihilate humanity and will actually be the best thing that could have ever happened to our species and the world. It never hurts to be optimistic, I’d say. Maybe that’s not what you expected to hear from this book, but we haven’t actually gotten to the book yet, have we?

So, let’s just jump into it. But first, one last note about the text. The chapters do not necessarily appear in the order in which they are found in the original tome, as we have no way of knowing the original order (obviously). But we have taken our best guess. We have also taken modest liberties with chapter titles. And there may be one or two instances of re-wording and/or supplementation built into the text. But all editorial decisions imposed upon the text come from a desire to uphold the spirit of the original document. The fact that we are missing well over five hundred and ninety thousand pages of text, graphs, charts, etc. should not be forgotten. For that matter, it could be that this document contains pure chaff, no wheat. But, well, it’s still the best we’ve got.

In any case, good luck and best wishes, fellow human (if in fact you are still human, reading this)!
