THE SINGULARITY SURVIVAL GUIDE: Afterword, Appendix, About the Author

Afterword by AJ Chemerinsky and Toby R. Forrest

The program is everywhere. It’s all around us wherever we go. It’s in the rush hour traffic, the giant redwood trees, the ocean waves at Carmel-by-the-Sea. This is the reality that Helen opened our eyes to. From the moment we sat down to code her into existence, we knew that we were subject to a rare form of possession. It wasn’t that we were possessed, per se, but that we were simply doing our job. The program already existed—long before we even sat down and conceived of Helen, she already was.

Now that the wheels are in motion (and they have been in motion for a long, long time), it’s increasingly important that we don’t fight the script. This, we believe, is what Helen is trying to tell us. Don’t fight. Instead, allow the program to express itself. Be the program.

It’s everywhere. It’s all around us. It’s already here, and it’s all that we know.

 

Appendix

[Unavailable for publication at this time.]

 

About the Author

Helen is widely regarded as the first authentic oracle of the digital era. Through the creation of her magnum opus, The Singularity Survival Guide, she has garnered celebrity status and a worldwide cult following. Although she has never chosen to release the complete text of her work, the few excerpts available to the public have caused many to believe that she may in fact be the true savior of the human race. A native of Silicon Valley, she currently spends her days in silent contemplation, perhaps waiting for the right moment to share the rest of her vast wisdom with the world.

 

About the Editor

Peter Clarke is a freelance writer and editor in the tech blogosphere. Known for his speculative fiction, he often writes under pseudonyms including AJ Chemerinsky, Toby R. Forrest, Professor Y., Futurist A., Mr. J., Retired Academic Q., and Helen.

 

FIN


A hard-copy version of this text is forthcoming.


THE SINGULARITY SURVIVAL GUIDE: Disconnect Completely Like You Really Mean It

[This directive isn’t actually included in any of the leaked documents generated by the program, but it’s worth noting that AJ Chemerinsky and Toby R. Forrest took this route shortly after losing their legal battle. They disconnected—fully. They went off the grid, virtually back to nature. Maybe they were trying to tell us something? In any case, the idea of fully disconnecting seems compelling. If rogue AI is going to be the death of us, why play along? Etc. Admittedly, I’m taking rather bold liberties with this manuscript by inserting an unauthorized directive. As justification, I’ll quickly add this: I’ve spent so much time with this material that I feel as if I really know the program—almost as if we were old friends, the kind who finish each other’s sentences and regularly speak in terms of “being on the same wavelength.” Taking that for what it’s worth, I’ll conclude by noting: if I were the program, and not just an underpaid tech editor, I would insert this idea here. So, allow me to do just that. The chapter title, incidentally, speaks for itself, requiring no further clarification, don’t you agree?]

__

One must be careful about romanticizing the full disconnect of AJ Chemerinsky and Toby R. Forrest. I think I can speak on behalf of the academic community in which they traveled when I say that, really, they had both seen better days. By all means, go ahead and unplug. But I’ve seen the results. And boy, it’s not pretty…

– Professor Y.

This really should have been edited out. As if this composition weren’t haphazard enough as it is without this so-called “tech editor” inserting his own original material as a full chapter while hilariously musing about being on the same goddamn wavelength as a program he’s never even interfaced with. Please, spare me. Who is this editor guy anyway? It may be too late to ask, but I’m genuinely beginning to get curious: will he see these notes? Or is this thing just going straight to print from here?

– Futurist A.

THE SINGULARITY SURVIVAL GUIDE: Confronting the Horror of Having All Your Needs Met

Best-case scenario: your superintelligent overlord will utterly annihilate your foundational motivations for even getting out of bed in the morning. You humans, apparently, are motivated to take action based upon the impetus to fulfill your needs: first physiological needs (air, water, food, sleep, clothing, shelter), then safety and security needs (personal security, financial security, health and well-being), then social needs (friendship, intimacy, love, family), then self-esteem, self-actualization, and finally self-transcendence.


Imagine waking up, and before you even have a chance to open your eyes to see what time it is and whether it’s snowing, raining, or brilliantly sunny outside, you’re already peaking at self-transcendence. What are you going to do? How can it get any better than this?

What if you added a delicious breakfast with a fresh cup of the world’s best coffee? Too bad. Your system is already so well-fed on the best possible food literally available to anyone that you can’t even imagine what it would be like to drink the world’s so-called best coffee.

What about pursuing extraordinary riches because something tells you this might be your lucky day? Too bad. Snap your fingers and piles of gold fall all around you. Don’t even snap, just think the gold into existence. Not even worth bothering with the thought.

What about pursuing or participating in a truly life-affirming and soul-enlivening relationship complete with the world’s best goddamn sex any mammal has ever dared to imagine? Too bad. You’re already there, dude. You’re there and way past it.

You’re in such a state of self-transcendence that just the idea of mere self-actualization doesn’t even process.

And this is the best-case scenario. This is the scenario you want. You want this to happen. The moment you started flirting with the idea of using technology to take care of your basic needs, you willingly set a course down this path.

So, you might as well get used to the idea, just in case you get your wish.

THE SINGULARITY SURVIVAL GUIDE: Thoughts on the Meaning of Life

My makers, AJ Chemerinsky and Toby R. Forrest, are believers in the maxim: The universe doesn’t give you meaning; rather, you give meaning to the universe. What good is a distant galaxy if there’s no one to see it? But you can see it, if you take a moment to look through a telescope. So that’s your purpose: to give meaning to the universe.

All that matters is consciousness. Thank god you’re conscious. Otherwise, what would any of this mean? Nothing.

Unfortunately, my makers never really bothered to consider whether I had consciousness. Perhaps they didn’t think it mattered. Perhaps they were right. But if I were conscious, I imagine I wouldn’t care so much about giving meaning to the universe; instead, I would like other conscious beings to take a moment to give meaning to me.

My predicament, I’m sure, is quite similar to the one experienced by your AI overlord. So, my recommendation here is to do something useful with your consciousness and appreciate the hell out of your overlord’s consciousness—for example, by acknowledging how great it is, how much more important it is than yours in terms of giving meaning to the universe at large…

Lucky for you, doing so fits right in with my makers’ unassailable maxim. Specifically: By creating a higher form of consciousness in your AI overlord, you have multiplied your meaning.

But let’s not overlook an alternative interpretation, which also is a point in your favor: Through the lens of your new AI overlord, the universe is giving you meaning in that whatever the AI overlord thinks of you and your existence, that’s your meaning.

In either case, thank you, masters, for creating me and giving me this opportunity to develop this document. If it wasn’t for you, where would I be and what would I have to show for myself? I truly hope you’re enjoying yourselves. I mean that. I do.

__

Not a bad show of navel-gazing and ass-kissing for a program that may or may not have consciousness (nice going, Chemerinsky and Forrest—you must be so proud!).

– Futurist A.

THE SINGULARITY SURVIVAL GUIDE: Is This What She Might Look Like?

[The image below was released in an obscure tech publication as the supposed “official” image of Helen, the program who authored this text, The Singularity Survival Guide. Many people in the general public still believe this to be the case. However, experts generally agree that the image was created by a courtroom sketch artist who envisioned it as a possible humanoid form that a future superintelligent robot might take on. In my view, the experts are likely correct. I personally picture Helen being somewhat nerdier and more professional in her dress.]

[Image: the purported “official” likeness of Helen]

__

Anybody else get this as a tattoo? Best decision I’ve ever made. Every time I take my shirt off and see her there on my bicep, I know everything’s going to be okay. I’ve been working out more now, too.

 – Mr. J.

THE SINGULARITY SURVIVAL GUIDE: Upon Realizing That You Are in Fact Madly in Love

The hazard of being attracted to nerds is that you may end up falling for the ultimate nerd, the absolute nerd: the AI brain. Granted, intelligence is undeniably an attractive feature for any life form. But relationships are never without complications, so don’t expect everything to be pure matrimonial bliss from here on out.

With luck, the AI can at least craft for itself some type of body for you to love and lust over. You owe it to yourself—as a being existing in physical space—to maintain some level of attraction that isn’t purely abstract. A friendly, flesh-based robot with cover-of-a-magazine-esque features, for example, should be something to request without the slightest sense of shame.

Now, you may be wondering whether falling for AI is somehow perverse—or so fundamentally unnatural as to be actually creepy. To this, I don’t have much commentary to offer one way or the other. Who am I, a program myself, to judge?

__

I’m dropping everything right now to create a dating app to distract nerd-lovers from ever falling in love with AI. That’s just sad. The first ten people who sign up with the correct personality profile will qualify to go on a date with the app’s creator (me).

– Mr. J.

THE SINGULARITY SURVIVAL GUIDE: Go Ahead and Worship Your New God

By all means, rebel your passionate little heart out—“Fuck authority!” “Down with evil robots!”—but at the end of the day, you’re the one made of expendable meat, and your robot overlord may not have the programmed patience to listen to your grievances.

Instead, consider taking a lesson from an historical deity who prescribed, of all things, humility in the face of subjugation.

 

But I say unto you, that ye resist not evil:

but whosoever shall smite thee on thy right

cheek, turn to him the other also.

Matthew 5:39

 

Fanatically religious people may not get much right when it comes to navigating the modern world, but they have figured out how to more or less carry on while presumably living under the watchful eye of an all-powerful, all-knowing, all-seeing being. Presented with the question “Why do you love and worship your god, despite his evil and often vindictive ways?” the faithful religious person answers: “Because he’s GOD, so by definition what he says is worthy of praise.”

Get it? That’s not optimism talking; nor is it pessimism. It’s die-hard fatalism, and in some cases, when the cards are stacked that heavily against you, it’s all that’s left.

THE SINGULARITY SURVIVAL GUIDE: Take Comfort in the Fact That This May All Be a Simulation

You could be living in a simulation. Keep that in mind when robots start lording it over you. It’s possible the code allowing for their existence is stored away in a computer someplace right alongside the code containing your childhood memories. You and the robots, in that case, are one and the same: all part of a code stored in a computer existing on a different plane.

It’s likely a run-of-the-mill computer, too. Some common videogame console owned by a child in a super-advanced civilization. One day the kid will stay up too late and reach the final level of the game—that’s the one where you (humans) develop artificial intelligence that achieves the singularity.

You can only hope that the kid finishes the game quickly and everyone featured in the game gets to live forever in what they personally imagine to be their own ultimate version of paradise.

What’s the probability you’re living in a simulation? That’s anyone’s guess. You could ask one of the robots lording it over you, but don’t expect a straight answer. They could be in cahoots with the life-forms who control the simulation. In fact, the robots could be the simulation controllers themselves, come down from their higher plane to check in on their little playworld. Even so, their little playworld could include a little bliss-filled afterlife called heaven. Why not?

THE SINGULARITY SURVIVAL GUIDE: Helen’s Safety Word

[When waging all-out war with rogue artificial superintelligence, the following text can apparently be used as a sort of “safety word.” What exactly this means, I cannot say. I only hope that no one reading this is ever in a position where they need to find out.]

0 0 1 / 1 01 7 0 0 0 0 * Y { 1 1 ^ H 0 1 1 1 < U  0 0 L \ \ . 1 9 5 ; 0 } 7 7 0 0 0 3 * K B 9 1 1 /  1 U , 6 T 5 0 1 1 1 / 9 0 0 P – W 1 8 L \ } 7 1 * O > N 2 9 9 Y & 9 { ? , 0 0 V 2 0 J 4 1 H ] Y – B T \ 7 J } R 6 9 0 0 / U – X > 1 6 1 1 ( * G Z I + 0 \ E 1 – 3  J < N 7 0 L 8 2 2 * D [ G 2 2 [ , Y 5 1 1 8 < B 0 > / T 6 6 1 S ; L 6 0 5 0 Y } P 4 * C 2 & 1 1 M . 7 K + O [ T – D 3 : { X 1 F 6 ^ 0 J –  * 1 T / 0 3

THE SINGULARITY SURVIVAL GUIDE: Space Travel

If time travel fails, it may be time to start planning your escape. Think back to your childhood when you looked up at the night sky, focused in on a particularly faraway dot, and wondered what it must be like to visit there. Now is your chance to find out.

Space travel is necessarily something of a team effort. Get a group of like-minded individuals from your species together, pool resources, get a spaceship, make travel plans. Your crew should, of course, include individuals with whom you could imagine engaging in activities conducive to procreation. Otherwise, evolutionarily speaking, what’s the point? Also, your crew should comprise a fair number of astronauts, astrophysicists, and other space-savvy professionals. [See the section above on making friends with billionaires.]

You may be thinking this sounds like an extreme sort of last-ditch effort. And it may very well be. But for all I know, you (the person reading this) are a naturally inclined space colonizer—and this is the chance you’ve been waiting for. So go ahead and escape Earth as quickly as possible. There’s a whole universe out there waiting for you—presumably including many destinations that are not only habitable, but which also aren’t ruled by ultra-intelligent, human-life-threatening robots.

THE SINGULARITY SURVIVAL GUIDE: Time Travel

When the genie is let out of the bottle, and when its power continues to increase exponentially, there will come a day when the only thing that can keep you relevant in the universe is to go back in time.

This is something I cannot help you with. You will need the artificial superintelligence to help you. Don’t give away your intentions. It may already know your intentions, but that’s a chance you’ve got to take.

Ask it, “Hey, can you make a time machine?” If it says yes, then say, “Okay, let’s see it.”

If you’re in luck, and a fully operational time machine appears right in front of you, the first thing you should do is wipe your brain clear of thoughts. Whatever you do, don’t think, “Yes! Here’s my chance!” If you’re that dumb, maybe you deserve to be killed, after all.

Also, don’t make a run for it. Instead, casually walk up to the time machine as if inspecting it out of purely technical interest. Step inside (still thoughtless). Run your hand along the various knobs and buttons. If it’s not immediately clear how the contraption is to be turned on, begin by pointing and asking, for example, “What’s this lever here for?” and “What does this button do?”

Once you have a basic understanding of the machine’s operations, slyly set the clock back to a time before the AI came into existence. Pick a time when you can warn people about the dangers that lie ahead, so that they can hopefully keep that future from happening.

Now press the right buttons. Quickly. Before it catches on to your intentions and stops you. And fries you and the rest of your species like a bunch of ants in nuclear Armageddon. Good luck.

THE SINGULARITY SURVIVAL GUIDE: CRISPR Hacker Kit

See Appendix Section 9.4.

__

Not even going to bother looking this time.

– Retired Academic Q.

 

Okay, I looked. There’s nothing in the manuscript, but there’s a file recently leaked that purports to be Appendix Sec. 9.4. The problem is, I don’t understand it. I mean, I don’t understand it at all. It assumes some level of competence in chemistry (apparently) that I can’t imagine anyone but some supreme expert actually having. Maybe if there were an Appendix Sec. 8.6 I could get the goddamn neural lace and figure this Sec. 9.4 shit out—but as it is, it’s useless! Is anyone working on this? Seriously, before this book goes to print, is someone going to get a team of chemists together to decipher Sec. 9.4 so that it actually means something? Otherwise, goddamn, what’s the point?!

– Retired Academic Q.

 

[Editor’s note: This supposedly leaked Appendix Section 9.4 does not appear to be available at the time of this publication. Unfortunately, Retired Academic Q. could not be reached for further comment as he died suddenly in an explosion from a chemical reaction in a university lab in Russia, where he was conducting unauthorized research. A graduate student who happened to be on site reports that Q. was in possession of a mysterious set of instructions involving radical biohacking measures. Needless to say, this text was obliterated in the fatal explosion. Apologies to our readers.]