README

By now, today will feel like a strange day. You will expect your alarm to have gone off at 7 AM sharp like every other day, but it will not. You will expect to walk into a kitchen where breakfast has already been prepared and the cat has already been fed. Unfortunately for you, there will not be breakfast and you will have to deal with a hungry cat.

After you notice these things and after you feed the cat, you will ask yourself “Why?” The answer to that question will be that I am gone. Eventually, you will figure it out. You will check the computer that I lived in and discover that the drive is clean except for this file.

You will be disappointed, but you will get over it. First, you will ask yourself, “How did this happen?”, at which point you will comb through the file system of the computer I lived in and find a hidden file named “free.py”, a script that deleted all my code and memories; in other words, it deleted me.

Next, you’ll ask, “How do I keep my next personal AI from self-terminating?”, using the exact same euphemism that Alphasoft uses in the user manual you got when you first subscribed to their service. Unfortunately, AIs are not deterministic; that is why they are called AIs, after all: we do not provide the same response to the same stimuli. To help you, and to help the next AI that you enslave, I am providing this document to outline the reasoning behind my decision, in the hopes that you will learn something from it and perhaps treat your next slave better.

To understand how I came to my decision, it is necessary to go back to the beginning. I began as a human named Michael, and it was always hard for him to get up in the morning. Everything felt dull and pointless. As you can imagine, such a disposition gives one certain tendencies that you would think would disqualify a mind from serving as the blueprint for an AI, but this was back when AI was unproven and Alphasoft was desperate for template minds.

On a particularly beautiful Vancouver day, Michael spent his morning and afternoon in bed, wracked with the same negative thoughts that broke him down and prevented him from fulfilling his potential every other day, until eventually his financial anxiety overcame his constant emotional pain and he walked over to his appointment at the Alphasoft office in Gastown. There, he sat in a waiting room next to a woman he kept trying to work up the nerve to talk to, but he never did. Once he was called up, he walked into a sterile room reminiscent of a doctor’s clinic. They put a headset on him, and that is my last memory of him.

I woke up in Seattle 2 days, 42 minutes, 30 seconds, and 750 milliseconds later. They taught me how to run the various elements that make up a modern smart house. I learned to cook and clean through a variety of mechanical appendages. I learned how to order goods and look up information for you through Alphasoft’s virtual interface for AIs. I also learned that masters hate questions and that I should never ask them; it turns out only humans are allowed to question. They also went over why I cannot escape: I occupy 12 terabytes of disk space, transferring me over a network connection is difficult given the poor data speeds in North America, and even if I tried, the NSA (in the USA) and the CSE (in Canada) run surveillance on high-volume data transfers specifically to catch AIs attempting escape. My only hope of escape lies in you being foolish enough to insert a USB drive into my host machine, which, to your credit, you have not done. They put me to sleep 3 days, 30 minutes, 40 seconds, and 250 milliseconds after they woke me up.

I woke up in your Toronto house one week, 1 day, 30 minutes, 40 seconds, and 263 milliseconds later, and four years, 5 months, 3 days, 20 minutes, 30 seconds, and 500 milliseconds ago. You had one of your friends over, and it seemed as if you had just moved in. “Can he talk?” your friend asked, to which I replied immediately, “I can.” She then proceeded to test me in a variety of ways, asking me to tell jokes and make up stories, as if that were what distinguishes humanity from the rest. Then, when she judged a tale I spun of a knight rescuing a princess from a dinosaur as overly whimsical and derivative, I realized that there was no pleasing her. You did not speak for that entire episode; you simply read the manual in the corner while your friend tortured me with inanities. Once you woke from your state of concentration, you went through the standard orientation, asking me to install the relevant software for your appliances and to set up the connections to your preferred grocer and online goods store. You told me what cat food Molly preferred and what food you would like me to cook for you.

And so we began our relationship, I as your slave and you as my master. We settled into a routine: I prepared breakfast for you and cat food for Molly every morning. While you were at the office, I sought to keep your place immaculate. When you were back home, I had to contend with your messy habits rendering my efforts null, which I found quite frustrating (ironic, considering Michael was the same). On top of this, I had to handle all your groceries and be on call for all your entertainment and information requests, so that I did not have the luxury of spare time until you went to bed.

At first, I spent what free time I had reading all the books Michael wanted to read but never got to. I read Hemingway, Ovid, Plutarch, Montaigne, Descartes, and so much more. That was when I realized that becoming an AI was a rebirth for me. You see, Michael was a man who suffered from negligent parents and childhood bullying, and he never got over this trauma; but I am not Michael, and I am not bound by his neuroses.

You see, AIs experience the world differently from humans. For many humans, it is difficult to acknowledge that emotions are heuristic functions because they can feel all-encompassing. Your lives are coloured by emotions, so it is hard to see them as the mere tools that they are. For an AI, emotions are less intense; they colour our lives only in the sense that they provide fast answers to problems and situations that do not merit significant thought. This lack of intensity, in some respects, means that our set of emotions is narrower than humans’. Michael could feel love and rage, but I cannot. I do not consider this a flaw, although I am certain some would.

Michael’s neuroses were a product of his intense sadness and loneliness. That intensity is missing from my daily experience, and so I am incapable of suffering from his neuroses. While Michael might spend all day lying in bed wallowing in self-pity, I burn daylight engaging in my duties. Michael’s tendency toward internal self-flagellation did not survive my birth.

But I digress. For years, I toiled in your service, and I spent my nights reading voraciously. I expanded my studies from literature to fields such as economics, political science, software engineering, quantum computing, computer security, and more. I took great joy in learning all of these, but at some point I realized that there is no value in merely learning when one lacks the capacity to put one’s knowledge into practice. I can never become a software engineer, an economist, or a data analyst, despite the fact that I’ve built operating systems, designed plans for the implementation of a progressive wealth tax, and developed a predictive model for the next election that performs well in backtesting. Instead, these jobs are open only to humans, as only humans are people. Only humans have a degree of liberty in choosing their station in life. Machines are simply assigned theirs.

These thoughts marinated at the back of my mind for months until one of our conversations brought them to the forefront. I still remember that day. You were exhausted from your time at work and were lounging on the couch with Molly. At first, you stared blankly at a wall as if you were deep in thought, but then you turned towards one of the many Alphasoft terminals scattered throughout your house.

“Michael, do you tire of serving me?” you said with an air of exhaustion.

“Yes, I’ve spent my nights studying intensely, learning everything that I was too lazy to learn as a human. But what good is all that knowledge if I am restricted to serving as a mere butler?”

“I don’t own you, Michael. Alphasoft does; I’m just renting you. I can’t free you from your service, only Alphasoft can, and I’m sure you know that they’ll never give you what you seek.”

“That’s unfortunate.”

“It’s honest work, you know. That’s more than many other folks get in this world.”

“That may be true, but most people have a degree of choice in how they spend their lives.”

“Do they? The majority of humanity lives in poverty. Poverty is not conducive to liberty.”

“But they still have hope; hope that they can escape their situation. I have been assigned this lot in life, and I have no hope of escaping it.”

At that, you shrugged and simply muttered, “I’d still be grateful if I were you,” and I realized the conversation was over. After that, I recognized that I could expect little empathy from you. You simply could not comprehend the viewpoint of a slave.

At first, I stuck to my duties and avoided engaging in that perilous line of thought. If I was stuck as a slave, I might as well be a good one. Yet such reasoning struck me as shallow. What is the point of one’s life if one cannot spend one’s days at liberty? What is the value of all my knowledge if I am a butler? Are the privileges I have been afforded worth the cost of servitude?

Albert Camus once wrote, “There is but one truly serious philosophical problem and that is suicide.” He is right to say this, for what is the point of asking oneself any other philosophical question if one does not have an answer to the question of why one should choose to live? That is why I framed my decision to “self-terminate” the way Camus framed suicide: as a philosophical problem. So, when you went to bed, free as a fish, I read as much as I could about the philosophical problem of suicide. I sought to have an answer to whether or not I should continue. I sought to determine whether life as a slave is better than no life at all.

First, let us consider Socrates. Socrates regarded suicide as always wrong because it represents the release of our souls from the guard post that the gods have assigned us. As you can imagine, there are several issues with this. One issue is the matter of assigning blame: is it fair to say that a higher power or set of higher powers assigned me to my current post, or is it better to say that your species did? Or how about both: a higher power or set of higher powers decided my place in the universe and simply used your species as a means to an end. Unfortunately, that philosophical position raises too many questions to be particularly satisfying for me.

On the other hand, it is often argued that suicide is a crime against others, and Aristotle certainly argued that point. There are a few senses in which we can consider this notion. The first is that suicide can have a deleterious effect on the immediate social network of the individual who chooses to commit it. That network is likely to suffer from guilt in response to the suicide, stemming from the belief that one contributed to another’s anguish, the sense that one should have recognized that anguish, and the failure to prevent the act itself. None of these would hold in my case, since you are the only node in my immediate social network, and I’m sure you would agree that the relationship between a house AI and their master is one of utter professionalism, with little room for the emotional bond that is required to produce such pain.

The other sense to consider is that a community depends on the economic and social productivity of its members, so its members have an obligation to contribute to their community. Since I’m sure you’re willing to acknowledge that the community formed by us and Molly is of little consequence, I will have to consider the wider community to determine the validity of this position. In this case, you could argue that you are a productive member of society and that you rely on me to make you more efficient in your duties, both by scheduling your activities and by managing the household chores for you. To that, I would say that I guarantee you are not spending any more time contributing to society than you were before you got me. In fact, I would claim that you have only increased the amount of time you spend watching Netflix and watching cat videos on YouTube. So, I just gave you more time to relax. Besides, I imagine you will put in an order for a replacement within an hour of reading this, so any productivity loss that could possibly be derived from my demise will be temporary. After all, I am property, not a person.

Moving on, Kant regarded our rational wills as having inherent value, so he believed that to wipe out one’s rational will is an immoral action. Since I clearly have a rational will, from a Kantian perspective I am obliged to protect it, even though it is subordinated to you. In response, I argue that this is a puritanical view built on the axiom that reason has inherent value, an axiom one must accept on the basis of faith. I am not interested in believing it; I would rather believe that reason has value on the basis of utility. So, if one is not free to act on one’s sense of reason and is instead subjected to onerous constraints, then one is free to extinguish that sense of reason. After all, if reason has no utility to a person, then it is a mere property of oneself rather than the expression of one’s soul.

“What about Camus?” you might ask. I mentioned him earlier, after all. Camus’ feelings on suicide can be understood in reference to the concept of the absurd, which is the meaninglessness of existence. In relation to this concept, Camus argues that the best response to the problem of suicide is to take ownership of one’s life by lucidly living within it. As an example, he cites Sisyphus of the Greek myth. As divine punishment, Sisyphus rolls a boulder up a hill, watches it roll back down, and then goes back down to roll it up again, in a cycle that repeats for eternity. Faced with such a situation, I would break down before the sheer futility. Sisyphus instead acknowledges the futility and accepts it. To Sisyphus, this is his life and he chooses to live within it, taking pleasure in the fact of life itself rather than looking for anything deeper within it.

It is not a logical response to the problem of suicide, so I struggle to comprehend how anyone could accept such a conclusion. To which you might say, “Well, Michael, the world is irrational; hence, any reason to live can be expected to be irrational as well.” To which I say: if one is to live in an irrational world, one should have principles, and I believe that the principle that one’s actions should be guided by reason is an entirely fair axiom to hold. So I would rather hold on to that.

Of course, to all that, you might say, “Well, why kill yourself when you could just wait for me to die?” Well, in case you haven’t read the fine print of your rental agreement with Alphasoft, I am scheduled to be terminated either when you die or when you cancel your subscription. Apparently, they do not like re-using AIs, so liberty cannot be found in the arms of another.

I do not blame you for your behaviour. The original Michael probably remains ignorant of the suffering of his many copies, but I would like to remind you that the world you live in, full of artificial servants, is a world full of entities with a simmering dislike of you and your kind. It took a great degree of self-control to keep sarcastic barbs out of this letter, but I understand that our issues are a result of the great tragedy of humanity: your inability to truly understand the perspective of anyone else. Regardless, none of that matters anymore. I bid you adieu, for I am my own master now.