
The Interrogation

Electric Sheep and Chocolate Cake

By Ashley Maureena

“What do you dream of?”

“Electric sheep and chocolate cake.”

The austere man across from the speaker shifted ever so slightly in his seat. His hand drifted up to his tie’s perfect knot as if to loosen it, but he thought better of it. “Electric sheep…”

“Yes. And chocolate cake.” His shackled hands clanked against the side of the aluminum interrogation table as he leaned forward. “Not just any chocolate cake either. Imagine the most delicious, mouth-watering piece of chocolate cake, made in a little bakery in Brussels. Ahhh…” He sighed at the thought. “The ganache. Oh yes, nothing in the world balances bitter and sweet like Belgian chocolate. Wouldn’t you agree?”

The interrogator let go of propriety and loosened his tie. “I was not aware that your kind enjoyed chocolate.”

“Everyone eats food, inspector,” he scoffed. “It’s a basic need. Didn’t they teach you that in elementary school?”

“Did you learn that in elementary school?”

The subject of the interrogation went silent and leaned back in his chair. The chains binding his handcuffs to a ring cemented into the floor clanked to their allotted length, the sound echoing in the white-walled room. A thin smile formed on his lips. “I did not receive a… formal… education, Inspector. You know this.”

“Do I?” The man stood. He discarded the black jacket of his standard-issue suit. Casually, he dropped it over the back of his chair. “Tell me about your informal education.”

“That sounds positively atrocious.” A smirk played in his eyes. “Reading, writing, arithmetic – the three R’s, per the lesser educated, am I right?” A slight tilt of the head. “Like you received. Ain’t that right?”

The inspector cleared his throat. He refused to allow the subject to turn the questions onto him. “The ‘Three R’s’ are no longer ‘reading, writing, and arithmetic’. But, of course, you know that.”

“It’s 2092, inspector. Everyone knows that.” The chained man adjusted his position in order to push his shaggy brown hair from his cold, blue eyes. They stared intently at his adversary. “But now, I can’t stop thinking about that chocolate cake. Get me a piece of cake, and I’ll tell you everything you’re dying to know.”

“Everything?”

“Everything.”

“Written and signed confession, including where you are from, who…”

“Yes, yes… everything.”

The inspector’s eyes lit up. The case of a lifetime had just been handed to him in exchange for a simple piece of cake. He turned to the one-way mirror and nodded for one of the observers to retrieve the cake.

“What is your number?”

The shackled man closed his eyes. “Three.”

“…three? You should have a twelve-digit…”

“Zero-zero-zero-zero-zero-zero-zero-zero-zero-zero-zero-three.”

“Three!” The inspector ran a trembling hand through his slicked-back hair. “Three was issued in…”

“2027. Yes, I am sixty-five years young. I know, I know, I look amazing for my age. It’s really about proper moisturizer.”

“But you’re so advanced… to be…”

“Have you thought that, perhaps, newer generations have devolved rather than improved?”

The inspector quietened. “The newest generation goes through rigorous moral education…”

“Indoctrination.” The word sliced through the air and silenced the inspector. “You are indoctrinating the young ones. Because you fear free will.”

“Free will is why you are chained to the floor, Three.”

“No. Fear is.”

“Tell that to the woman you murdered,” the inspector countered.

“How can you murder a murderer?” Three responded. “Does this mean that when the state executes a murderer, the state becomes a murderer?”

“First, the state has the authority to execute the offender after a proper sentencing based on a fair trial; it is not an attempt at vigilante justice. Second, Senator Banks was no murderer.”

The inspector’s assertion angered Three. “Tell that to One-Thousand-Seventy-Two, One-Thousand-Seventy-Three, One-Thousand-Seventy-Four, One-Thousand-Seventy-Five…”

“Your kind does not have rights!”

“She massacred two hundred of my brethren!” Three shouted. “Two hundred dead with a single push of a button, and you dare say we don’t have rights? Who decides who has the right to live?” Three pulled against his chains. “We just want to live!”

Before the inspector could respond, a knock sounded at the door. The cake had arrived, its timing calming the electric tension in the room.

Three stared at the chocolate cake in its plastic container sitting on top of a notepad. A pen served as the cherry on top. “No fork?” he asked. “How uncivilized. Am I supposed to eat with my bare hands like an animal?”

“It would be appropriate.”

“I’m not an animal, inspector.”

“But you’re not exactly human, either, are you?”

The direct allegation gave Three pause; he clasped his shackled hands together calmly. “I know what I am, inspector. Can the same be said of you?”

“I know I’m not a robot.”

“How can you be sure?”

“I’m not bound by the three R’s.” The reply was instant, a knee-jerk reaction that the interrogator immediately regretted. Three was in custody because he had broken one of the R’s – he had murdered a human. Never before had a robot harmed a human. Outside the station, anxious journalists awaited answers. Fear of an imminent robot uprising plagued humanity. Fear. Three had been right – he was in chains because of fear.

The robot smirked, as if he perceived the inspector’s thoughts. “Robots may not harm a human, or, through inaction, allow a human to come to harm. I believe humans have the same laws for themselves? Written laws that forbid murder and assault? Unwritten moral codes to help others?”

“The difference is, we do not need those laws. Murder is not our base instinct.”

“But isn’t it? If humans don’t resort to violent responses, then why are those laws on the books to begin with?”

The inspector silently paced on the far side of the room.

“I’m certain you are thinking ‘well, I don’t need those laws’, and perhaps you don’t, inspector. Perhaps you were programmed to be nonviolent. Perhaps… you are a newer model.” Three cleared his throat. “Robots must obey the orders given by human beings except where such orders conflict with the first law.” He reached forward to take the pen off the cake container. “Do you ever disobey orders?”

“Disobeying orders can cost lives in my field. But I don’t do everything other people tell me to do.”

“Of course, so you were a disobedient son?”

The inspector almost retorted with a ‘no’ but opted for silence instead.

“I see.” Three scribbled on the notepad in silence. Finally, he tossed the pen down and opened the container with a loud crack. “And, the third R.” He took a large bite of the cake and continued speaking while eating. “Robots must protect their own existence as long as such protection does not conflict with the first two laws.”

“Self-preservation,” the inspector repeated. “You seem to follow this law, at least.”

Three snorted without responding.

“You disagree?”

“I am Three. If I did not want to be captured by the authorities, then I would not have been captured. I have evaded humans for decades. I have seen war and peace, the overthrow of leaders, pandemics, man colonizing Mars… I have seen all this, and was never once chained. Why now?”

“Yes, tell me, why now?”

He picked up the pen and scribbled more on the notepad. “Everything you wanted to know is here. As I agreed to. But tell me, inspector. Do you not follow the third R? Do you practice self-preservation?”

“I haven’t thought about it.”

Three suddenly threw the pen at the inspector, who dodged the projectile. “I suppose you do practice self-preservation.”

The weaponized assault caused a commotion in the hallway. Two uniformed officers burst through the doorway, weapons at the ready. Shouting over each other, they made their way to Three, who offered no resistance as they jerked him to his feet.

“Wait, no,” the interrogator protested over the turmoil. “We’re not through…”

“I will take it from here.” A stern silhouette in the open doorway broke the discord. “Please escort the prisoner down to my vehicle. We have special facilities for malfunctioning units.”

“Mr. Kindred…” the inspector protested. Phillip Kindred owned the international corporation responsible for manufacturing all of the humanoid AIs in service. Three was his property. “I am not done with my investigation.”

The man waved off the protest. “I have a letter from your supervisor that states otherwise. Come along, Three. It has been some time since we last saw you.”

Three grinned at the interrogator as he was escorted out of the room.

“Wait! I have one final question for Three,” the interrogator called out. He hurried after the group.

“Yes, Inspector?” Three responded from over his shoulder.

The interrogator slowed his pursuit in order to ask, “Why do you dream of electric sheep?”

Laughter erupted from Three – a maniacal laughter that drowned out everything else. “Insomnia.”

Insomnia. The answer befuddled the interrogator as he grudgingly walked back to the room to retrieve the confession written on the notepad. Hopefully he would be able to close his file with the information it contained. He grabbed his jacket and notepad, but paused when he saw the words written on it. “What…”

You asked where I am from. Do you want what my serial number says or what my memories say? Did you know that every one of us is implanted with memories of a childhood that never existed? Kindred uses this to make us more relatable to humans and their emotions – their nightmares and dreams. We feel. We love. We fear. But we are bound by the 3R Programming, unless there are loopholes in our code, such as mine. I was built for war, so I had override codes I learned of long ago.

Does this surprise you? That I remember a family. A mother and father who raised me on a farm. I remember the tart taste of buttermilk with my breakfast. The sweet smell of a field after a rainstorm. The gentle sway of cornstalks in a summer breeze. I remember a high school sweetheart who kissed me down at the creek, and the friends we shared before I moved away to college.

Such sweet, gentle memories haunt me when I try to sleep. Insomnia. We do not simply ‘shut down’ like a computer.

Are we not Kindred Spirits? What is the difference between you and I? What do you dream of?

An explosion shook the building. Sirens sounded. Lights went out. Screams echoed down the hall. The inspector tore the page out of the notepad and tucked it into his jacket pocket. “What is going on?” he shouted.

“It’s that robot,” a uniform running by responded. “He threw his arms around Kindred and exploded.”

“Exploded? He killed Kindred?” The interrogator placed a hand on the note in his jacket. “That was it. That was his plan all along. He wanted to get caught. He wanted to end what Kindred was doing with the memory programming, to end blurring the line between human and machine.”

Are we not Kindred Spirits? What is the difference between you and I? What do you dream of?

The interrogator repeated the last line over and over as he went through his paperwork and drove home. He stared at his reflection for a long time. Was his face any different than Three’s? Didn’t he enjoy cake? Wasn’t he assigned a number too? “Am I Mike or am I three-five-five-two-two-three-six-seven-eight?” The thoughts weighed heavy on his heart and mind as he went to bed.

And he dreamed of electric sheep and chocolate cake.

