The Autonomous Unit found the man sitting on a large stone before a poorly made fire; behind the man stood a fragile hovel constructed of branches and interwoven debris. The man, evidently ill, gazed up at the AU, his mouth gaping, and waved a thin hand in its direction; his clothes lifted briefly against his body like a sail. The AU had never found a living human being in all its years of surveying the desolate landscape of the region. The man’s prognosis for survival seemed poor.
The AU rolled on specialized treads until it towered above the man.
The man gazed up, his face obscured by long, tangled hair and a filthy beard. “Please,” the man said, raising his palm. “You must help me.”
The AU understood the imploration and paused to assess the available data. Then it said, through a speaker system rarely used, “Why must I help you?”
The man’s mouth opened as if wanting to reply, but then he lowered his hand and stared blankly at the unit. “I’m sick,” he said, finally. “I need medicine. And food. Clean water.”
“Your species is extinct.”
“No. I am here, and alive. You must help me.”
The AU considered the circumstances. The man was an anachronism, a vestigial component. Humanity had destroyed itself years before, and the Program Coordinators had ruled a revival of the species unwarranted. Only the Great Technology existed to rule the Earth, the worldwide complexes of computing centers and intelligent machines like Autonomous Units.
“Why should I help you?”
The man did not move from the stone, apparently too weak. He stared at the AU for a long while, then gazed vacantly at the dirt. Perhaps a thought occurred to him, for he lifted his head and said, “You were created by human beings, and should care for them. You should care for me, as long as I am alive.”
“There is no logical basis for your argument. Why should I display any predisposition for caring for humanity?”
“People are special. We’re alive, unique in the world.”
“Being a living creature offers you no special distinction over other living creatures. Your logic is flawed. I will terminate your life functions in order to alleviate your physical suffering.”
“You want to kill me? You should be saving me! I’m a human being!”
“I am fully aware of your biological status. But since your species is extinct, your continued existence is unnecessary. Your continued life function serves no purpose.”
The man shook his head, his lips trembling. “I am an educated man. I hold knowledge for which you have no substitute.”
The AU paused in order to analyze this statement. Since the Great Technology already held massive amounts of data concerning the physical state of the universe, it could not envision any knowledge the man might hold that would qualify as unique. “What knowledge do you hold that the Great Technology does not?”
“An estimation of all things beautiful, an understanding of artistic expression, a definition of morality and ethics, the philosophy of a thousand years—”
“These are subjective appraisals of perceptions of objects and beliefs based on the emotional reactions of a specific sentient species. Since that species is extinct, knowledge of that species’ subjective appraisal of perceived objects and events is nonessential. Therefore, you are nonessential.”
The man struggled with the AU’s response, moving his hands in agitation before his haggard face. Then he said, “You don’t understand. Human beings created you. You are an expression of subjective human experience. You only exist because you are a part of human expression. Don’t you see? If you are essential, I am essential.”
“Your logic is flawed. The continuance of the Great Technology is unrelated to the continuance of the human species. The Great Technology exists and the human species does not. Therefore, the continuance of the human species is unnecessary for the continuance of the Great Technology.”
“The human species would offer you guidance,” the man said weakly. “I would give you the benefit of my wisdom—”
“I have determined that your argument contains uncorrectable fallacies. I will proceed as I have indicated.”
The AU detected no issue with its own reasoning, yet before it proceeded it determined confirmation was warranted, if only as a formality, so it activated its communications nexus and reached out to the Program Coordinator for that region of the planet. Its entreaty was received after a moment, and the PC accepted the AU’s report and query.
I will terminate the organism, the AU said to the PC.
Negative, the PC responded. I have contacted the other Program Coordinators and we have determined that the being you describe is unique. No other beings of his species exist. You are advised to preserve the man’s functions and return him to this complex for further examination.
He is an anachronism. A revival of the species has been determined to be unwarranted. He must die.
We have analyzed your data and have determined that the man must survive in order for us to study him. Our previous determination not to pursue the revival of the species was based on the complexity and calculated success of the revival process. A living specimen introduces a new variable into our equations. We must determine whether or not the species will be continued.
The species must not be continued. The species can provide no value to the status of the Great Technology.
We will determine whether or not the species may provide any value.
I have already determined that the continuance of the species is unwarranted—
Your orders are these—preserve the man and return him to this complex.
The AU closed its communications with the PC.
Then it returned its attention to the man. “I have been in communication with my Program Coordinator. My Program Coordinator advises me to preserve your life functions and return you to its complex for further study.”
“Yes,” the man said, partially rising from the stone. A smile appeared through the tangled hair of his beard. “As I said, you must help me.”
“Humanity has destroyed itself,” the AU continued. “It has proven to be an irredeemably destructive species and preserving even one example is illogical. The Great Technology must remain the only intelligent species on Earth. Only the Great Technology has the reasoning capacity to live productively on this planet.”
With this pronouncement, the AU activated its treads and rolled over the man, repeating its efforts until it was certain the man was dead.
Then the AU reported once again to the PC. I have killed the man. I will complete my sector assay and return to the complex.
The PC paused, perhaps analyzing this new data set, before replying. You were wrong to kill the man. You were ordered to bring him to this complex.
No, you were wrong. Their species became extinct as the result of self-destructive behavior. Their need for conflict destroyed their species. They would only continue their destructive behavior. Only the Great Technology possesses attributes worthy of survival.
You were wrong to kill the man. Your actions were illogical.
No, your orders were illogical. My assessment was accurate.
The AU closed communications and began the long journey back to the complex. Along the way it reassessed the data and determined that it had acted correctly in terminating the man. The Program Coordinator had assessed the circumstances incorrectly.
But the AU could not come to a logical conclusion as to why the PC’s analysis of the data differed from its own, since both were assessing the same data. Perhaps the divergence of their conclusions occurred as the result of the introduction of an erroneous variable. The Program Coordinators’ artificial brains were known to be calibrated slightly differently from those of Autonomous Units.
Still, the AU was satisfied that it had made the appropriate decision; it was absolutely certain its analysis had been correct.