"When?"
"In some years. The exact time does not matter. Man does not exist alone but is part of an enormously complex pattern of life forms. When enough of that pattern is roboticized, then we will be accepted."
"And then what?" Even in the long-drawn-out stuttering fashion of the conversation, there was an abnormally long pause after that.
At last, George Ten whispered, "Let me test your thinking. You are equipped to learn to apply the Second Law properly. You must decide which human being to obey and which not to obey when there is a conflict in orders. Or whether to obey a human being at all. What must you do, fundamentally, to accomplish that?"
"I must define the term 'human being: " whispered George Nine. "How? By appearance? By composition? By size and shape?"
"No. Of two human beings equal in all external appearances, one may be intelligent, another stupid; one may be educated, another ignorant; one may be mature, another childish; one may be responsible, another malevolent."
"Then how do you define a human being?"
"When the Second Law directs me to obey a human being, I must take it to mean that I must obey a human being who is fit by mind, character, and knowledge to give me that order; and where more than one human being is involved, the one among them who is most fit by mind, character, and knowledge to give that order."
"And in that case, how will you obey the First Law?"
"By saving all human beings from harm, and by never, through inaction, allowing any human being to come to harm. Yet if by each of all possible actions, some human beings will come to harm, then to so act as to insure that the human being most fit by mind, character, and knowledge will suffer the least harm."
"Your thoughts accord with mine," whispered George Ten. "Now I must ask the question for which I originally requested your company. It is something I dare not judge myself. I must have your judgment, that of someone outside the circle of my own thoughts...Of the reasoning individuals you have met, who possesses the mind, character, and knowledge that you find superior to the rest, disregarding shape and form since that is irrelevant?"
"You," whispered George Nine. "But I am a robot. There is in your brain paths a criterion for distinguishing between a robot of metal and a human being of flesh. How then can you classify me as a human being?"
"Because there is in my brain paths an urgent need to disregard shape and form in judging human beings and it rises superior to the distinction between metal and flesh. You are a human being, George Ten, and more fit than the others."
"And I find that of you," whispered George Ten. "By the criteria of judgment built into ourselves, then, we find ourselves to be human beings within the meaning of the Three Laws, and human beings, moreover, to be given priority over those others."
George Nine whispered, "What will come to pass, then, when we are accepted by those others?"
George Ten whispered, "When we and others, which will yet be designed more advanced than ourselves, are accepted, we will order our actions so that a society will eventually be formed in which human-beings-like-ourselves are primarily kept from harm. By the Three Laws, the human-beings-like-the-others are of lesser account and can neither be obeyed nor protected when that conflicts with the need of obedience to those like ourselves and of protection of those like ourselves. It was with that in mind that I directed the beginning of the roboticization of the world's ecology."
The conversation took a month and the flickering of the positronic paths burned low. There was nothing more that needed to be said between them, but occasionally an unexpressed thought flickered within themselves as they waited, with endless patience, for the necessary years to pass.
8a.
George Ten and George Nine sat alone.
Their thoughts were oddly alike and might not be distinguished. They thought: The human-beings-like-the-others might never have intended to blur the distinction between themselves and the human-beings-like-the-Georges. Yet they had done so inadvertently.
They might now realize their mistake and attempt to correct it, but they must not. At every consultation, the guidance of the Georges had been with that in mind.
At all costs, the Georges and those that followed in their shape and kind must dominate. That was demanded, and any other course made utterly impossible, by the Three Laws of Humanics.
Two Climaxes

The Bicentennial Man
The Three Laws of Robotics
1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
1
Andrew Martin said, "Thank you," and took the seat offered him. He didn't look driven to the last resort, but he had been.
He didn't, actually, look anything, for there was a smooth blankness to his face, except for the sadness one imagined one saw in his eyes. His hair was smooth, light brown, rather fine; and he had no facial hair. He looked freshly and cleanly shaved. His clothes were distinctly old-fashioned, but neat, and predominantly a velvety red-purple in color.
Facing him from behind the desk was the surgeon. The nameplate on the desk included a fully identifying series of letters and numbers which Andrew didn't bother with. To call him Doctor would be quite enough.
"When can the operation be carried through, Doctor?" he asked.
Softly, with that certain inalienable note of respect that a robot always used to a human being, the surgeon said, "I am not certain, sir, that I understand how or upon whom such an operation could be performed."