"But the judgments which are required are very complicated."
"Very. The necessity of making such judgments slowed the reactions of our first couple of models to the point of paralysis. We improved matters in the later models at the cost of introducing so many pathways that the robot's brain became far too unwieldy. In our last couple of models, however, I think we have what we want. The robot doesn't have to make an instant judgment of the worth of a human being and the value of its orders. It begins by obeying all human beings as any ordinary robot would and then it learns. A robot grows, learns and matures. It is the equivalent of a child at first and must be under constant supervision. As it grows, however, it can, more and more, be allowed, unsupervised, into Earth's society. Finally, it is a full member of that society."
"Surely this answers the objections of those who oppose robots."
"No," said Harriman angrily. "Now they raise others. They will not accept judgments. A robot, they say, has no right to brand this person or that as inferior. By accepting the orders of A in preference to those of B, B is branded as of less consequence than A and his human rights are violated."
"What is the answer to that?"
"There is none. I am giving up."
"I see."
"As far as I myself am concerned...Instead, I turn to you, George."
"To me?" George Ten's voice remained level. There was a mild surprise in it but it did not affect him outwardly. "Why to me?"
"Because you are not a man," said Harriman tensely. "I told you I want robots to be the partners of human beings. I want you to be mine."
George Ten raised his hands and spread them, palms outward, in an oddly human gesture. "What can I do?"
"It seems to you, perhaps, that you can do nothing, George. You were created not long ago, and you are still a child. You were designed to be not overfull of original information-which is why I have had to explain the situation to you in such detail-in order to leave room for growth. But you will grow in mind and you will come to be able to approach the problem from a non-human standpoint. Where I see no solution, you, from your own other standpoint, may see one."
George Ten said, "My brain is man-designed. In what way can it be non-human?"
"You are the latest of the JG models, George. Your brain is the most complicated we have yet designed, in some ways more subtly complicated than that of the old giant Machines. It is open-ended and, starting on a human basis, may-no, will-grow in any direction. Remaining always within the insurmountable boundaries of the Three Laws, you may yet become thoroughly non-human in your thinking."
"Do I know enough about human beings to approach this problem rationally? About their history? Their psychology?"
"Of course not. But you will learn as rapidly as you can."
"Will I have help, Mr. Harriman?"
"No. This is entirely between ourselves. No one else knows of this and you must not mention this project to any human being, either at U. S. Robots or elsewhere."
George Ten said, "Are we doing wrong, Mr. Harriman, that you seek to keep the matter secret?"
"No. But a robot solution will not be accepted, precisely because it is robot in origin. Any suggested solution you have you will turn over to me; and if it seems valuable to me, I will present it. No one will ever know it came from you."
"In the light of what you have said earlier," said George Ten calmly, "this is the correct procedure...When do I start?"
"Right now. I will see to it that you have all the necessary films for scanning."
1a.
Harriman sat alone. In the artificially lit interior of his office, there was no indication that it had grown dark outside. He had no real sense that three hours had passed since he had taken George Ten back to his cubicle and left him there with the first film references.
He was now merely alone with the ghost of Susan Calvin, the brilliant roboticist who had, virtually single-handed, built up the positronic robot from a massive toy to man's most delicate and versatile instrument; so delicate and versatile that man dared not use it, out of envy and fear.
It was over a century now since she had died. The problem of the Frankenstein complex had existed in her time, and she had never solved it. She had never tried to solve it, for there had been no need. Robotics had expanded in her day with the needs of space exploration.
It was the very success of the robots that had lessened man's need for them and had left Harriman, in these latter times-
But would Susan Calvin have turned to robots for help? Surely, she would have-
And he sat there long into the night.
2.
Maxwell Robertson was the majority stockholder of U. S. Robots and in that sense its controller. He was by no means an impressive person in appearance. He was well into middle age, rather pudgy, and had a habit of chewing on the right corner of his lower lip when disturbed.
Yet in his two decades of association with government figures he had developed a way of handling them. He tended to use softness, giving in, smiling, and always managing to gain time.
It was growing harder. Gunnar Eisenmuth was a large reason for its having grown harder. In the series of Global Conservers, whose power had been second only to that of the Global Executive during the past century, Eisenmuth hewed most closely to the harder edge of the gray area of compromise. He was the first Conserver who had not been American by birth and though it could not be demonstrated in any way that the archaic name of U. S. Robots evoked his hostility, everyone at U. S. Robots believed that.
There had been a suggestion, by no means the first that year-or that generation-that the corporate name be changed to World Robots, but Robertson would never allow that. The company had been originally built with American capital, American brains, and American labor, and though the company had long been worldwide in scope and nature, the name would bear witness to its origin as long as he was in control.