"With each individual beetle as alive as the squirrel and as yourself?"
"As complete and independent an organism as any other, within the total ecology. There are smaller organisms still; many too small to see."
"And that is a tree, is it not? And it is hard to the touch-"
4a.
The pilot sat alone. He would have liked to stretch his own legs but some dim feeling of safety kept him in the dyna-foil. If that robot went out of control, he intended to take off at once. But how could he tell if it went out of control?
He had seen many robots. That was unavoidable considering he was Mr. Robertson's private pilot. Always, though, they had been in the laboratories and warehouses, where they belonged, with many specialists in the neighborhood.
True, Dr. Harriman was a specialist. None better, they said. But a robot here was where no robot ought to be; on Earth; in the open; free to move-He wouldn't risk his good job by telling anyone about this-but it wasn't right.
5.
George Ten said, "The films I have viewed are accurate in terms of what I have seen. Have you completed those I selected for you, Nine?"
"Yes," said George Nine. The two robots sat stiffly, face to face, knee to knee, like an image and its reflection. Dr. Harriman could have told them apart at a glance, for he was acquainted with the minor differences in physical design. If he could not see them, but could talk to them, he could still tell them apart, though with somewhat less certainty, for George Nine's responses would be subtly different from those produced by the substantially more intricately patterned positronic brain paths of George Ten.
"In that case," said George Ten, "give me your reactions to what I will say. First, human beings fear and distrust robots because they regard robots as competitors. How may that be prevented?"
"Reduce the feeling of competitiveness," said George Nine, "by shaping the robot as something other than a human being."
"Yet the essence of a robot is its positronic replication of life. A replication of life in a shape not associated with life might arouse horror."
"There are two million species of life forms. Choose one of those as the shape rather than that of a human being."
"Which of all those species?" George Nine's thought processes proceeded noiselessly for some three seconds. "One large enough to contain a positronic brain, but one not possessing unpleasant associations for human beings."
"No form of land life has a braincase large enough for a positronic brain but an elephant, which I have not seen, but which is described as very large, and therefore frightening to man. How would you meet this dilemma?"
"Mimic a life form no larger than a man but enlarge the braincase."
George Ten said, "A small horse, then, or a large dog, would you say? Both horses and dogs have long histories of association with human beings."
"Then that is well."
"But consider-A robot with a positronic brain would mimic human intelligence. If there were a horse or a dog that could speak and reason like a human being, there would be competitiveness there, too. Human beings might be all the more distrustful and angry at such unexpected competition from what they consider a lower form of life."
George Nine said, "Make the positronic brain less complex, and the robot less nearly intelligent."
"The complexity bottleneck of the positronic brain rests in the Three Laws. A less complex brain could not possess the Three Laws in full measure."
George Nine said at once, "That cannot be done."
George Ten said, "I have also come to a dead end there. That, then, is not a personal peculiarity in my own line of thought and way of thinking. Let us start again... Under what conditions might the Third Law not be necessary?"
George Nine stirred as if the question were difficult and dangerous. But he said, "If a robot were never placed in a position of danger to itself; or if a robot were so easily replaceable that it did not matter whether it were destroyed or not."
"And under what conditions might the Second Law not be necessary?"
George Nine's voice sounded a bit hoarse. "If a robot were designed to respond automatically to certain stimuli with fixed responses and if nothing else were expected of it, so that no order need ever be given it."
"And under what conditions"-George Ten paused here- "might the First Law not be necessary?"
George Nine paused longer and his words came in a low whisper, "If the fixed responses were such as never to entail danger to human beings."
"Imagine, then, a positronic brain that guides only a few responses to certain stimuli and is simply and cheaply made-so that it does not require the Three Laws. How large need it be?"
"Not at all large. Depending on the responses demanded, it might weigh a hundred grams, one gram, one milligram."
"Your thoughts accord with mine. I shall see Dr. Harriman."
5a.
George Nine sat alone. He went over and over the questions and answers. There was no way in which he could change them. And yet the thought of a robot of any kind, of any size, of any shape, of any purpose, without the Three Laws, left him with an odd, discharged feeling.
He found it difficult to move. Surely George Ten had a similar reaction. Yet he had risen from his seat easily.
6.
It had been a year and a half since Robertson had been closeted with Eisenmuth in private conversation. In that interval, the robots had been taken off the Moon and all the far-flung activities of U. S. Robots had withered. What money Robertson had been able to raise had been placed into this one quixotic venture of Harriman's.