"Ordered to by whom?" asked Robertson in honest astonishment. (No one ever told him anything, he thought resentfully. These research people considered themselves the owners of U. S. Robots, by God!)
"By the plaintiff," said Dr. Calvin.
"In heaven's name, why?"
"I don't know why yet. Perhaps just that we might be sued, that he might gain some cash." There were blue glints in her eyes as she said that.
"Then why doesn't Easy say so?"
"Isn't that obvious? It's been ordered to keep quiet about the matter."
"Why should that be obvious?" demanded Robertson truculently.
"Well, it's obvious to me. Robot psychology is my profession. If Easy will not answer questions about the matter directly, he will answer questions on the fringe of the matter. By measuring increased hesitation in his answers as the central question is approached, by measuring the area of blankness and the intensity of counterpotentials set up, it is possible to tell with scientific precision that his troubles are the result of an order not to talk, with its strength based on First Law. In other words, he's been told that if he talks, harm will be done a human being. Presumably harm to the unspeakable Professor Ninheimer, the plaintiff, who, to the robot, would seem a human being."
"Well, then," said Robertson, "can't you explain that if he keeps quiet, harm will be done to U. S. Robots?"
"U. S. Robots is not a human being and the First Law of Robotics does not recognize a corporation as a person the way ordinary laws do. Besides, it would be dangerous to try to lift this particular sort of inhibition. The person who laid it on could lift it off least dangerously, because the robot's motivations in that respect are centered on that person. Any other course-" She shook her head and grew almost impassioned. "I won't let the robot be damaged!"
Lanning interrupted with the air of bringing sanity to the problem. "It seems to me that we have only to prove a robot incapable of the act of which Easy is accused. We can do that."
"Exactly," said Defense, in annoyance. "You can do that. The only witnesses capable of testifying to Easy's condition and to the nature of Easy's state of mind are employees of U. S. Robots. The judge can't possibly accept their testimony as unprejudiced."
"How can he deny expert testimony?"
"By refusing to be convinced by it. That's his right as the judge. Against the alternative that a man like Professor Ninheimer deliberately set about ruining his own reputation, even for a sizable sum of money, the judge isn't going to accept the technicalities of your engineers. The judge is a man, after all. If he has to choose between a man doing an impossible thing and a robot doing an impossible thing, he's quite likely to decide in favor of the man."
"A man can do an impossible thing," said Lanning, "because we don't know all the complexities of the human mind and we don't know what, in a given human mind, is impossible and what is not. We do know what is really impossible to a robot."
"Well, we'll see if we can't convince the judge of that," Defense replied wearily.
"If all you say is so," rumbled Robertson, "I don't see how you can."
"We'll see. It's good to know and be aware of the difficulties involved, but let's not be too downhearted. I've tried to look ahead a few moves in the chess game, too." With a stately nod in the direction of the robopsychologist, he added, "With the help of the good lady here."
Lanning looked from one to the other and said, "What the devil is this?"
But the bailiff thrust his head into the room and announced somewhat breathlessly that the trial was about to resume.
They took their seats, examining the man who had started all the trouble.
Simon Ninheimer owned a fluffy head of sandy hair, a face that narrowed past a beaked nose toward a pointed chin, and a habit of sometimes hesitating before key words in his conversation that gave him an air of a seeker after an almost unbearable precision. When he said, "The Sun rises in the-uh-east," one was certain he had given due consideration to the possibility that it might at some time rise in the west.
Prosecution said, "Did you oppose employment of Robot EZ-27 by the university?"
"I did, sir."
"Why was that?"
"I did not feel that we understood the-uh-motives of U. S. Robots thoroughly. I mistrusted their anxiety to place the robot with us."
"Did you feel that it was capable of doing the work that it was allegedly designed to do?"
"I know for a fact that it was not."
"Would you state your reasons?"
Simon Ninheimer's book, entitled Social Tensions Involved in Space-Flight and Their Resolution, had been eight years in the making. Ninheimer's search for precision was not confined to his habits of speech, and in a subject like sociology, almost inherently imprecise, it left him breathless.
Even with the material in galley proofs, he felt no sense of completion. Rather the reverse, in fact. Staring at the long strips of print, he felt only the itch to tear the lines of type apart and rearrange them differently.
Jim Baker, Instructor and soon to be Assistant Professor of Sociology, found Ninheimer, three days after the first batch of galleys had arrived from the printer, staring at the handful of paper in abstraction. The galleys came in three copies: one for Ninheimer to proofread, one for Baker to proofread independently, and a third, marked "Original," which was to receive the final corrections, a combination of those made by Ninheimer and by Baker, after a conference at which possible conflicts and disagreements were ironed out. This had been their policy on the several papers on which they had collaborated in the past three years and it worked well.