Francis Quinn broke it with a heavy attempt at lightness, "Bluff. He's making it up as he goes along."
"Are you going to gamble on that, Mr. Quinn?" asked Dr. Calvin, indifferently.
"Well, it's your gamble, really."
"Look here," Lanning covered definite pessimism with bluster, "we've done what you asked. We witnessed the man eat. It's ridiculous to presume him a robot."
"Do you think so?" Quinn shot toward Calvin. "Lanning said you were the expert."
Lanning was almost threatening, "Now, Susan-"
Quinn interrupted smoothly, "Why not let her talk, man? She's been sitting there imitating a gatepost for half an hour."
Lanning felt definitely harassed. From what he experienced then to incipient paranoia was but a step. He said, "Very well. Have your say, Susan. We won't interrupt you."
Susan Calvin glanced at him humorlessly, then fixed cold eyes on Mr. Quinn. "There are only two ways of definitely proving Byerley to be a robot, sir. So far you are presenting circumstantial evidence, with which you can accuse, but not prove - and I think Mr. Byerley is sufficiently clever to counter that sort of material. You probably think so yourself, or you wouldn't have come here.
"The two methods of proof are the physical and the psychological. Physically, you can dissect him or use an X-ray. How to do that would be your problem. Psychologically, his behavior can be studied, for if he is a positronic robot, he must conform to the three Rules of Robotics. A positronic brain can not be constructed without them. You know the Rules, Mr. Quinn?"
She spoke them carefully, clearly, quoting word for word the famous bold print on page one of the "Handbook of Robotics."
"I've heard of them," said Quinn, carelessly.
"Then the matter is easy to follow," responded the psychologist, dryly. "If Mr. Byerley breaks any of those three rules, he is not a robot. Unfortunately, this procedure works in only one direction. If he lives up to the rules, it proves nothing one way or the other."
Quinn raised polite eyebrows, "Why not, doctor?"
"Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of a good many of the world's ethical systems. Of course, every human being is supposed to have the instinct of self-preservation. That's Rule Three to a robot. Also every 'good' human being, with a social conscience and a sense of responsibility, is supposed to defer to proper authority; to listen to his doctor, his boss, his government, his psychiatrist, his fellow man; to obey laws, to follow rules, to conform to custom - even when they interfere with his comfort or his safety. That's Rule Two to a robot. Also, every 'good' human being is supposed to love others as himself, protect his fellow man, risk his life to save another. That's Rule One to a robot. To put it simply - if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man."
"But," said Quinn, "you're telling me that you can never prove him a robot."
"I may be able to prove him not a robot"
"That's not the proof I want."
"You'll have such proof as exists. You are the only one responsible for your own wants."
Here Lanning's mind leaped suddenly to the sting of an idea, "Has it occurred to anyone," he ground out, "that district attorney is a rather strange occupation for a robot? The prosecution of human beings - sentencing them to death - bringing about their infinite harm-"
Quinn grew suddenly keen, "No, you can't get out of it that way. Being district attorney doesn't make him human. Don't you know his record? Don't you know that he boasts that he has never prosecuted an innocent man; that there are scores of people left untried because the evidence against them didn't satisfy him, even though he could probably have argued a jury into atomizing them? That happens to be so."
Lanning's thin cheeks quivered, "No, Quinn, no. There is nothing in the Rules of Robotics that makes any allowance for human guilt. A robot may not judge whether a human being deserves death. It is not for him to decide. He may not harm a human, variety skunk or variety angel."
Susan Calvin sounded tired. "Alfred," she said, "don't talk foolishly. What if a robot came upon a madman about to set fire to a house with people in it? He would stop the madman, wouldn't he?"
"Of course."
"And if the only way he could stop him was to kill him-"
There was a faint sound in Lanning's throat. Nothing more.
"The answer to that, Alfred, is that he would do his best not to kill him. If the madman died, the robot would require psychotherapy because he might easily go mad at the conflict presented him -of having broken Rule One to adhere to Rule One in a higher sense. But a man would be dead and a robot would have killed him."
"Well, is Byerley mad?" demanded Lanning, with all the sarcasm he could muster.
"No, but he has killed no man himself. He has exposed facts which might represent a particular human being to be dangerous to the large mass of other human beings we call society. He protects the greater number and thus adheres to Rule One at maximum potential. That is as far as he goes. It is the judge who then condemns the criminal to death or imprisonment, after the jury decides on his guilt or innocence. It is the jailer who imprisons him, the executioner who kills him. And Mr. Byerley has done nothing but determine truth and aid society.
"As a matter of fact, Mr. Quinn, I have looked into Mr. Byerley's career since you first brought this matter to our attention. I find that he has never demanded the death sentence in his closing speeches to the jury. I also find that he has spoken on behalf of the abolition of capital punishment and contributed generously to research institutions engaged in criminal neurophysiology. He apparently believes in the cure, rather than the punishment of crime. I find that significant."