Daneel said, "It would not be possible at all for any human being."
"Why not?"
"Surely, Partner Elijah, you are aware that the robotic skeleton is metallic in nature and much stronger than human bone. Our movements are more strongly powered, faster, and more delicately controlled. The Third Law of Robotics states: 'A robot must protect its own existence.' An assault by a human being could easily be fended off. The strongest human being could be immobilized. Nor is it likely that a robot can be caught unaware. We are always aware of human beings. We could not fulfill our functions otherwise."
Baley said, "Come now, Daneel. The Third Law states: 'A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.' The Second Law states: 'A robot must obey the orders given it by a human being, except where such orders would conflict with the First Law.' And the First Law states: 'A robot may not injure a human being or, through inaction, allow a human being to come to harm.' A human being could order a robot to destroy himself - and a robot would then use his own strength to smash his own skull. And if a human being attacked a robot, that robot could not fend off the attack without harming the human being, which would violate First Law."
Daneel said, "You are, I suppose, thinking of Earth's robots. On Aurora - or on any of the Spacer worlds - robots are regarded more highly than on Earth, and are, in general, more complex, versatile, and valuable. The Third Law is distinctly stronger in comparison to the Second Law on Spacer worlds than it is on Earth. An order for self-destruction would be questioned and there would have to be a truly legitimate reason for it to be carried through - a clear and present danger. And in fending off an attack, the First Law would not be violated, for Auroran robots are deft enough to immobilize a human being without hurting him."
"Suppose, though, that a human being maintained that, unless a robot destroyed himself, he - the human being - would be destroyed? Would not the robot then destroy himself?"
"An Auroran robot would surely question a mere statement to that effect. There would have to be clear evidence of the possible destruction of a human being."
"Might not a human being be sufficiently subtle to so arrange matters in such a way as to make it seem to a robot that the human being was indeed in great danger? Is it the ingenuity that would be required that makes you eliminate the unintelligent, inexperienced, and young?"
And Daneel said, "No, Partner Elijah, it is not."
"Is there an error in my reasoning?"
"None."
"Then the error may be in my assumption that he was physically damaged. He was not, in actual fact, physically damaged. Is that right?"
"Yes, Partner Elijah."
(That meant Demachek had had her facts straight, Baley thought.)
"In that case, Daneel, Jander was mentally damaged. Roblock! Total and irreversible!"
"Roblock?"
"Short for robot-block, the permanent shutdown of the functioning of the positronic pathways."
"We do not use the word 'roblock' on Aurora, Partner Elijah."
"What do you say?"
"We say 'mental freeze-out'."
"Either way, it is the same phenomenon being described."
"It might be wise, Partner Elijah, to use our expression or the Aurorans you speak to may not understand; conversation may be impeded. You stated a short while ago that different words make a difference."
"Very well. I will say 'freeze-out'. - Could such a thing happen spontaneously?"
"Yes, but the chances are infinitesimally small, roboticists say. As a humaniform robot, I can report that I have never myself experienced any effect that could even approach mental freeze-out."
"Then one must assume that a human being deliberately set up a situation in which mental freeze-out would take place."
"That is precisely what Dr. Fastolfe's opposition contends, Partner Elijah."
"And since this would take robotic training, experience, and skill, the unintelligent, the inexperienced, and the young cannot have been responsible."
"That is the natural reasoning, Partner Elijah."
"It might even be possible to list the number of human beings on Aurora with sufficient skill and thus set up a group of suspects that might not be very large in number."
"That has, in actual fact, been done, Partner Elijah."
"And how long is the list?"
"The longest list suggested contains only one name."
It was Baley's turn to pause. His brows drew together in an angry frown and he said, quite explosively, "Only one name?"
Daneel said quietly, "Only one name, Partner Elijah. That is the judgment of Dr. Han Fastolfe, who is Aurora's greatest theoretical roboticist."
"But what is, then, the mystery in all this? Whose is the one name?"
R. Daneel said, "Why, that of Dr. Han Fastolfe, of course. I have just stated that he is Aurora's greatest theoretical roboticist and, in Dr. Fastolfe's professional opinion, he himself is the only one who could possibly have maneuvered Jander Panell into total mental freeze-out without leaving any sign of the process. However, Dr. Fastolfe also states that he did not do it."
"But that no one else could have, either?"
"Indeed, Partner Elijah. There lies the mystery."
"And what if Dr. Fastolfe - " Baley paused. There would be no point in asking Daneel if Dr. Fastolfe was lying or was somehow mistaken, either in his own judgment that no one but he could have done it or in the statement that he himself had not done it. Daneel had been programmed by Fastolfe and there would be no chance that the programming included the ability to doubt the programmer.