"Or at least not exactly airsick, but nervous. A mild agoraphobia. It's nothing particularly abnormal, but there it is. So I took the expressways."
Baley felt a sudden sharp interest. "Agoraphobia?"
"I make it sound worse than it is," the roboticist said at once. "It's just the feeling you get in a plane. Have you ever been in one, Mr. Baley?"
"Several times."
"Then you must know what I mean. It's that feeling of being surrounded by nothing; of being separated from - from empty air by a mere inch of metal. It's very uncomfortable."
"So you took the expressway?"
"Yes."
"All the way from Washington to New York?"
"Oh, I've done it before. Since they built the Baltimore-Philadelphia tunnel, it's quite simple."
So it was. Baley had never made the trip himself, but he was perfectly aware that it was possible. Washington, Baltimore, Philadelphia, and New York had grown, in the last two centuries, to the point where all nearly touched. The Four-City Area was almost the official name for the entire stretch of coast, and there were a considerable number of people who favored administrational consolidation and the formation of a single super-City. Baley disagreed with that, himself. New York City by itself was almost too large to be handled by a centralized government. A larger City, with over fifty million population, would break down under its own weight.
"The trouble was," Dr. Gerrigel was saying, "that I missed a connection in Chester Sector, Philadelphia, and lost time. That, and a little difficulty in getting a transient room assignment, ended by making me late."
"Don't worry about that, Doctor. What you say, though, is interesting. In view of your dislike for planes, what would you say to going outside City limits on foot, Dr. Gerrigel?"
"For what reason?" He looked startled and more than a little apprehensive.
"It's just a rhetorical question. I'm not suggesting that you really should. I want to know how the notion strikes you, that's all."
"It strikes me very unpleasantly."
"Suppose you had to leave the City at night and walk cross country for half a mile or more."
"I - I don't think I could be persuaded to."
"No matter how important the necessity?"
"If it were to save my life or the lives of my family, I might try...." He looked embarrassed. "May I ask the point of these questions, Mr. Baley?"
"I'll tell you. A serious crime has been committed, a particularly disturbing murder. I'm not at liberty to give you the details. There is a theory, however, that the murderer, in order to commit the crime, did just what we were discussing; he crossed open country at night and alone. I was just wondering what kind of man could do that."
Dr. Gerrigel shuddered. "No one I know. Certainly not I. Of course, among millions I suppose you could find a few hardy individuals."
"But you wouldn't say it was a very likely thing for a human being to do?"
"No. Certainly not likely."
"In fact, if there's any other explanation for the crime, any other conceivable explanation, it should be considered."
Dr. Gerrigel looked more uncomfortable than ever as he sat bolt upright with his well-kept hands precisely folded in his lap. "Do you have an alternate explanation in mind?"
"Yes. It occurs to me that a robot, for instance, would have no difficulty at all in crossing open country."
Dr. Gerrigel stood up. "Oh, my dear sir!"
"What's wrong?"
"You mean a robot may have committed the crime?"
"Why not?"
"Murder? Of a human being?"
"Yes. Please sit down, Doctor."
The roboticist did as he was told. He said, "Mr. Baley, there are two acts involved: walking cross country, and murder. A human being could commit the latter easily, but would find difficulty in doing the former. A robot could do the former easily, but the latter act would be completely impossible. If you're going to replace an unlikely theory by an impossible one - "
"Impossible is a hell of a strong word, Doctor."
"You've heard of the First Law of Robotics, Mr. Baley?"
"Sure. I can even quote it: A robot may not injure a human being, or, through inaction, allow a human being to come to harm." Baley suddenly pointed a finger at the roboticist and went on, "Why can't a robot be built without the First Law? What's so sacred about it?"
Dr. Gerrigel looked startled, then tittered, "Oh, Mr. Baley."
"Well, what's the answer?"
"Surely, Mr. Baley, if you even know a little about robotics, you must know the gigantic task involved, both mathematically and electronically, in building a positronic brain."
"I have an idea," said Baley. He remembered well his visit to a robot factory once in the way of business. He had seen their library of book-films, long ones, each of which contained the mathematical analysis of a single type of positronic brain. It took more than an hour for the average such film to be viewed at standard scanning speed, condensed though its symbolisms were. And no two brains were alike, even when prepared according to the most rigid specifications. That, Baley understood, was a consequence of Heisenberg's Uncertainty Principle. This meant that each film had to be supplemented by appendices involving possible variations.
Oh, it was a job, all right. Baley wouldn't deny that.
Dr. Gerrigel said, "Well, then, you must understand that a design for a new type of positronic brain, even one where only minor innovations are involved, is not the matter of a night's work. It usually involves the entire research staff of a moderately sized factory and takes anywhere up to a year of time. Even this large expenditure of work would not be nearly enough if it were not that the basic theory of such circuits has already been standardized and may be used as a foundation for further elaboration. The standard basic theory involves the Three Laws of Robotics: the First Law, which you've quoted; the Second Law, which states, 'A robot must obey the orders given it by human beings except where such orders would conflict with the First Law,' and the Third Law, which states, 'A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.' Do you understand?"