Isaac and I
September 14, 2020 at 7:36 am by Frank White
My writing career has been marked by many ups and downs, an experience that is familiar to most authors. One of the high points was the opportunity, during the 1990s, to co-author two books with Isaac Asimov, one of the writers whose work had a great influence on me. Like many of us who are fascinated by space exploration, robots and androids, and the future in general, I was an avid consumer of science fiction from an early age. I'm pretty sure the first science fiction novel I ever read was written by Isaac. In any event, I eventually graduated to I, Robot, The Caves of Steel, Foundation, and much, much more.
In the case of the first co-authored book, Think About Space, I was hired to write the initial draft, and Isaac would follow up with major edits (or so we assumed). The publisher had agreed, as I recall, that we would be co-authors, which was enough for me, though the pay was pretty small. As it turned out, Isaac thought the first draft was so good that I should get full credit as the sole author, something the publisher did not want to do. So Isaac did add edits to the draft, and I was given co-author credit with him, which was a great honor for me.
When it came to the second book, March of the Millennia, the situation was almost completely reversed. Isaac had written the first draft and I was asked to edit it. I put a lot of work into it, but did not expect Isaac to be so generous a second time. However, he requested that I be listed as co-author of that book as well!
I have found that the successful people of this world are often quite humble and very generous to others. So I have done my best to emulate that approach in thought and action with people whom I have met over the years. (I realize that I did not know Isaac well, and I am not trying to write a biography of him; I am focusing only on his kindness to me and his talent as a writer.)
In July of 2018, Rob Godwin, the founder and owner of Apogee Space Books, and I were asked to speak to an online science fiction book club founded by John Grayshaw, director of the Middletown Public Library in Middletown, PA.
We were sent questions, which we then answered, and the session was published on Facebook. If you would like to see the entire transcript, including Rob's answers, just check out the Middletown Library website.
Science Fiction Book Club
Interview with Frank White and Rob Godwin, July 2018
Frank White Answers
Q: Because of ambiguity in human language, there is almost always a disconnect between the intended meaning of a rule and the interpretation of a rule. This is evident in Asimov's three rules of robotics. The rules are written using high-level, ambiguous language. They are moralistic and idealistic rather than formulaic. But they are interpreted very strictly by the robots. Did Asimov intentionally design the laws of robotics this way at the outset, to use the ambiguity between the intended meaning of the rule and the rigid interpretation of the rule as a thematic/plot device? Or did his themes surrounding these rules develop after he came up with the rules?
FW: I am not sure of the answer, but I think it is a bit of both. It seems to me that the key is this: the rules can contradict one another in real situations, which leads to interesting plot twists. I doubt that he saw all the potential issues when he laid out the rules, though he could see ahead to some of them. As he wrote, the results evolved, which is usually true of any form of fiction.
Q: Rigid interpretation of the rules in Asimov's stories often leads to dire consequences. Were the rules intended to show that common sense must be an ingredient for laws to function properly?
FW: I believe so. Actually, we humans are not so different from the robots. We have laws (rules) that we try to obey, but then we have real situations to consider. For example, “Thou Shalt Not Kill” seems straightforward, but if an intruder breaks into your home at night, what do you do? Well, then we have more laws about that, and perhaps we get into self-defense. But what if the intruder did not have a gun? And so on.
Q: Will robots in the future actually follow Asimov’s laws? Did Asimov believe they would? Or was it just good for storytelling?
FW: People involved with artificial intelligence (AI) research bring up Asimov’s Laws all the time. There seems to be a feeling that he has done as good a job as any in creating robots we do not have to fear. But then, the same thing happens in reality as in the stories. It gets complicated and everyone says, “We need something better.”
Q: Why did Asimov move away from the Foundation and Robot series after the late 50s, and why did he return to them in the 80s?
FW: I don’t know, but as a writer, I can speculate. Topics for fiction bubble up from the subconscious; they can have a lot of momentum for a long time, then dry up, only to reemerge later. That may have happened. Also, writers pick up on the environment of the time, and the 80s might have been more conducive to those topics than the 60s and 70s were.
Q: What was the inspiration for the development of robots? How did he come up with the 3, and ultimately 4, robotic laws? Did he plan for the underlying plot with the robots in the full Foundation series, or was it something that developed as the story developed? And along that line, was Foundation planned all the way through to its conclusion, or was that developed as each book was published?
FW: I have an answer to the first question, but not to the others. Regarding the first question, I interviewed Asimov for my book, The SETI Factor, in 1989. The book is out of print, but most of the interview is in the book and you might find it interesting. I suggested that humans had evolved from automatically fearing aliens to being more comfortable with them. He said “I hope you’re right. Our experience rests in the European exploration of the world, in which we enslaved the natives we found and then killed them off. We expect the aliens to be as bad as the Europeans were, but have now learned that it isn’t right to kill off natives or even an endangered species.”
So he saw our fear of aliens as being a projection of our own worst behavior and as we behaved better, our projections became more benign. Anyway, he did not want to write science fiction that showed aliens as evil. That is what led to the robots. Here is a footnote from the book: “In an interview with the author, Asimov explained that John Campbell, perhaps the most important science fiction editor at the time, mandated that humans should always win out over extraterrestrials in any conflicts or competitions they might have. Asimov did not want to cooperate with this dictum, so he created two series that had no extraterrestrials in them.” These were, of course, the Foundation series and the robot series.
Regarding the whole question of planning, I would share my experience, once again, as a writer. I wrote a novel many years ago about contact with extraterrestrials called Decision: Earth. As I continued writing it, I became increasingly interested in what I called at the time “computer agents.” These were AIs like the Siris or Alexas of today. I didn’t plan it; it just happened. I don’t know if Isaac planned it out or if it evolved, but I suspect the latter.
Q: Did he start out with the intention of connecting so many of his novels into one universe?
FW: Again, I don’t know, but he probably started out thinking of them separately and then saw the value of connecting them.
Q: Would he have identified more with the Spacers or those who remained behind on Earth?
FW: A great question. If I am right that he was agoraphobic, I think he would have identified with those who stayed on Earth. Also, the writing implies a certain degree of skepticism regarding how dependent the Spacers become on their robots. They were somewhat like slave owners, I think.
Q: Is anyone planning on reissuing the books Asimov on Science Fiction and/or Asimov's Galaxy: Reflections on Science Fiction, or at least some of the essays in these two books? Asimov is known for his essays as well as his science fiction, but I think some of his most interesting essays are on the subject of science fiction itself.
FW: I don’t know. Walker Publishing, which published the two books I co-authored with him, had plans for reissuing a lot of his work when he died. I was supposed to help them with the project and was saddened at his passing for so many reasons, but partially because it meant that initiative would not happen.
Q: I’m a big fan and have read all of his SF, most of it several times. Would you consider him a better writer of engaging human characters, or engaging robotic characters? Why were his stories relatively “dry” of emotion and pathos?
FW: I think he was better at creating robot characters than at creating human characters. As I read more and more of his Foundation work, it seemed to me that the robots were evolving and becoming better than humans. Perhaps he intended this to be the case. In any event, he was first and foremost a scientist, and he may have felt more comfortable with rationality than with emotion.