Logical Surprise

The entire Internet, as well as the types of devices represented by the desktop computer, the laptop computer, the iPhone, the iPod, and the iPad, is a continuing, inescapable embarrassment to science fiction, and an object lesson in the fallibility of genre writers and their vaunted predictive abilities. (Yes, yes, we all know that "SF is not about predicting things." But have you ever seen any writer turn down credit when they do hit the fortune-telling bullseye?)

Hardly a single story in the genre prior to, oh, say, 1970, exhibited an accurate handle on computers. As a rule, there were no far-sighted, speculative depictions of the devices' miniaturization, ubiquity, influence, and utility that would prefigure the landscape of 2010 as we know it. Oh, sure, you can point to a few isolated instances of authors writing on the digital cutting edge. One example that is trotted out regularly, like a token Cassandra-accurate economist amidst boom-inflating hedge fund managers, is Murray Leinster and his story, "A Logic Named Joe," from 1946:

You got a logic in your house. It looks like a vision receiver used to, only it's got keys instead of dials and you punch the keys for what you wanna get. It's hooked in to the tank, which has the Carson Circuit all fixed up with relays. Say you punch "Station SNAFU" on your logic. Relays in the tank take over an' whatever vision-program SNAFU is telecastin' comes on your logic's screen. Or you punch "Sally Hancock's Phone" an' the screen blinks an' sputters an' you're hooked up with the logic in her house an' if somebody answers you got a vision-phone connection. But besides that, if you punch for the weather forecast or who won today's race at Hialeah or who was mistress of the White House durin' Garfield's administration or what is PDQ and R sellin' for today, that comes on the screen too. The relays in the tank do it. The tank is a big buildin' full of all the facts in creation an' all the recorded telecasts that ever was made—an' it's hooked in with all the other tanks all over the country—an' everything you wanna know or see or hear, you punch for it an' you get it. Very convenient. Also it does math for you, an' keeps books, an' acts as consultin' chemist, physicist, astronomer, an' tea-leaf reader, with a "Advice to the Lovelorn" thrown in.

But for every Leinster there were a thousand other writers with their heads buried in the sand, such as the otherwise on-target Robert Heinlein, whose character Andrew Jackson "Slipstick" Libby, famed mathematical genius, helped pilot starships—with his slide rule!

It was not until the appearance of cyberpunk in the 1980s that SF began to grapple in a broadly meaningful way with the reality of computers as something other than giant mainframes tended by crewcut IBM nerds. But the irony—and the point of the aforementioned lesson—is that information about the potential paradigm-shattering role that computers might play in society was extant as early as the late 1930s, coincident with the birth pangs of actual computers.

Admittedly, it wasn't headline material in the daily newspapers. But any SF writer of that era—and of subsequent decades—with the willingness to dig into the scientific and industrial and military journals would have found a rich vein of extrapolative material that would have allowed a more sharp-eyed assessment of where computers might be heading. While there were indeed secrets involved in early computer technology that would not emerge for decades, the suggestive, extendable mainline of the technological arc was there for the winkling-out. Had SF authors of the period been inclined to investigate, the whole course of the genre would have been altered. But, just as today, commercial regurgitation of received ideas trumped pioneering ideation based on hard facts.

What exactly were the public details surrounding the invention of the modern computer? Thanks to author Jane Smiley, best known for such literary excursions as her Pulitzer-winning novel A Thousand Acres, we can now get a comprehensive overview of that exciting period through her newest book, The Man Who Invented the Computer. She follows the John McPhee-perfected recipe for historical journalism nicely and with élan: take an abstruse subject, research it deeply, then humanize it tenderly, adding off-kilter insights and sharp portraits of the curious folks involved.

Smiley's book is subtitled "The Biography of John Atanasoff, Digital Pioneer." And while the named subject does indeed occupy center stage, the narrative covers far more ground than one man's life, stretching from the early years of the twentieth century (Atanasoff's youth) up to a pivotal court decision in 1973. As Smiley says in her introduction, the book is like four movies playing simultaneously.

First come the character portrait and career outline of Atanasoff, a cornfed Edison of sorts. It's a tale out of Sinclair Lewis, as if replayed by Hugo Gernsback. We see the forces that shaped young Atanasoff and his remarkable epiphany in 1937, which led to the construction of the first workable, practical electronic computer. We follow his retreat from the field, his long hegira in other realms of expertise, and his eventual return in the 1960s to claim his proper credit.

The second narrative is a fairly well-known one, involving Alan Turing, the superstar of the field. Smiley, nodding to the familiarity of Turing's life, gives him just enough coverage to place him in context. Here we have something out of Eric Ambler or John Buchan. But the third strand is definitely the weirdest. It's the saga of Konrad Zuse, an isolated, eccentric German trying to invent a computer out of junk parts prior to and during WWII. This bit reminds me of Gravity's Rainbow, and I kept waiting for Tyrone Slothrop to appear around every bend of the subplot.

Lastly we get what might be termed the "institutional/big business" side of the tale. Inventors Mauchly and Eckert, having ripped off Atanasoff, produce ENIAC and other computing machines with the help of the military, corporations, and famous scientists such as John von Neumann, opening the floodgates for a million digital flowers to bloom, until a major trial, decided in 1973, restores Atanasoff's honor and precedence. This segment might have been authored by Norman Mailer handing off to John Grisham.

Smiley blends all these convergent and parallel narratives into a superb whole, as fetching and gripping as any novel. She displays an unwavering, cogent grasp of all the technical details, a keen eye for historical forces, and much psychological insight; her prose is a model of smooth transparency. Anyone who wants to understand the roots of our twenty-first century digital culture needs to read this book.

But if science fiction's track record for predicting the computer's path to world domination is a poor one, that doesn't mean the genre isn't catching on. To see how computers are being portrayed in near-future scenarios, it's worth having a look at Robert Sawyer's WWW: Watch, a sequel to WWW: Wake.

Sawyer's earlier book introduced us to teenaged heroine Caitlin Decter, whose blindness is being treated by an experimental new technology that inadvertently puts her in communication with the rudimentary but evolving intelligence bootstrapping itself out of the World Wide Web. She dubs it Webmind, and a strange friendship begins.

The notion of an autonomous cybermind arising spontaneously as an emergent property of complexly networked systems is hardly new. Perhaps the first full instantiation of the trope occurred in Heinlein's The Moon Is a Harsh Mistress (thereby restoring to the Grand Master some of the speculative street cred he lost with "Slipstick" Libby). Curiously enough, the same year we got the Heinlein novel, 1966, we also received D. F. Jones's Colossus, which employed the same concept. After that watershed the trope was firmly in place, surfacing at regular intervals, with one other notable early instance being David Gerrold's When HARLIE Was One. Nowadays, when such a concept is invoked, it's usually tied to the notion of the "Singularity" (the postulated moment when the distinction between human and machine minds will vanish) and posthumanism, a route Sawyer seems disinclined to follow, hewing instead to more old-fashioned developments.

Sawyer has never been a flashy or far-out writer. No transcendent leaps or gonzo forays into SF surrealism for him. His preferred mode is the methodical, step-by-step unfolding of a solid idea, with verisimilitude given priority. Consequently, much of the first two volumes of this projected trilogy will strike more seasoned readers of the genre as highly familiar and unadventurous. I suspect that even those whose acquaintance with SF is limited to first-generation Star Trek reruns will not have their minds blown.

But on the other hand, Sawyer's cautious, slow approach, homely details, and plain-spoken prose succeed in creating an introductory-level text that has the virtue of making the whole concept of machine intelligence seem highly probable and comprehensible. Writing alternating passages in the voice of Webmind, Sawyer crafts a sincere portrait of non-human intelligence and perceptions, developing alongside his likeable human characters. Caitlin and Webmind mature and evolve in parallel, illustrating both the differences and the consanguinity of the two classes of intelligence and self, organic and electronic.

Sawyer's book is low on action sequences. A bit of thriller-style suspense comes from the presence of WATCH, a government agency charged with monitoring suspicious doings on the Internet. They naturally become aware of Webmind, with predictable hostile reactions. But the conflict embodied in their response is outweighed by the discursiveness of the rest of the story. In true Asimovian fashion, the play of ideas as they emerge in rational conversation forms the real excitement for Sawyer. The reader will exit this novel feeling that the computer—a gadget so fortuitously and aleatorily invented, as Smiley shows us—was somehow predestined to emerge as mankind's true companion.

About the Columnist
Author of several acclaimed novels and story collections, including Fractal Paisleys, Little Doors, and Neutrino Drag, Paul Di Filippo was nominated for a Sturgeon Award, a Hugo Award, and a World Fantasy Award -- all in a single year. William Gibson has called his work "spooky, haunting, and hilarious." His reviews have appeared in The Washington Post, Science Fiction Weekly, Asimov's Magazine, and The San Francisco Chronicle.
