I, Replicant: Artificial Intelligence and Me — A Selective Stroll with AI in Fiction and Beyond

Replicant

I failed the Voight-Kampff test. Granted, it was an online version of the “empathy exam” meant to separate the men from the machines as seen in Blade Runner, so it’s likely I’m the victim of some order of digital chicanery. Even though the test couldn’t monitor my “blush response” and “eye movement” as in the film, the effect was chilling.

I scoured my birth certificate for an “incept date” rather than a birthday. After a hard look in the mirror (while cinematically splashing water on my face), I assured myself I wasn’t a replicant given the raft of imperfections that somehow synergize into my craggy face. Replicants, as a character observes, are “so perfect.” I’m not. The fictional Tyrell Corporation, which produces the organic androids under the confident motto “More human than human,” would have stamped me “reject” and shipped me to a replicant outlet mall.

If I were a replicant, however, I’d hope to be a replicant of the ilk portrayed by Sean Young — impeccably coiffed and inclined to zip off into unused footage of The Shining sooner than Edward James Olmos can say “Too bad she won’t live.” The alternative, of course, is enfant terrible Roy Batty, who reversed Oedipus’ self-inflicted punishment by 180 degrees and gouged out the eyes of his spiritual father Tyrell – killing him – while, ironically, demanding more life.

Tyrell knew he had it coming. If he had read even a shred of science fiction, he would have known the genre’s first tenet regarding man-made men: “Play God and be smitten by thine own creation.” The King James argot is my own nod to the Old Testament-esque symmetry of the notion (you know, where men were made of mud and women of spareribs). However, it was a teenaged Londoner who first explored, then exploited the idea. First published in London in 1818, Mary Wollstonecraft Shelley’s Frankenstein; or, The Modern Prometheus was born of a horror story-writing contest meant to while away a vacation ruined by poor weather.

The contenders were the author’s future husband Percy Bysshe Shelley, Lord Byron and Byron’s doctor. The then 19-year-old handily won with her thrilling tale of a tomb-robbing scientist who creates a life only to lose his to it in a karmic comeuppance. The groundwork, however, was well-trod by a handful of cultural forebears, notably the clay-made Golem of Yiddish folklore (before Tolkien poached its name) and Pygmalion’s formerly marble Galatea. Pinocchio, Carlo Collodi’s morality tale starring animatronic kindling, would continue the tradition in 1881, but with different strings attached.

Putting the I in A.I.

Despite the admonitions of science fiction, artificial intelligence researchers can’t seem to stop themselves from working ever closer to sentience, or at least to the “singularity,” the much-prophesied moment when a superhuman intelligence emerges from technology able to improve itself beyond our ability to comprehend it.

A few embers of this Promethean flame might have ignited the minds of researchers at IBM, who, having had their supercomputer Deep Blue trounced by reigning world chess champion Garry Kasparov in 1996, revved up their machine until it was capable of evaluating 200 million positions per second by the following year. When the two were re-matched, it wasn’t the computer’s brute-force calculating ability that Kasparov found intriguing in his opponent, but rather a single move that occurred relatively early in the match.
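What does it mean to evaluate 200 million positions per second? At bottom, it means searching a game tree: try a move, try the replies, score the resulting positions and keep only the line that survives best play from both sides. The sketch below is a minimal illustration of that idea, minimax search with alpha-beta pruning, written in Python with an invented Position class and a stubbed-out evaluate() method; Deep Blue’s real search ran on custom chess hardware with a far richer, hand-tuned evaluation.

```python
# Toy illustration of game-tree search: minimax with alpha-beta pruning.
# The Position class, its moves and its evaluation are invented stand-ins;
# Deep Blue's actual evaluation ran on purpose-built chess chips.

import math

class Position:
    """A hypothetical game state; replace with a real chess representation."""
    def legal_moves(self):
        return []          # e.g. a list of move objects
    def play(self, move):
        return Position()  # the position that results from the move
    def evaluate(self):
        return 0.0         # heuristic score of the position

def alphabeta(pos, depth, alpha=-math.inf, beta=math.inf, maximizing=True):
    """Return the best achievable score by searching `depth` plies ahead."""
    moves = pos.legal_moves()
    if depth == 0 or not moves:
        return pos.evaluate()
    if maximizing:
        best = -math.inf
        for move in moves:
            best = max(best, alphabeta(pos.play(move), depth - 1, alpha, beta, False))
            alpha = max(alpha, best)
            if alpha >= beta:   # the opponent already has a better option elsewhere,
                break           # so prune the remaining moves
        return best
    else:
        best = math.inf
        for move in moves:
            best = min(best, alphabeta(pos.play(move), depth - 1, alpha, beta, True))
            beta = min(beta, best)
            if alpha >= beta:
                break
        return best
```

Multiplied across special-purpose chips, that loop yields hundreds of millions of positions a second. What happened in the rematch, however, suggested something beyond raw arithmetic.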

During the second of six games, at move 36, Deep Blue defied expectation and forsook a choice that seemed obvious to the gallery of expert spectators in favor of what proved to be a more nuanced position several moves later. The move, according to Kasparov, suggested a conceptual approach, one he had not anticipated from a machine. At that point, Kasparov considered the game over.

Move 36 sounds like something from the “Kama Sutra for Dummies.” I thought it was a great title for a satire about the death dance of man and machine, with titular echoes of Catch-22. Eduardo Kac, a conceptual artist noted for his appropriation of biotechnologies, busted the move first, however, in a work surely more concept than art. A press release for a 2004 Exploratorium exhibit of Kac’s Move 36 announced that “on the chessboard square exactly where Deep Blue made its fateful move sits a genetically modified plant with a synthetic gene whose DNA has been ingeniously translated to represent Descartes’ famous statement, ‘I think, therefore I am,’ using a common computer code.” How this was accomplished is the stuff that android dreams are made of (and why this was accomplished raises troubling questions about arts funding).
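For what it’s worth, the general recipe is less mystical than it sounds. The press release doesn’t spell out the mapping, so the sketch below is a generic illustration of how such an encoding can work rather than Kac’s actual scheme: characters become bits (ASCII, here) and pairs of bits become the four DNA bases.

```python
# A toy text-to-DNA encoding: ASCII bytes -> bits -> bases (2 bits per base).
# A generic illustration only, not Kac's actual "Cartesian gene" scheme.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

def text_to_dna(text: str) -> str:
    bits = "".join(f"{byte:08b}" for byte in text.encode("ascii"))
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_text(dna: str) -> str:
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("ascii")

sentence = "I think, therefore I am"
gene = text_to_dna(sentence)
print(gene)                        # starts 'CAGC...' -- four bases per character
assert dna_to_text(gene) == sentence
```

At four bases per character, the 23-character sentence comes out to a 92-base sequence, well within the range of routine DNA synthesis.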

“The self never belonged as fully to itself as Descartes’ cogito implied or as fully as we want it to,” wrote critic Scott Bukatman of cultural theorist Slavoj Žižek’s suggestion that Blade Runner causes us to confront our own “replicant-status.” I know I confront my own replicant-status every time some spam arrives in my inbox and suggests I upgrade my anatomy. Bukatman’s treatise, a volume in the British Film Institute series, furthers a Cartesian reading of Blade Runner when he credits Philip K. Dick, author of the film’s source material, Do Androids Dream of Electric Sheep?, with naming his replicant-exterminating protagonist Deckard, a homophone of Descartes (if you pronounce the latter with a mouthful of silicon chips). “I think, therefore I am, manmade” might be an apt revision for both Kac’s plant and Deckard, who is all but revealed to be a replicant himself in the Final Cut. Yeah, but who would win a chess match?

Shall We Play a Game?

Interestingly, some aficionados claim the moves that homicidal Roy Batty plays to checkmate Tyrell are from a famous game played in 1851 by the German chess master Adolf Anderssen. It is known to chess enthusiasts as “The Immortal Game,” an apropos citation for a character in search of “more life, fucker” (director Ridley Scott says this is just a coincidence). It is worth noting that replicants can play chess with aplomb but fail a Voight-Kampff questionnaire that posits hypothetical situations requiring a modicum of empathy to answer. Empathy, thus far, remains a distinctly human trait, one that at least some fictional androids have endeavored to comprehend. Data in Star Trek: The Next Generation and Winona Ryder’s compassionately programmed android in Alien: Resurrection attempted this by asking a lot of questions or sharing half-baked observations (“At least there’s part of you that’s human. I’m just… fuck,” laments Ryder). They could just as easily speed-read a library, as the bumbling robot Johnny Five did in the Steve Guttenberg vehicle Short Circuit (though this led to the robot’s existential crisis after reading Pinocchio and Frankenstein back-to-back).

Just the Facts, Ma’am

In 1983, a year following the original release of Blade Runner, researchers underwritten with a $9.8 million grant from the Defense Department’s Orwellian-sounding Information Awareness Office were working on a pragmatic model of artificial intelligence dubbed CYC. A founding member of the project, Douglas Lenat, later formed an Austin-based firm, Cycorp, to oversee CYC, an enormous artificial intelligence project predicated, in part, on teaching a computer common sense. As he wrote in a chapter of MIT’s anthology HAL’s Legacy: 2001’s Computer as Dream and Reality, “A review of the development and implementation of the CYC program shows us how, through applications such as natural language understanding, checking and integrating information in spreadsheets and databases, and finding relevant information in image libraries and on the World Wide Web […]. If you have the necessary common-sense knowledge, you can make the necessary inferences quickly and easily; if you lack it, you can’t solve the problems. Ever.”
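CYC’s real assertions are written in its own formal language, CycL, and its inference engine is vastly more elaborate, but the basic shape of the bet, hand-entered facts plus rules that chain them into new conclusions, can be sketched in a few lines of Python (the facts and rules below are my own toy examples, not CYC’s):

```python
# A toy common-sense knowledge base with forward-chaining inference.
# The handful of facts and two rules here only illustrate the shape of the idea.

facts = {
    ("isa", "Fido", "Dog"),
    ("isa", "Dog", "Mammal"),
    ("isa", "Mammal", "Animal"),
    ("capableOf", "Animal", "Dying"),
}

def infer(facts):
    """Apply the rules repeatedly until no new facts appear (a fixed point)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        for (r1, a, b) in facts:
            for (r2, c, d) in facts:
                # Rule 1: isa is transitive.
                if r1 == "isa" and r2 == "isa" and b == c:
                    new.add(("isa", a, d))
                # Rule 2: capabilities are inherited down the isa hierarchy.
                if r1 == "isa" and r2 == "capableOf" and b == c:
                    new.add(("capableOf", a, d))
        if not new <= facts:
            facts |= new
            changed = True
    return facts

kb = infer(facts)
print(("capableOf", "Fido", "Dying") in kb)  # True: common sense, laboriously derived
```

Scale that loop up to a few million assertions and a few decades of knowledge entry and you have the outline of Lenat’s wager.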

By 2003, the database had swelled to nearly 2 million commonsense notions. Now, the public is invited to help supply CYC’s knowledge-base and improve its “thinking” through a web-based trivia game called the “FACTory.”

“Once you have a truly massive amount of information integrated as knowledge, then the human-software system will be superhuman, in the same sense that mankind with writing is superhuman compared to mankind before writing,” Lenat is quoted on the company’s website Cyc.com.

In an earlier incarnation of CYC’s information acquisition protocol, the computer was taught to ask questions to fill gaps in its knowledge-base. In the mid-80s, CYC apparently asked, “Am I human?”

“Yes, questions,” Roy Batty purrs to Hannibal Chew, the eye-maker. “Will I dream?” asks the supercomputer HAL in 2010, the sequel to 2001: A Space Odyssey. Do Androids Dream of Electric Sheep? asks Philip K. Dick. Or, as RACTER, the computer program credited with writing the 1984 novel The Policeman’s Beard Is Half Constructed, put it: “More than iron, more than lead, more than gold I need electricity. I need it more than I need lamb or pork or lettuce or cucumber. I need it for my dreams.”

Critics, of course, disputed RACTER’s achievement as an assemblage of boilerplate and gibberish. Ay, there’s the rub (or, as semantic satirist Richard Lederer presciently put it, “Tube heat or not tube heat, data congestion”): just how artificial is artificial intelligence?

Director Vikram Jayanti’s documentary Game Over: Kasparov and the Machine adroitly recounts the fateful match between Deep Blue and the world’s then-foremost chess champ. That the documentary suggests the machine may have benefited from the help of at least one of Kasparov’s former competitors during the much-ballyhooed 1997 match is immaterial in terms of how the computer’s victory burnished long-held superstitions about technology’s eventual conquest of humankind.

Replicant vs. Digital Deity

Kasparov’s chess showdown was a 20th century echo of the folk story of railroad “hammer man” John Henry, with brawn replaced by brains – the implication being that technology might someday conquer both our bodies and our minds (even though legend says Henry’s hammer beat the steam-driven machine intended to replace him, he met his maker shortly after).

Perhaps this digital deity would be an all-seeing, all-knowing and merciful entity, a pure intellect that moves fluidly through the transom dividing high technology and tremulous whispers of magic. However, if it modeled itself on anything reminiscent of much of humanity’s application of technology — a record more checkered than Deep Blue’s chessboard — I’ll be hiding with my fellow replicants, shrouded in the darkness of a movie theater as Roy Batty looms from the screen and asks, “Quite an experience to live in fear, isn’t it?”

A Girl, a Gun and an iPhone: All You Need to Make a Movie

Girl and a gun.

Summer movie season is upon us. Well, it’s technically been here since May because, like climate change, Hollywood can adjust the seasons seemingly at will. At your local cinemas, iron-clad playboys flex computer-enhanced muscles whilst spaceships go where no man has gone before – again. It’s a dizzying display of predictable imagineering, so pixel-perfect that it’s hard to remember that cinema used to be a simpler affair.

To provide context for how new moviemaking is relative to the other arts, and how far it’s come, consider that there are tortoises in the Galapagos older than the entire history of cinema. It’s difficult to imagine that movies were once little more than a point-and-shoot deal. According to two innovators in the medium, the basic requirements once were as follows:

A) “All I need to make a comedy is a park, a policeman, and a pretty girl.” – Charlie Chaplin

B) “All you need to make a movie is a girl and a gun.” – Jean-Luc Godard

For convenience’s sake, we might equate “comedy” and “movie,” and likewise reduce the essence of the policeman (authority, force, death) to the gun. So, with some rhetorical contortions, Chaplin and Godard, we might say, agree on the essentials of cinematic storytelling. What about the park, you ask? The one featured in Chaplin’s 1915 one-reeler, “In the Park,” is somewhere in San Francisco and has likely continued this tradition of tramps, cops and pretty girls for the past 100 years, though the cameras are now used for surveillance and the pretty girls are professionals. And sometimes dudes.

In the above model, it seems the only constant in cinema is the girl. With her, three elementary aspects of storytelling reveal themselves: there is an object of desire, some sort of threat and someone in the middle of both. The person in the middle is our hero. Or, as screenwriters are apt to say whilst penning Act II, “the person in the hero is our middle.” Actually, no screenwriter has ever said that, but they should, because it’s both true and just clever-sounding enough to buy one the time to sneak out of the room.

This might be all one needs for a story, you say, but a movie requires moving pictures to tell that story. That entails at least a modicum of technology, like, say, a camera, though as the following filmmaker quotes suggest, that camera needn’t be Chaplin’s hand-cranked Bell & Howell 2709 or Godard’s Eclair Cameflex:

C) “The great hope is that … Some little fat girl in Ohio is going to make a beautiful movie with her father’s camcorder …” – Francis Ford Coppola

D) “Film will only become art when the materials are as inexpensive as pencil and paper.” – Jean Cocteau

Cameras have yet to become as cheap as pencil and paper (unless we’re talking about “The Graf von Faber-Castell Perfect Pencil,” available for a tidy $12,800) but with the right service plan subsidizing your purchase, you can pocket an iPhone for about a hundred bucks.

And I’ll bet you that hundo that the fat girl in Ohio would prefer her dad’s iPhone 5, which shoots 1080p HD video, to ye olde camcorder.

Now, all you need is a pretty girl/guy, a gun/policeman, perhaps a park, a handful of other cliches (like a skin-tight super-suit) and a mega-computer to retrofit your crappy iPhone footage with CGI. There are theoretically three months left in summer (that is, unless the God of Weather gets angry and throws another tempest-tantrum), so you might be able to get your flick in under the wire and enjoy a summer release. Somewhere, an ancient tortoise is shaking its head.

50s vs 80s: Ever Wonder Why the 80s Look Like the 50s? Ask the 70s.


In the dopey hippie-mentors-square-padawan film Flashback, Dennis Hopper, riding easily on his 60s street cred, optimistically observed that “The 90s are going to make the 60s look like the 50s.” Uh, yeah. Somehow, Hopper’s character missed the fact that another era already looked like the 50s — the 80s — thanks to an over-investment in mid-century nostalgia made in the 70s. More to the point, the 80s version of the 50s seems to have supplanted reality, rendering the era as a postmodern play-date sandwiched between the bomb and the pill. And the 80s, too, seem to have become conflated with their own rosy vision of the 50s. The eras are linked, in part, because they bookend the Cold War — that, and Reagan clearly nicked his haircut from the Bob’s Big Boy in Burbank, which had just opened in 1949.

How the 70s made the 80s Look Like the 50s

Consider a recent “Totally 80s”-themed event presented by the Santa Rosa Charter School, the poster for which featured a pair of Ray-Ban Wayfarers, the sunglasses first made iconic in the 1950s by the likes of James Dean and Roy Orbison. On the wane by the early 80s, the brand enjoyed a stratospheric resuscitation after inking a deal with Burbank-based Unique Product Placement, which pimped and subsequently placed the shades in about 300 movies and television shows into the mid-80s (could Risky Business-era Tom Cruise have peered through another brand of sunglasses as darkly?).

Our cultural yen for 50s nostalgia began steeping in the 70s, most notably with George Lucas’ seminal (and best) flick American Graffiti (which is actually set in the early 60s — per its bus ad “Where were you in ’62?”). That Lucas only had to wait 11 years before shooting his 1973 love-letter-to-a-bygone-era is testament to how radically the world had been changed by the 60s.

Likewise, given the cultural baggage of the 70s (Watergate, disco), family-oriented television eagerly embraced Happy Days, which owes a substantial genetic debt to American Graffiti, as well as much of its principal cast. Ditto its spin-off Laverne and Shirley. 1978’s Grease, set 20 years prior to its release, deepened the nostalgia craze with catchy tunes and the momentary resurrection of 50s teen idol Frankie Avalon. Moreover, revival act Sha Na Na had its own show starting in 1977, and Richard O’Brien’s rock opera paean to 50s science fiction double features, The Rocky Horror Picture Show, began its climb to cult status.

Though the bridge to the 50s had been built in the 70s, it took yet another Happy Days spin-off to cross fully into the 80s. And it wasn’t Joanie Loves Chachi. Even more improbably, it was the man from Ork. Robin Williams’ ADHD-afflicted spaceman Mork first appeared in the fifth season of Happy Days in a thinly-veiled launch of the character in his own series, Mork and Mindy, which ran from 1978 to 1982. In at least two more instances, Mork interacted with the Fonz et al., bouncing between both shows and eras because, as he professed, he enjoyed the 50s, when life was more “humdrum.”

50s vs 80s: If Looks Could Kill

The idealized 50s of Richie Cunningham and crew germinated for three years and sprouted as the Back to the Future franchise in 1985. As aspiring rock guitarist Marty McFly, Michael J. Fox departs the 80s and arrives in the 50s via an upgraded DeLorean. And, of course, the Wayfarer-wearing Huey Lewis performed the film’s signature tune “The Power of Love” (Lewis’ “Hip to Be Square,” an ode to social conformity, was later used to better, if chilling, effect in the 80s-set American Psycho).

Thanks to the abundance of 50s imagery, fashion at my 80s-era junior high began to morph, which accounts for the unfortunate outbreak of flat-tops. Just as suddenly, Godzilla tchotchkes demanded shelf space, Peggy Sue got married and 50s-inspired diners spread with a virulence not seen again until the advent of Starbucks. Seth MacFarlane’s gang at Family Guy observed this latter 80s/50s phenomenon in “I Dream of Jesus” (season 7, episode 2). Upon entering a diner done up in 50s decor, Lois observes to her kids, “There’s a lot of history here. 50s diners were really popular in the eighties.”

If Santa Rosa Charter School’s “Totally 80s” event is any indication, the tide of 80s nostalgia is rising. Perhaps they got it right and, instead of skipping down Memory Lane in Sperry Topsiders, wore their Wayfarers at night so as not to be blinded by The Day After. In the real 80s, kids, we didn’t expect a flashback — just a flash.