The Science of Subversion: MIT's Computing Culture Group

Weekly Dig, November 2003, Carolyn Johnson

CCG circa 2005

The basement of the Visual Arts Center at MIT is a giant rec room for the technologically gifted. Clear plastic drawers filled with brightly colored Lego-like blocks line the walls. A large reproduction of “Guernica,” Picasso's grisly commentary on the Spanish Civil War, screams out from a far corner. Dogs wander around the room, weaving between clusters of comfy chairs and couches where students work or schmooze. Chris Csikszentmihalyi, the blue-eyed, sandy-haired principal investigator of MIT's Computing Culture Group (CCG), remarks that a visitor once described the scene to him as “Montessori for adults.”

MIT's Media Lab, which hosts the CCG in its postmodern playroom, is a bizarre department that blends technology, art and agitprop in idiosyncratic and sometimes nonsensical ways. Csikszentmihalyi's research is only one of the many projects that the Media Lab supports, which include everything from Robotic Life to Lifelong Kindergarten. These aren't the kind of projects that most people envision when they think of a laboratory - especially at the nation's foremost technological institution. There are no white lab coats, no sterile black countertops, and little of the deafening white noise associated with minus-80-degree freezers or gigantic centrifuges. Students here don't string together theories about the origins of the universe; instead they contrive high-tech ways to jolt governments, scientists and ordinary people out of their complacency toward machines. According to Csikszentmihalyi, “MIT's got so much momentum that it can do things like the Media Lab, that are pretty darn weird, and it doesn't hurt the credibility of MIT that much.” In an official MIT overview of the lab, potential investors are advised that, “Many sponsors find the Media Lab to be a uniquely valuable resource for conducting research that is too costly or too 'far out.'”

Far out is only one way of describing the research that Csikszentmihalyi and his group are involved with - another is “quixotic technologies.” Still another is “subversive.” In fact, on the surface, many of the projects - be they remote-controlled Afghan war journalists, counter-surveillance measures for everyday folks, or a dynamic public database containing massive amounts of personal information about politicians - just look like pranks played on the Bush administration.

But these projects aren't mere pranks, and they aren't (necessarily) motivated by anger at the U.S. government. Csikszentmihalyi's politically charged experiments are based on philosophical arguments. His research, he says, is “motivated by who gets to make reality.”

When Csikszentmihalyi started out, his milieu was the future. He thought up disastrous scenarios that could develop from current social trends, and then developed appropriate technological solutions for those scenarios. It was a strange blend of technology, art, and activism. “For instance,” he says, “one was 'what if the Christian right came to power,' what would be a technology that would be appropriate for that situation?”

He called his solution The Scopes Beacon (after John Scopes, the Tennessee school teacher who was charged with illegally teaching evolution in his biology class in 1925).

“I was thinking okay, what if there is a pending Handmaid's Tale kind of period where evolution isn't allowed to be taught. [The Scopes Beacon] was a time capsule I left under the Clarence Darrow Memorial Bridge [named for the lawyer who successfully defended John Scopes] in Chicago.”

The time capsule would illustrate the concept of evolution to the people of 2046, who might presumably have forgotten it in a country controlled by Christian fundamentalists. It teaches evolution by demonstration: a string of code evolves into letters and words over many cycles (generations). A sequence of random characters is “selected” and mutated over many trials, retaining the characters that match a target model and so survive the selection process. After 40-60 generations, the symbols and letters eventually form the sentence: “In the beginning the world was without form and void...” The machine continues until it has evolved the entire first chapter of the book of Genesis. It's clever and artistic, and borderline scientific.
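The Beacon's evolution-by-demonstration is essentially cumulative selection, in the spirit of Richard Dawkins's well-known “weasel” program: each generation, a parent string spawns mutated copies, and the copy closest to the target becomes the next parent. Here is a minimal Python sketch of that idea; the target phrase, mutation rate and population size are illustrative assumptions, not the Beacon's actual parameters.

```python
import random
import string

# Illustrative target (opening words shown in the article); the real
# Beacon evolves the entire first chapter of Genesis.
TARGET = "IN THE BEGINNING THE WORLD WAS WITHOUT FORM AND VOID"
ALPHABET = string.ascii_uppercase + " "

def random_phrase(length):
    """Generation zero: a string of random characters."""
    return "".join(random.choice(ALPHABET) for _ in range(length))

def mutate(phrase, rate):
    """Copy the phrase, changing each character with probability `rate`."""
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else c
        for c in phrase
    )

def fitness(phrase):
    """Count characters that already match the target model."""
    return sum(a == b for a, b in zip(phrase, TARGET))

def evolve(offspring=100, rate=0.05, seed=None):
    """Run cumulative selection; return how many generations it took."""
    random.seed(seed)
    parent = random_phrase(len(TARGET))
    generation = 0
    while parent != TARGET:
        # Selection: the best-matching string (parent included, so
        # fitness never regresses) becomes the next generation's parent.
        candidates = [parent] + [mutate(parent, rate) for _ in range(offspring)]
        parent = max(candidates, key=fitness)
        generation += 1
    return generation

if __name__ == "__main__":
    print("reached target in", evolve(seed=0), "generations")
```

Selecting on closeness to a fixed target is what makes the demonstration converge in tens of generations rather than the astronomical time a purely random search would need - which is precisely the point the Beacon makes about selection.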

But Csikszentmihalyi stopped creating his artistic future technologies when he saw one of his dystopian “solutions” (a robot whose “binary moral code is to meet aggression with aggression, bullet for bullet”) discussed as a viable option on CNN. “At that point I stopped working in the future because there was nothing I could think of that was too weird for it to actually develop. So then I started working in the past.”

It wasn't until he arrived at the Media Lab that Csikszentmihalyi gained access to the resources to forsake both past and future, and draw a bead on the present.

Take his Afghan eXplorer, for instance. Shortly after Csikszentmihalyi arrived at MIT two years ago, the U.S. went to war against Afghanistan. Csikszentmihalyi, who is particularly concerned about the ethics of using autonomous systems like cruise missiles and unmanned aerial vehicles, presented Afghan eXplorer, an autonomous robot that could respond to the depersonalized state of war, and get around strict governmental controls regarding the media.

“One of the reasons that made it so easy for people to get behind going to war is because they didn't really have to see war,” he says. “If you look at the history of technologies, war is always about hurting other people without getting hurt yourself. Things like cannons, trebuchets, catapults and arrows are all about separating yourself so the other person can't get back at you. But it's gotten to a point where you're separated from the people you're bombing by such a distance that they look like little ants or rectangles or factories or things like that. We've essentially developed a system of warfare where Americans don't even have to look at what they're destroying.”

So he created an unmanned vehicle of his own. His robot journalist, modeled after the Mars explorer and outfitted with digital video cameras, audio and an intercom, was capable of doing remote interviews with people through a video screen interface perched on top of a long, thin “neck.” The war was over before the robot was functional - and in any case, an autonomous robot wouldn't survive long in normal everyday conditions like traffic, much less a battlefield - but the project attracted a good deal of media attention and became a possible prototype. For Csikszentmihalyi, it seems, the point was made.

Some call this “tactical intellectual work.” It's similar in spirit to biologist Stuart Newman's 1999 attempt to file a patent on a chimpanzee-human hybrid (“Humanzee”) as a political statement about the biotech industry's frightening and arbitrary ownership of life. Like Afghan eXplorer, the Humanzee wasn't necessarily immediately practical (no such hybrid has ever been concocted), but it posed important questions for scientists and a creepy, thought-provoking scenario for the general public. According to Csikszentmihalyi, this kind of work is absolutely essential during a time when huge advances in science are being made, nearly all of which are underwritten by the government or military. “Robotics is an entire area of human inquiry where people - scientists in good faith - are trying to figure things out about cognition and behavior, and how to make things that are lifelike, but the entire subtext of their research is military funded. What would it be like to develop things that have a different pretext from the beginning? So that's what we're doing.”

Of course, war isn't the only realm of modern society that is heavily influenced by politics. “Science and engineering as fields dissimulate their politics,” Csikszentmihalyi notes. “They pretend to not have politics, but in fact they're incredibly political. There's almost no one who is an aero[nautics] or astro[nautics] major who doesn't know a lot about military planes, because military planes are cutting-edge technology. So people are geeked about them. They really like them. They're excited by the fact that these machines are pushing boundaries and envelopes. They're the foreign policy arm of a particular set of political agendas.”

Those agendas, Csikszentmihalyi believes, must be revealed. His projects aren't blind jabs at the present administration - he doesn't judge all of its decisions bad a priori - but are attempts to transform science and engineering into more self-aware fields of inquiry. He and his students attempt to reverse the traditional uses of technology, regardless of which politicians promote them.

For example, he works with graduate student Ryan McKinley on a web-based project called the Open Government Information Awareness Project (GIA). Whereas the government's controversial Terrorism Information Awareness program (whose former title, Total Information Awareness, was quietly scrapped for its Orwellian ring) gives the government a readily accessible record of citizens' financial, personal and medical statuses, GIA attempts to reverse that scrutiny by allowing citizens to post minutiae about politicians' lives: where their kids go to school, where their summer houses are, where they get their money, their books, their groceries. “GIA,” according to the Web site, “embraces TIA's total clarity.”

McKinley designed GIA to work from fundamental principles. The U.S. is built on a model in which power is shared between the government and the people. When the balance goes askew, technology can become a tool used for fighting back. “Can we make a technical system that ensures that balance remains and that it doesn't get pitched to one side too much?” asks Csikszentmihalyi. “A technical system can't help totalitarianism because we can develop a technical system that can work against that.” Contrary to popular futuristic nightmares, in which machines begin to challenge the well-being and security of human life, it may be possible to trust machines to deepen, expand and protect our humanity.

CCG also works on a whole class of projects that rejects politicization in another, subtler way. Csikszentmihalyi claims that, “Outside of pure research, there are two things people build in the US: tools for productivity and tools for entertainment. Everything fits in those two agendas. We're exploring a whole set of technologies that fit into a particular emotional moment that no one else designs for otherwise.” A hug machine, a scream machine, a war machine, a fear machine.

Take the scream-body. In a video presentation, graduate student Kelly Dobson, a red-haired pixie with a soft, high-pitched voice, presents one of her projects: “Hi, I'm Kelly, and this is my scream-body. Do you ever find yourself in a situation where you really have to scream, but you can't? Because you're at work, or you're in a classroom, or you're watching your children, or you are in any number of situations where it's just not permitted? Well, scream-body is a portable space for screaming. When the user screams into the scream-body their scream is silenced... but is also recorded, for later release where, when and how the user chooses.”

Kelly's work is, as one reporter commented to Csikszentmihalyi, “not a cure for cancer,” but CCG tries to foster an environment in which distinctions of that nature don't rule out any specific project. Research projects are unhindered by things as arbitrary as perceived usefulness, market demand or even impossibility, because what “counts” as truly scientific is often guided by poorly informed public opinion about what is or isn't acceptable. Science, as anyone who has attempted the most basic chemistry experiment must know, is an uncertain, imaginative endeavor.


Ryan McKinley and Chris Csikszentmihalyi are standing around the very embodiment of uncertainty and imagination: a small lawnmower motor with a miniature helicopter propeller mounted on top. This is the newest project - an unmanned aerial vehicle designed not to drop bombs in faraway lands, but to take pictures, monitor the environment, and, of course, to make a point to a government that would never think to pour research and development money into such an endeavor. The men gleefully discuss how it nearly took out a group of tourists when they revved it up in the parking lot.

Csikszentmihalyi mentions Ray Bradbury's story, “The Veldt.” Two kids, whose parents have given them free rein over a hologram room, think so intensely that their holograms come alive. They create an African savannah, complete with lions that devour the parents. In Bradbury's tale, the parents' indulgence is extreme, and technology leaves them helpless and vulnerable; their plan to return to idyllic Iowa is cut short by their untimely bloody consumption by their children's mental lions.

In the Media Lab, however, technology is cast in a significantly different light. People, it seems clear, absolutely belong in this technological universe. Computers can protect and enhance our entertainment, efficiency and our personal freedoms. But where does Chris Csikszentmihalyi belong? In the academic world whose motto is “Publish or Perish,” he doesn't quite fit in: “There isn't a journal for me. For artists, the equivalent is a group show and the equivalent of a monograph is a solo show. So that's kind of what an artist should be doing. God knows what I should be doing.”