Thursday, November 24, 2005

...Toil and Trouble.

The 11-Year Quest to Create Disappearing Colored Bubbles

Mike Haney

Tim Kehoe has stained the whites of his eyes deep blue. He's also stained his face, his car, several bathtubs and a few dozen children. He's had to evacuate his family because he filled the house with noxious fumes. He's ruined every kitchen he's ever had. Kehoe, a 35-year-old toy inventor from St. Paul, Minnesota, has done all this in an effort to make real an idea he had more than 10 years ago, one he's been told repeatedly cannot be realized: a colored bubble.

No, not the shimmering rainbow effect you see when the light catches a clear soap bubble. Kehoe's bubble would radiate a single, vibrant hue throughout the entire sphere—a green bubble, an orange bubble, a hot-pink bubble. It's a bubble that can make CEOs giggle and stunned mothers tear up in awe. It's a bubble you don't expect to see, conditioned as you are to the notion that soap bubbles are clear. An unnaturally beautiful bubble.

Kehoe made a bubble like that when he was 26, after only two years of trashed countertops and chemical fires. He showed it to toy-company executives, who called it a "holy grail." And then it broke, as bubbles always do. And when it did, the dye inside escaped onto clothes and carpets and walls and skin, staining everything it touched. The execs told him to come back with a bubble they could wash off their boardroom table.

That was nine years ago. In the intervening years, Kehoe continued to mix, boil, and brew with endless enthusiasm and little success. Until one day, his stubborn persistence led him to $500,000 in financial backing, enough to hire a dye chemist. Together, they took Kehoe's obsession to an outcome even more amazing than he had ever hoped, an outcome no one could have anticipated for the simple reason that no one imagined it possible. The secret to nonstaining colored bubbles, it turns out, is a dye that could unlock a revolution in color chemistry. All you need to do is make color disappear.

Anatomy of a Bubble

Bubbles, the plain kind, have been around for as long as there have been water and surfactants, compounds found mainly in soaps that interact with water to reduce surface tension. This allows the fluid to spread across a bubble wand without breaking. Introduce air, and the thin film pushes outward until it eventually detaches, forming a bubble. People have been onto this for at least 400 years; 17th-century Flemish paintings show children blowing bubbles with clay pipes.

In the world of toys, where the average shelf life of a product is less than 18 months, bubbles are a juggernaut. A Chicago company called Chemtoy began selling bubble solution in the 1940s, and the fad never wore off. According to one industry estimate, retailers sell around 200 million bottles annually—perhaps more than any other toy.

Despite their enduring appeal, bubbles haven't been improved much in 60 years. The only significant exception came in 2002, when Spin Master in Toronto introduced Catch-A-Bubble, clear bubbles that lasted as long as five minutes. Time magazine called it one of the year's top inventions, and seven million bottles sold the first year.

The market for lasting bubbles is the same as the market for clear bubbles: elementary-school kids. If an inventor could somehow add color, though, suddenly adults might have reason to start blowing again. Picture bubbles in NFL team colors, or bubbles that match charity ribbons. The potential market would grow to include every man, woman and child. So why don't they exist? It turns out that coloring a bubble is an exceptionally difficult bit of chemistry. A bubble wall is mostly water held in place by two layers of surfactant molecules, spaced just millionths of an inch apart. If you add, say, food coloring to the bubble solution, the heavy dye molecules float freely in the water, bonding to neither the water nor the surfactants, and cascade almost immediately down the sides. You'll have a clear bubble with a dot of color at the bottom. What you need is a dye that attaches to the surfactant molecules and disperses evenly in that water layer. Pack in more dye molecules, get a deeper, richer hue. Simple. Well, on paper anyway.

Toy Story

Tim Kehoe is just over six feet tall, with a build he prefers not to call "portly." He lives in an old brick house in St. Paul, across the street from the elementary school he attended and where two of his four kids now go. He is the embodiment of phrases like "Minnesota nice" and "Midwestern work ethic," a shirt-off-his-back kind of guy who finishes what he starts and who's usually starting something.

"It's hard not to get excited about whatever Tim is excited about," says Charlie Girsch, another toy inventor who has been a mentor to Kehoe ever since Kehoe stole, and in 1995 married, Girsch's son's girlfriend, Sherri.

Kehoe grew up in a stoic Irish house, but Sherri came from a big, raucous Italian clan. During Kehoe's first Christmas with his future in-laws, the grandmas and cousins and kids all gathered in the living room to play Pictionary. The game was boisterous and hilarious, and Kehoe couldn't believe what a blast he had. That night he left with a new calling—to, as he puts it, "solve the problem of how to have fun."

His first attempt, in 1989, was a board game about recycling called Save the Earth that was about as much fun as it sounds. Toy companies were unimpressed, but one rejection letter pointed Kehoe to an independent toy rep named Frank Young. Kehoe hounded him for months with dozens of ideas, until finally Young gave the tenacious kid $30,000 a year to create toys for him full-time. For the next year, Kehoe worked day or night, whenever inspiration hit. Young's confidence (and the casual, come-and-go schedule) fueled Kehoe's creativity, and the ideas poured out: a toy truck with tires that children could pump to monster-truck size; colored sand that hardens in an Easy-Bake toy oven; colored soap bubbles.

Colored soap bubbles! Of course! Everyone loves blowing bubbles. It seemed such a simple and perfect idea, the kind that would leave other inventors slapping their foreheads and saying Why didn't I think of that? Kehoe says, "I remember walking down to the store thinking, ‘This is so easy. I'm going to be rich!' "

"I started with Jell-O, because I thought, ‘Well, it's got pretty intense color.' So I mixed Jell-O and Ivory soap. I got nothing." Undeterred, he went back to the store and tried food coloring. Then hair dye. Then ink. Within weeks, he was taking Sherri on dates to the grocery store, where he would buy as many colored products as he could afford. Back in his kitchen, he'd dump the Fruit Roll-Ups or Juicy Juice into a pan, heat it on the stove until he figured the color was loosened up, and pour in the dish soap. Only clear bubbles emerged.

When he realized that the answer probably couldn't be found on a store shelf, he started studying patents and reading about surfactants. "I'd see a chemical mentioned in a patent, and when we had some extra money, I'd order it and start mixing," he says. Once he tried nitric acid, a toxic chemical that gives off red fumes at room temperature. "I got it making a really cool bubble, but it could've killed somebody," he recalls. "It ate through clothes." What had been a simple, ingenious idea was becoming an obsession. The idea that colored bubbles might make a few children happy had been a great reason to start the project, and that it could make him a millionaire was a good line for his very patient wife (the only other person who knew what he was up to). But those ambitions were not what kept Kehoe up nights. What drove him crazy was a single question, one that taunted him with every clear bubble that came off his wand: Why can't it be done?

A Burst of Color

One year and 115 prototypes after Young and Kehoe met, money was getting tight at Young's. Not enough of their toys were hits, and Young couldn't afford to keep Kehoe around. So Kehoe pitched himself to Bruce Lund, who ran a 12-man invention studio in Chicago that was high on recent successes like Vac-Man, the archenemy of the Stretch Armstrong elastic doll. Lund ran his shop like a factory. A bell told inventors when to be at their desks and when to take breaks. New ideas were expected every Monday morning and were expected to be good. "I saw grown men cry on a regular basis," Kehoe says.

Bubbles took a backseat while Kehoe spent his nights and weekends trying to come up with enough new dolls not to get fired. Within a year, he'd had it with sweatshop life, and he and Sherri moved back to Minnesota. He launched his own toy company, Kick Design, mostly to get back to bubbles full-time.

Color remained elusive, but his try-anything approach kept plenty of other strange bubbles floating across his kitchen. One exploded with a loud bang. Another gave him chemical burns when it popped. The best one bounced, just like a Super Ball. He thought he could have sold that one, but he couldn't re-create it. He could rarely re-create any of his experiments. "I never wrote anything down," he says. "I'd get too excited as I was doing it. But once I lost that bouncing bubble, I was crushed. I started videotaping myself so that next time I'd know more than ‘It was something on that side of the kitchen.' "

Ask Kehoe now to describe the day the first colored bubble appeared, and the details are fuzzy. He remembers dipping his wand into a pot of blue solution (although they produced clear bubbles, most of his solutions were colored by then) and looking at the quivering film, thinking that this one seemed different. He blew, and a bubble floated across the room. It was blue. He tried again. The next bubbles were blue too. He called Sherri in to make sure he wasn't hallucinating. No, she agreed, it was a blue bubble. As far as they knew, the world's first blue bubble. In his kitchen. How could this be? He hadn't added any special ingredient. He was just playing around with the variables—heating this a little longer, dumping in this before that—and something worked. How didn't matter. Kehoe wasn't after a theory; he was after a bubble, and that he had, on videotape. As far as he was concerned, the project was finished. All that was left was to collect his license deal. So he started showing his tape to toy companies.

"A guy at Hasbro told me they had tried it for two years, and mine were better than anything they had seen, visually," Kehoe says. Every executive who saw them was stunned by their beauty, and everyone told him they could put clear bubbles out of business.

"The problem," Kehoe says, "was that if the bubbles touched you, they stained your skin for weeks. It ruined everything. Everybody said the same thing: Call me when you get it right. So I went back to work."

Partners in the Bubble Lab

The chemistry behind Kehoe's first colored bubbles, the floating spheres of dye eager to stain the next thing they touched, was altogether basic. He'd found a dye and a process of mixing it with the surfactant that caused the two to bond. That meant that the color would stay uniformly distributed around the bubble as long as the surfactants did—which is to say, as long as the bubble was intact. But the dye was only barely water-soluble, so it was nearly impossible to wash. Kehoe hoped others would just license his proof of concept and perfect the formula themselves, but all the toy companies that rejected him realized something Kehoe didn't: that the chemistry was still a long way from workable.

With his bubbles staining boardrooms across the country and a new baby and house to pay for, Kehoe had to move on. What he did with those next eight years—starting a Web-design business, then moving to another company (where I first met him)—isn't really important but for one key event. In 2003 the software company he was working for was sold, putting him out of a job and making its founders rich. This inspired him to return to toys full-time, and the founders' fond opinion of Kehoe inspired them to launch a new toy company with him, 50-50. Kehoe threw in 219 ideas; they threw in half a million dollars.

Only after the deal was secure and Kehoe cashed the check did he tell them about the bubbles. "I'd been avoiding it because I knew they'd get excited and want to do it," Kehoe says. "And I didn't know that I could." In eight years of intermittent experiments, he had created bubbles in dozens of colors, with dozens of dyes, yet never one that was washable enough to sell. "You're asking for magic," Kehoe says. "I tried to talk them out of it, but they were adamant. I knew sheer money or manpower still might not do it, and how could I let them down?"

But that Friday his business partner Guy Haddleton, the one who signed the checks, told him to bring the bubbles in on Monday morning. So Kehoe pulled out the old pots and powders and set about destroying Sherri's new marble countertops. "And I couldn't get it," he says. "All Friday night, into Saturday morning, I'm trying everything I thought I did before, and all I'm seeing is clear bubbles." He now suspects that Procter & Gamble changed some small ingredient in its dish soap that caused it to react differently. "I really panicked. I went to the store and tried every soap I could find. Nothing worked."

If he couldn't fix it with soap, he had to find a new dye. "I cleaned out stores of any products with color. The clerks thought I was nuts. I spent hundreds of dollars buying one of everything. One store had these specialty inks that were $30 a bottle that I had never tried. So I raced home and started mixing—failure after failure. I freaked out, wondering how I would explain to Guy that his money may have been better invested on the 100-to-1 pony in the eighth race at Del Mar.

"Then one of the inks worked. It made the most wonderful colored bubbles I had ever seen. And they washed off my skin without scrubbing. I had never tried it because it was a pigment-based product, and I gave up on pigments years ago [because they tend to stain more than dyes]. But these behaved more like dyes and were skin-washable." Kehoe and Sherri dumped the solution on their clothes and kids, and every time it washed out. When Haddleton saw the bubbles on Monday, he was thrilled. For Kehoe, the long years of desk jobs and desperate late-night experiments were finally over. He had done what the toy companies had told him to, and now it didn't matter what they thought. He had his own well-financed company and a washable bubble. It was time to tell the world.

Play Date

In July 2004 Kehoe and his partners invited dozens of kids and their parents to Haddleton's estate on Sunfish Lake, near St. Paul, for a bubble unveiling and focus-group party. They hired a film crew and rented massive bubble machines to fill the air with the washable solution that, they figured, would be on store shelves in a matter of months.

The first five minutes of the party were stunning. Mothers gasped, and a few were even moved to tears, at the initial sight of the strangely vivid orbs almost glowing in the sunlight. Kids shrieked and chased after them. It was the moment Kehoe had pictured all those years—not big checks or fame, just seeing this project reach its end in a single joyous afternoon.

And then the bubbles broke—on the kids, on the parents, on cars, on Haddleton's prized German shepherds. It looked like there had been a paint fight. Kehoe had told the parents that the color would wash out, but it didn't matter. Not when their children were covered head to toe in blue and pink splotches, when the color was getting into their shoes and hair and soaking into the concrete. In the faces of the horrified mothers, Kehoe immediately grasped the lesson. "You can't go to market with something that leaves that much color, even if it is washable," he says. "It freaks people out."

Just when he thought he'd succeeded, he'd failed again. Washable wasn't good enough. He needed color that disappeared on its own, that would never stain any surface it touched. But in the history of organic chemistry, no one had ever created a water-soluble dye that disappeared on its own. And Kehoe, despite his years of tinkering, was no chemist.

Calling in the Expert

Ram Sabnis is a leader among a very small group of people who can point to a dye-chemistry Ph.D. on their wall. Only a handful of universities in the world offer one, and none are in the U.S. (Sabnis got his in Bombay). He holds dozens of patents from his work in semiconductors (dyeing silicon) and biotechnology (dyeing nucleic acids).

Sabnis wasn't the first chemist to reply to Kehoe's deliberately vague ad. He was just the first one who didn't think that what Kehoe and his partners wanted—a water-soluble disappearing dye that could color the very thin wall of a bubble—was impossible. Sabnis told them he'd have it ready to market in a year. Like Kehoe, Sabnis doesn't seem to consider the possibility that a problem can't be solved. But even he had no idea how hard this one would turn out to be.

"This is the most difficult project I have ever worked on," Sabnis says now. "You think it's easy. Why could someone not make it? But when you actually do it, it's just impossible." For months, he ran 60 to 100 experiments a week, filling notebooks with sketches of molecules, spending weekends in the library studying surfactant chemistry, trying one class of dyes after another.

The breakthrough finally happened in an empty lab in Minneapolis on a Sunday this past February. As with Kehoe's first bubble, it arose from the slow, subtle refinement of a process over thousands of experiments. But Sabnis could re-create it. He synthesized a dye that would bond to the surfactants in a bubble to give it bright, vivid color but would also lose its color with friction, water or exposure to air—not fade, not transfer to something else, but go away completely, as though it had never been there. When one of these bubbles breaks on your hand, rub your hands together a few times and look: Poof. Magic. No more color. If the bubble breaks on your shirt or the carpet or the dog, you have two choices: Dab it with a touch of plain water to remove it immediately, or forget about it for half an hour. Either way, the color will soon be gone. Sabnis's solution was to build a dye molecule from an unstable base structure called a lactone ring that functions much like a box. When the ring is open, the molecule absorbs all visible light save for one color—the color of the bubble. But add air, water or pressure, and the box closes, changing the molecule's structure so that it lets visible light pass straight through. Sabnis builds each hue by adding different chemical groups onto this base.

"Nobody has made this chemistry before," Sabnis says. "All these molecules—we will make 200 or 300 to cover the spectrum—they don't exist. We have synthesized a whole new class of dyes." Sabnis also impressed Darlene Carlson, a former 3M chemist who helped Kehoe and his partners write the job ad. "What Ram did was an extremely difficult bit of chemistry," she says. "Somebody without his experience in dyes would not even know where to start."

Without the lactone structure (a phrase Kehoe had never heard before Sabnis presented it), Kehoe might have toiled in his basement for many more years and never made the dye he needed. Yet without Kehoe's obsessive dedication and belief in the idea, the project never would have been funded. And without his years of experimentation, Sabnis's dyes would have slipped straight down the walls of the bubbles.

Introducing Zubbles

Colored bubbles will hit shelves this February, if not sooner, under the brand name "Zubbles." The bottles are shaped like little bubble characters. Each color has its own name and personality—Zilch, the villain in black, is a favorite among boys. Girls prefer the pink Zilli. Kehoe is in talks with several major toy companies, and this time, they're begging him for a deal. Even though bubbles are a traditional summertime toy, Toys-R-Us told him that he'd be a fool not to have the bubbles in stores by Christmas. As Popular Science went to press, Kehoe was looking for a partner with a factory that could keep the formula secret and crank out a million units in six weeks.

When Kehoe isn't blowing bubbles for businessmen, he's at home inventing again, coming up with new uses for the disappearing dye, the importance of which is hard to overstate. For decades, the color industry has been focused entirely on color fastness. No one has really thought about the potential of temporary color. That the dye was created for children's bubbles may turn out to be just a footnote, a funny story Sabnis tells at color-chemist conventions.

Among the ideas Kehoe has already mocked up are a finger paint that fades from every surface except a special paper, a hair dye that vanishes in a few hours, and disappearing-graffiti spray paint. There's a toothpaste that would turn kids' mouths a bright color until they had brushed for the requisite 30 seconds, and a soap that would do the same for hand washing.

He's also thinking outside the toy chest, mucking around in the lab on weekends making things like a Swiffer that leaves a momentary trace showing where you've Swiffered and a temporary wall paint that would let you spend a few hours with a color before committing to it. The dye's reach is so great that there are even biotech and industrial uses being discussed. "We've got stuff in the works I can't talk about that'll blow bubbles away," he says excitedly. It might take years, but, knowing Tim Kehoe, we'll see them eventually. After all, it's only a little extra work.

Mike Haney is a senior associate editor at Popular Science.

Copyright © 2005 Popular Science

Your guess is as good as mine.

What Happens When Science is Made in China?

A Seed exclusive from Beijing

by Mara Hvistendahl • Posted November 23, 2005 03:02 PM

Credit: Michele Junior

When a patient arrived in the neurosurgery ward of Shanghai's Fudan University Huashan Hospital with a chopstick protruding from one eye, surgeon Zhu Jianhong was not surprised. He knew that Shanghai dinners are long affairs, lubricated with shots of 110-proof grain alcohol, and that when tensions boil over, chopsticks can become weapons. As Zhu extracted the utensil, it occurred to him to culture the brain tissue that was stuck to it. At the time, scientists thought cells from only two regions of the brain were expandable; this tissue was from neither. But Zhu's experiment worked. The next few times he was confronted with a head wound, Zhu took his work one step further, transplanting the expanded neural cells back into the patient's brain. The six patients he treated over the next three years showed better recovery than untreated patients.

A delegation of British scientists who visited Zhu last year was uniformly impressed, calling the study "ground-breaking" in a government report. "There's nobody else in the world who's even close to doing that," said Stephen Minger, director of the stem cell biology lab at King's College London. Indeed, much of the work the Chinese are doing with stem cells simply could not be conducted in most other parts of the world: The proliferation of chopsticks notwithstanding, China has one of the most liberal environments in the world for stem cell research. While the ethical debate over the use of embryos in research continues to rage in much of the West, researchers like Zhu have the Chinese government and popular sentiment firmly behind them.

For decades, China was barely a blip on the scientific radar. Communism's arbitrary appointments, combined with the Cultural Revolution's disdain for education, crippled Chinese science. But today China is in the midst of a scientific revolution. China's current economic and political strategy, as named by President Hu Jintao at a recent Central Committee meeting, is the "scientific development concept." The idea is to balance economic growth with attention to China's growing social issues, many of which could be better tackled with the tools that science and technology provide. On the ground, it means that China is developing the sciences now, in the same rapid, breathtaking way that China overhauled its economy. The Chinese government is pouring money into everything from biotechnology to its ambitious space program, which culminates this month in the launch of the manned spacecraft Shenzhou VI.

And as with most things, timing is key. The steaming Chinese economy, combined with greater opportunities for professional advancement than in the West, is convincing many of the 600,000 students China has sent overseas since the late 1970s that now is the time to return. All of the principal scientists on China's Human Genome Project team—and half of the scholars in the Shanghai branches of the Chinese Academy of Sciences and the Chinese Academy of Engineering, the major Chinese science institutes—are returnees. The government increased its funding to domestic education by 600% between 1991 and 2001, and it continues to go up. According to research done by Rice University, by 2010, if current trends continue, more than 90% of all scientists and engineers in the world will be living in Asia, many of them in China. Equipped with fluency in English, Chinese scientists now publish in the international journals that are the barometer of scientific success in the West. Between 1988 and 2001, Chinese article output grew by a factor of five (over the same period, U.S. output increased by only 10%).

The mood in Chinese science is energetic, buoyant and even, as one Western science administrator described it, "euphoric." China is determined to show the West that it can develop scientifically even as it does so economically—that it can turn out impressive achievements with less than half the funding allotted to the sciences in the West. And that it can do so, in some cases, more efficiently.

China has the habit of appending "with Chinese characteristics" to its new theories and ideologies—as in, most famously, "socialism with Chinese characteristics," the term for China's breed of authoritarian capitalism. The young, ambitious, and highly-educated Chinese who are spearheading China's scientific revolution are not only doing science at a world-class level. They are making it their own. This is science with Chinese characteristics. And it may very well change the world of science as we know it.

At the Beijing Genomics Institute (BGI), the center that delivered China's contribution to the Human Genome Project, white-coated researchers grab watermelon slices as they brush through the lobby. Next to the plate of fruit is a ceramic piggy bank. "It's empty," jokes a smiling Bin Liu, the center's assistant director. Liu, who earned his doctorate at McGill University and did postdoctoral research at the National Institutes of Health, doesn't seem fazed. The institute is housed in a dirty white-tile building near the airport; it keeps a Wednesday to Sunday workweek because electricity in its industrial park is rationed. On walls throughout the building are signs in block capital letters that read "Get it Done," alongside the Nature and Science covers depicting BGI's grand-slam of genome sequence drafts. When Liu talks about BGI, he focuses on this work. He admits that much of it was performed under "impossible circumstances" but he objects to the suggestion that the Chinese sciences are still developing. "In nanotech we are actually ahead of America. Would you say that America is developing? The scenario has changed."

What's perhaps most significant about this realignment is its potential: what it might mean in 10 or 15 years if China consistently leads the field. And, certainly, China is not just interested in pulling off a series of quick tricks; it also has a policy-linked plan for the sciences, mapped out to 2020. Wu Yishan, a senior researcher at the Institute of Scientific and Technical Information of China, a governmental advisory agency, says that, compared with the U.S., China excels at developing long-term plans for scientific development, in part because it doesn't have an election every four years to hold it back. The moves it has made to encourage biotech and nanotech clearly indicate that someone in the Chinese government has been considering the long term, says Shere Abbott, chief international officer for the American Association for the Advancement of Science. Commenting on the difference between what she saw on visits to China in 1995 and in 2005, Abbott was impressed: "It's huge," she says. "It tells you a lot about their ability to sit down and develop a national strategy." Robert Lanza, VP of scientific development at Advanced Cell Technology, said he was "blown away" on a recent trip to China. "In many ways their thinking was more advanced than our own." If China maintains its current pace, the effects could move quickly beyond science to the wider culture. "China's right on the verge," says Lanza, "and the culture goes hand in hand with the science and the economics."

But, to be sure, more than one Chinese science administrator has complained about budgetary shortfalls and debt, and while some Western scientists who have toured Chinese labs suggest that they are, in some cases, better outfitted than Western ones ("It was like going to Cambridge," said Minger from King's College), funding remains a very real concern for many institutions. Despite its consummate planning, Wu says, China lags behind the West in commercializing and profiting from its science. More-entrepreneurial researchers have developed fluid relationships with private Chinese companies and many successful government institutes have a commercial arm. But many projects remain underfunded.

Because of this, perhaps, the government has opened its doors to a variety of ambitious foreign projects. Last year, France's prestigious Pasteur Institute established a center on emerging diseases in Shanghai, and Germany's Helmholtz Association, a consortium of research entities, set up a center focused on energy, environment and space in Beijing—its first outside of Europe. Denmark's Centre for Clinical and Basic Research will soon establish a center in Beijing. And a host of multinationals—including giants like Microsoft, Intel and GE—have established R&D facilities in China, with more poised to follow suit.

Danish pharmaceutical company Novo Nordisk occupies a wing of a sleek, slate gray building with mirrored windows in plush Zhongguancun, China's Silicon Valley, a half-hour drive from Beijing's research universities. Its labs are eerily quiet, as if half of the researchers are on vacation. They have, in fact, not yet arrived; the company plans to double the center's staff in the next three to five years. Companies like Novo Nordisk are in China because the country has a large pool of cheap, educated labor, with Ph.D.s receiving annual salaries under $10,000. Indeed, China is listed as the top destination for future R&D spending by many multinationals. But China, as any Western entrepreneur who has tried to set up shop here can attest, is hardly a country that lets itself be exploited. Niels Blume, director of the cell biology department at Novo's Beijing center, says the government insists that the company give back to the community; Novo Nordisk has bankrolled a Ministry of Health diabetes education initiative and spearheaded an insulin donation program. The company's greater impact, however, will be in the way it trains the next generation of Chinese scientists. The empty offices in Novo Nordisk's building—which Blume says the government sees as "an incubator for Chinese biotech companies"—may eventually be filled with researchers who cut their teeth at Novo Nordisk. In this sense the foreign labs play a role similar to that of overseas universities, offering Chinese scientists a chance to acquire specialized knowledge which they can then go on to apply at Chinese companies. Indeed, a recent industry report by Blackwell on managing R&D in China highlighted the high turnover experienced by many multinational research centers in China.

With sound expertise in commercialization added to its repertoire, China will be equipped to further define the "Chinese characteristics" of its science. And chances are the outcome won't look much like Western science. In 19th-century China, imperial leaders promoted the slogan zhong xue wei ti, xi xue wei yong, or "Eastern learning for foundation, Western learning for application." Many Chinese still subscribe to that maxim. By most indications, China intends to draw on the West for practical knowledge, while turning out science with a distinctly Chinese flavor. BGI's draft sequences read like a survey of Chinese staples: rice, chicken and silkworm genomes. Wu, whose job includes monitoring foreign science, says the agency is pushing Chinese scientists to focus on what he calls "appropriate technology," such as animal-powered farm machines and other tools that could be used in small-scale rural agriculture. Peasants still account for over half of the Chinese population, and unprofitable farming techniques are forcing many to abandon the land to search for work in China's cities. This October, the Shenzhou VI astronauts will carry a container of pig sperm on board—not as a strange stunt but as an experiment directed at better engineering pork, which is at the center of the rural Chinese diet. All of these pursuits derive from an issue that is of great importance to China: feeding a growing population with a mounting food supply problem. (The Worldwatch Institute predicts that by 2030, China's population alone will consume more than the total amount of grain available on the international market.) When Novo Nordisk's researchers leave for Chinese companies, then, they will not end up simply reproducing the work their Western employers gave them. More likely, they will find a way to apply Novo Nordisk's E. coli protein sequencing to Chinese needs—developing inexpensive diabetes medications that can be easily distributed in the countryside, for example.

Economically and diplomatically, China has already positioned itself as a leader of the developing world. Now it is doing the same scientifically, strengthening its own research foundation and honing its expertise. This puts China in the unprecedented position of being a developing country that has resources to call upon. It's a country whose business leaders rank among the Fortune 500 and whose biotech and nanotech labs are some of the best in the world, yet where 130 million people still live below the poverty line. China's rapid economic growth allows it the opportunity to tackle development issues in a way that the West never could. It has the tools to think big and to do it right the first time. And when it comes to the kind of research that will make a difference here—studying pollution reduction and agricultural technologies—the West doesn't feel the same immediacy. This is where science with Chinese characteristics becomes an investment in the future of developing nations, as well as a saleable commodity; this is how China becomes positioned to do no less than shape the future of the five billion people living in the developing world.

But China has to win the home game first. Realizing this, the government has made sure that its science initiatives incorporate the far more intangible and emotional issue of national pride—one of the few forces that can transcend the contradictions of the country and manage to make it feel like a whole, instead of the sum of very disparate parts.

The space-themed Oriental Pearl TV Tower, a vertical string of hot-pink baubles in Shanghai's financial district, is purportedly the tallest tower in Asia, at 1536 feet. Since its completion in 1994, its image has cropped up on everything from cigarette packages to the opening credits of various TV shows. Many now tout it as the symbol of new China. On any given day, tour groups from the Chinese interior and migrant workers who have splurged on the 100 yuan ($12) ticket—a week's pay in rural China—throng the lower bauble, where they wait for as long as two hours to ascend the tower. Eventually, young women dressed like futuristic flight attendants show them to the "space cabin," the uppermost bauble, and leave them to stare out the windows at other skyscrapers or examine the gifts and plaques to China from around the world. The message is two-pronged: China is launching itself to new heights; the world loves China.

The Pearl Tower's long lines are not the only sign that science is fueling Chinese nationalism. China's array of profit-driven tabloids, glossy magazines and Internet portals breathlessly report cloned pigs and new vaccines. Stories abound about what the Shenzhou VI taikonauts will be eating (kungpao chicken and soy-braised beef are among the dozens of new space foods developed for this mission). Children list Yang Liwei, the first man China sent into space, as one of their top 10 heroes, alongside Jackie Chan and Mao Zedong. Dean Cheng, senior Asia analyst at CNA Corporation, a Virginia-based think tank, says he's watching out for a product that will bring scientific advances into average citizens' homes—what he calls the "Chinese equivalent of Tang." China's Nobel laureates for science enjoy celebrity status, even though most of them live abroad. When 82-year-old physics laureate Yang Zhenning married a 28-year-old graduate student last December, he set off a media blitz worthy of a J.Lo marriage.

The importance of pride is difficult to overestimate in a nation that has been waiting for a millennium to reestablish its dominance. Even when China's initiatives follow Western precedents, it makes sure they are bigger and better than anyone else's. China is now home to the world's two largest malls and its tallest hotel, and is at work on the tallest building. So when the Chinese sent Yang Liwei into space in 2003, he stayed in orbit longer than John Glenn or Yuri Gagarin—a move that, according to Cheng, has little to do with improvements in technology. "The Chinese firsts are different from other people's firsts," he says, suggesting that the Chinese push to outdo everyone else has more to do with prestige than with scientific fundamentals. "Why are the Chinese doing this now? They want to be viewed as a global player in science."

Of course, these are all elements that the Chinese government can manipulate and control. The tricky part is that, in order for the people's enthusiasm to translate into a meaningful understanding of science's vital role in their country's development, there needs to be open discussion. So far, the authoritarian Chinese government is fairly uncomfortable with that idea. But if science is to flourish, it may have no choice.

China's breakneck pace of development is creating a slew of problems that cannot be managed without creative scientific thinking. But thinking outside the box is not China's strong suit. Confucianism, long the guiding philosophy of the country, has instilled generations of Chinese with a reverence for learning, family and tradition. It has also bequeathed a rigid, hierarchical education system in which the exchange of knowledge is governed by strict attention to social rules. Chinese universities function like Imperial China's civil service system once did, with intense entrance examinations that are designed to identify the most promising students. Although the system is extremely rigorous, many say that it fails to emphasize creativity and intellectual debate, and fosters a hyper-competitiveness that makes it difficult for students to work in groups. (The emphasis on competition carries over to the selection process for Shenzhou VI, in which two astronauts will be chosen from a pool of six at the last minute. This way, Cheng says, "you're going to keep your astronauts on their toes.") One director of a multinational research and development center here said that whenever possible, he hires Chinese scientists who have studied abroad. His company goes so far as to court Chinese doctoral students from Western universities. "In the Chinese system you learn endless amounts of stuff by heart, but you don't have the discussion that is so important to science," he said.

Orville Schell, a renowned China scholar who is dean of UC Berkeley's Graduate School of Journalism, pointed out that the world's best scientists are deeply interested in other disciplines. Until Chinese scientists enjoy complete intellectual freedom, he said, their thinking will be inhibited. "Read Einstein on War and Peace. This is a truly creative mind in more ways than one. You don't just wake up one morning after 40 or 50 years of Marxism and Leninism and, because the economy is freer, turn into Einstein."

Berkeley neuroscientist Mu-ming Poo, who spends three months a year in Shanghai directing China's Institute of Neuroscience, has been a vocal advocate of institutional reform. The primary obstacle to Chinese scientific progress, he said, is cultural. In the Chinese science world, the Confucian tradition plays out as a top-down administration of funding and assignments, an atmosphere of "undue courtesy" and a dearth of scientists willing to question existing research. In the worst examples, it means that bad science goes unquestioned and seemingly "good" science gets a free ride. In an interview with the Chinese state news agency in August, Harvard mathematician Shing-Tung Yau said that plagiarism is rampant in Chinese academia, noting that a student of his who had plagiarized a professor's article had been granted membership in the Chinese Academy of Sciences and appointed head of a science foundation. But more often, the problem is that the culture prevents the development of new ideas. "Innovation has to come from a spirit of free pursuit," says Poo, "and the Chinese tradition and environment doesn't encourage the free pursuit of one's own ideas."

Along with their education and expertise, the returnees have brought back a greater capacity for the unconventional and the adaptive. In the ideal scenario, their sensibilities represent a hybrid of their two worlds. When Harvard-educated Zhu Jianhong cultured cells from a chopstick that he extracted from a patient's eye, he was thinking creatively. But, according to his colleagues, he also acted out of obligation—he felt he needed to make his work useful.

The importance of this best-of-both-worlds approach is perhaps best measured against the dire state of China's environment. The country's supply of coal, on which it relies heavily for fuel, will run out by the end of this century. China is home to 16 of the 20 most polluted cities in the world. At the same time, 350 to 400 million people are forecast to migrate from the countryside to cities in the next 25 years, which will vastly increase the already staggering number of cars that China adds to the road each year. Elizabeth Economy, director of Asia Studies at the Council on Foreign Relations, says China has thoroughly "degraded" its environment, to an extent unparalleled in the West. Indeed, the total cost of environmental degradation and resource scarcity is widely held to be 8 to 12 percent of GDP each year. Environmentalists worry that while the Chinese government has developed a plan for the sciences that looks ahead 15 years, it has no parallel plan for the environment. China's signing on to Bush's "Kyoto alternative" doesn't inspire much hope among environmentalists here.

But China's rise presents a unique opportunity to change the way development looks, applying solutions that have never been tried in the West. "There is a fundamental difference between China and the US and Europe," Economy said. "China has access to policy approaches and technologies that were not available when the U.S. and Europe were going through similar changes." In some cases, China appropriates these technologies for its own use. The government is starting to show interest in renewable energy; it has plans to quadruple its use of wind power by 2010, with Chinese companies providing most of the technology. In other cases, China serves as a laboratory for ideas that would be difficult to carry out on a large scale anywhere else.

The village of Huangbaiyu, for example, was once like any other northeastern Chinese outpost: a place where people fish, plant corn and tend goats on rolling land surrounded by jagged mountains. Now a joint design team, led on the American side by the architect William McDonough and on the Chinese side by Shanghai's Tongji University, is converting it into a forward-looking experiment in sustainable living. As of August, they had built only one house—a one-story cube with straw walls, a compressed earth frame and a 1000-watt solar panel on its roof—but when construction finishes, close to 200 such houses will be laid out in clusters around Huangbaiyu's school. The village will operate on a closed-loop material model, with waste, in the form of biomass, used to generate methane for fuel or as compost in the fields.

Huangbaiyu grew out of an unusual 1999 agreement between the Chinese Ministry of Science and Technology and the Oregon state government to work together on issues of sustainability, as the China-U.S. Center for Sustainable Development. Its ultimate goal is to address China's internal migration by building six sustainable urban developments, or New Towns, in cities throughout China. Ultimately, its organizers hope, the New Town approach will be adopted by other developers—abroad as well as in China. "The China-U.S. Center has the potential to be a beacon in terms of demonstrating state-of-the-art environmental technology and thinking that is also cost-effective," said Economy, who serves on the Center's board of directors. "The difficulty that the idea will encounter is how do you encourage replication?" The answer would seem to be: with success. The more science demonstrates that it is an integral tool of development in China, the more other developing countries see a success story on which they can model their own plan for progress. China thereby reinforces its global position as an innovator and pioneer—the kind that gets noticed, gets imitated and sets the course for everyone else.

An enormous white statue of Li Shizhen, the father of Chinese medicine, stands in the atrium of Shanghai's Research Center for Modernization of Traditional Chinese Medicine, looking solemn but content. He is surrounded by dozens of potted flowers that vaguely suggest an altar. For Chinese policy makers, one of the most obvious—and most Chinese—places to take science is to a discipline with a 2000-year history. The country's next five-year plan will allot one billion yuan ($121 million) toward the development and modernization of traditional medicine. Shanghai unveiled its sparkling modernization center, which houses 2,500 square meters of lab space and 50 full-time researchers, in Zhangjiang Hi-Tech Park, Shanghai's answer to Zhongguancun, last year.

Director De-An Guo, who received his doctorate from Beijing Medical University, recognizes that traditional medicine has an image problem abroad and among younger Chinese at home. His center aims to transform the experience-based discipline into an evidence-based one through clinical trials and quality control. It is the largest of several facilities around the country that conduct double-blind, placebo-controlled trials for traditional medicine. The next step is internationalization; the center is working with both Chinese and Western companies to develop products for the foreign market. Indeed, Americans spend in the neighborhood of $40 billion on complementary and alternative medicine every year. As the West explores how to care for the aging baby-boomer generation, these treatment options become ever more attractive.

For China, however, this is an initiative that hits closer to home. Chinese government advisor Wu said that the government's interest in the field comes out of a very practical Chinese concern: how to foot the health care bill for the country's own rapidly aging population. But Ming-Wei Wang, the Cambridge-educated director of the Chinese National Center for Drug Screening, says that developing traditional medicine is actually a costly undertaking. Although the medicines are inexpensive, their effects are not immediately obvious; clinical trials can take years. China's push to develop its medicine, he suggested, has more to do with strengthening the country's national identity. "It shows the nation's well-being is not fully dependent on Western medicine, that we can make a contribution," he said. "It's part of national pride. We can take care of ourselves."

Chinese science's most dramatic reverberations may come from intertwining science with China's national ego. As much as science drives the development of China, science will also remain inherently global and inherently impressionable. The more science becomes a definitive feature of China's identity, the more distinctly these inherent characteristics will manifest elsewhere in Chinese culture. Science cannot exist in a bubble, and state control of areas not directly connected to science—the press, the arts, political expression—will increasingly be affected. Scientists have figured prominently in several major democracy movements. Andrei Sakharov, the physicist who helped design the Soviet hydrogen bomb, became a leading dissident under the Soviet regime. Similarly, the speeches of Chinese astrophysicist Fang Lizhi helped inspire the Tiananmen Square demonstrations. Science, Fang wrote in 1999, on the 10th anniversary of the Tiananmen massacre, is "a force for rationality, and, from there, democracy."

Perhaps the case can be made that, as goes science in China, so will go China. If science is allowed to lead, China will lead. And if China leads, the world's scientific agenda may change in ways that fundamentally alter everything from the way we eat, to how we treat disease, to what we imagine is even physically possible. We will see a shift not only in what the world's science pursues, but in the way science is performed.

As Zhu Jianhong continues his work at Huashan Hospital, other stem cell researchers are quietly turning out world-class research at other institutions around China—such as Beijing's Li Lingsong, who is experimenting with techniques to produce human organs for use in transplants, or Shanghai's Sheng Huizhen, who caused an international stir in 2003 with the creation of a rabbit-human hybrid embryo. These scientists worry about competition from the South Koreans, not about being denied funding or being vilified as immoral or murderous. Their work is considered highly ethical within Chinese culture, where Confucianism dictates that life begins at birth and places an emphasis on the collective—in this case, the patients waiting for treatment—over the individual. If the Chinese stem cell research model is reproduced on a global level, the relationship between science and morality could change drastically; the ethical debate as we know it could possibly be rendered moot, inconsequential on the world stage.

"The failure of the US government in areas of science—stem cells in particular—has left a huge void, an opportunity for the East," says ACT's Lanza. "We're all worried about China's power militarily, but we should be more concerned about it scientifically and culturally—what's happening here could mean the decline of not only our scientific dominance but our cultural dominance."

But science is not a zero-sum game. There are far more opportunities for Chinese science to complement Western science than to replace it. The real challenge becomes knowing how to answer when opportunity knocks.

© Copyright 2005 Seed Media Group, LLC. All Rights Reserved.

More on prediction markets.

Nature 438, 281 (17 November 2005) | doi:10.1038/438281a

Wisdom of the crowd

Decision makers, wrestling with thorny choices, are tapping into the collective foresight of ordinary people. Jim Giles reports.

Like all of their rivals in the pharmaceutical industry, executives at Eli Lilly routinely need to make tough predictions. How much of a new product do they expect to sell? Is a competing drug going to win approval first?

Millions of dollars of revenue hang on getting these predictions right. So it might come as a surprise to learn where some executives at the Indianapolis-based firm have looked for answers: they've been asking readers of USA Today.

Lilly is one of several major corporations now dabbling in 'prediction markets' — decision-making tools that harvest the collective wisdom of groups of ordinary people. Participants are asked to buy and sell shares in real outcomes, and are rewarded for betting on outcomes that turn out to be correct.

Advocates of the prediction markets claim that the predictions of such participants can end up being better than those made by specialists.

"The idea of prediction markets is a powerful one," explains Thomas Malone, a management specialist at the Massachusetts Institute of Technology. "They let many people contribute to the collective assessment of a future event. It's a surprisingly effective way of integrating information."

Lilly runs its markets in conjunction with NewsFutures, a firm based in New York that sells prediction-market software. When the two companies first collaborated in 2003, USA Today was experimenting with the idea as a game for its readers. Lilly subsequently asked 250 of these readers to make predictions about some of its business issues.

The group was invited to buy or sell shares pegged to specific predictions, such as the number of drugs that would be approved in a year by the US Food and Drug Administration. Shares in the correct prediction paid out virtual money at the end of the year, and Lilly motivated traders by stumping up $10,000 in real money to reward the winners.
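The mechanism described above—traders buying shares pegged to a prediction, with each share of the correct outcome paying out at resolution—can be sketched in a few lines of code. This is purely an illustration of the general idea, not NewsFutures' actual software; the outcome names, traders, and the naive share-fraction pricing rule are all invented for the example (real markets use order books or automated market makers).

```python
class PredictionMarket:
    """Toy winner-take-payout market: traders buy shares in mutually
    exclusive outcomes; each share of the true outcome pays 1 unit of
    (virtual) currency when the market resolves."""

    def __init__(self, outcomes):
        # Seed one share per outcome so prices are defined from the start.
        self.holdings = {o: 1.0 for o in outcomes}
        self.positions = {}  # trader name -> {outcome: shares held}

    def price(self, outcome):
        # Naive price: this outcome's fraction of all outstanding shares,
        # which drifts toward the crowd's aggregate belief as trades arrive.
        return self.holdings[outcome] / sum(self.holdings.values())

    def buy(self, trader, outcome, shares):
        cost = self.price(outcome) * shares  # pay the pre-trade price
        self.holdings[outcome] += shares
        book = self.positions.setdefault(trader, {})
        book[outcome] = book.get(outcome, 0.0) + shares
        return cost

    def resolve(self, true_outcome):
        # Pay 1 per share of the correct outcome, 0 for everything else.
        return {t: book.get(true_outcome, 0.0)
                for t, book in self.positions.items()}


market = PredictionMarket(["approved", "rejected"])
market.buy("alice", "approved", 8)  # alice is confident in approval
market.buy("bob", "rejected", 2)
print(round(market.price("approved"), 2))  # → 0.75
print(market.resolve("approved"))          # → {'alice': 8.0, 'bob': 0.0}
```

The key property the sketch captures is that the price of an outcome doubles as a probability estimate: heavy buying of "approved" pushes its price toward 1, which is exactly the aggregated forecast that Lilly's experts were compared against.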

Although there hasn't been much independent analysis of prediction markets, the data that are available suggest the concept carries promise. Markets aimed at predicting the outcome of US presidential elections and the names of Oscar winners have, according to some assessments, outperformed other indicators such as opinion polls. Emile Servan-Schreiber, NewsFutures' chief executive, says the same pattern emerged with the 2003 Lilly market; outside traders did better in their predictions than company experts.

Supporters say prediction markets motivate participants to carefully study information related to whatever they are trading in. And as outsiders, they lack the vested interest that can colour the predictions of internal staff, or even some consultants, who might, for example, have a historical empathy for a particular product line. "The markets provide an interesting counterpart to predictions from groups, such as lobbyists, who tend to see things in black and white," says Servan-Schreiber.

Since its initial study, Lilly has commissioned two further prediction markets, the second of which is running this year and using its own sales staff as participants. In this market, predicted outcomes, such as the revenue that a specific Lilly drug will achieve in different quarters of the year, are assigned a price: a trader might buy a share in that outcome if, for example, they think the share price underestimates the outcome at the end of the quarter. Lilly declined to discuss the details of this market, but Servan-Schreiber claims that it has already outperformed standard internal forecasts.

Rival firm Abbott Laboratories, based in Illinois, has also purchased prediction-market software from NewsFutures, but declined to say how this is being used. And similar schemes are finding myriad uses elsewhere. The University of Iowa in Iowa City, for example, has set up a prediction market to help health authorities assess how the state will be affected by influenza in the coming year.

But how much trust can be placed in these tools? Much of the enthusiasm for the concept has been driven by the publicity surrounding their use as predictors of the outcomes of US presidential elections. But the data from these are slim, and corporate leaders may be sceptical about what prediction markets are really worth.

Charles Manski, an economist at Northwestern University in Illinois, has looked into the markets' performance, and takes issue with one of their assumptions, namely that traders' beliefs alone determine market price. He thinks that the price is also affected by differences in traders' budgets and attitudes to risk, and that unless these other two influences are understood, predictions will be misinterpreted. Manski warns against believing all the claims made about the approach. "The problem with prediction markets is that they've been hyped by the kinds of people who believe that markets solve all problems," he says.

Malone concedes that the markets still need to earn their spurs. But he stresses that information from them should at least be considered — especially when it runs counter to mainstream thinking. "At a minimum, you should do more investigation to find out why the market said what it did," he suggests. Once firms start to see the prediction market as just another tool, it'll become an everyday part of business management, he predicts. "Information technology has allowed the cost of doing this to fall to almost zero," Malone says. "It will become routine."

The time of a thousand words has long since passed.

Fall 2005

The Image Culture

Christine Rosen

When Hurricane Katrina struck the Gulf Coast of Mississippi, Alabama, and Louisiana in late August, images of the immense devastation were immediately available to anyone with a television set or an Internet connection. Although images of both natural and man-made disasters have long been displayed in newspapers and on television, the number and variety of images in the aftermath of Katrina reveal the sophistication, speed, and power of images in contemporary American culture. Satellite photographs from space offered us miniature before-and-after images of downtown New Orleans and the damaged coast of Biloxi; video footage from an array of news outlets tracked rescue operations and recorded the thoughts of survivors; wire photos captured the grief of victims; amateur pictures, taken with camera-enabled cell phones or digital cameras and posted to personal blogs, tracked the disaster’s toll on countless individuals. The world was offered, in a negligible space of time, both God’s-eye and man’s-eye views of a devastated region. Within days, as pictures of the squalor at the Louisiana Superdome and photographs of dead bodies abandoned in downtown streets emerged, we confronted our inability to cope with the immediate chaos, destruction, and desperation the storm had caused. These images brutally drove home the realization of just how unprepared the U.S. was to cope with such a disaster.

But how did this saturation of images influence our understanding of what happened in New Orleans and elsewhere? How did the speed with which the images were disseminated alter the humanitarian and political response to the disaster? And how, in time, will these images influence our cultural memory of the devastation caused by Hurricane Katrina?

Such questions could be asked of any contemporary disaster—and often have been, especially in the wake of the September 2001 terrorist attacks in New York and Washington, D.C., which forever etched in public memory the image of the burning Twin Towers. But the average person sees tens of thousands of images in the course of a day. One sees images on television, in newspapers and magazines, on websites, and on the sides of buses. Images grace soda cans and t-shirts and billboards. “In our world we sleep and eat the image and pray to it and wear it too,” novelist Don DeLillo observed. Internet search engines can instantly procure images for practically any word you type. On one photo-sharing website, you can type in a word such as “love” and find amateur digital photos of couples in steamy embrace or parents hugging their children. Type in “terror” and among the results is a photograph of the World Trade Center towers burning. “Remember when this was a shocking image?” asks the person who posted the picture.

The question is not merely rhetorical. It points to something important about images in our culture: They have, by their sheer number and ease of replication, become less magical and less shocking—a situation unknown until fairly recently in human history. Until the development of mass reproduction, images carried more power and evoked more fear. The second of the Ten Commandments listed in Exodus 20 warns against idolizing, or even making, graven images: “Thou shalt not make unto thee any graven image, or any likeness of any thing that is in heaven above, or that is in the earth beneath, or that is in the water under the earth.” During the English Reformation, Henry VIII’s advisor Thomas Cromwell led the effort to destroy religious images and icons in the country’s churches and monasteries, and was successful enough that few survive to this day. The 2001 decision by the Taliban government in Afghanistan to destroy images throughout the country—including the two towering stone Buddhas carved into the cliffs of Bamiyan—is only the most recent example of this impulse. Political leaders have long feared images and taken extreme measures to control and manipulate them. The anonymous minions of manipulators who sanitized photographs at the behest of Stalin (a man who seemingly never met an enemy he didn’t murder and then airbrush from history) are perhaps the best known example. Control of images has long been a preoccupation of the powerful.

It is understandable why so many have been so jealous of the image’s influence. Sight is our most powerful sense, much more dominant in translating experience than taste, touch, or hearing. And images appeal to emotion—often viscerally so. They claim our attention without uttering a word. They can persuade, repel, or charm us. They can be absorbed instantly and easily by anyone who can see. They seem to speak for themselves.

Today, anyone with a digital camera and a personal computer can produce and alter an image. As a result, the power of the image has been diluted in one sense, but strengthened in another. It has been diluted by the ubiquity of images and the many populist technologies (like inexpensive cameras and picture-editing software) that give almost everyone the power to create, distort, and transmit images. But it has been strengthened by the gradual capitulation of the printed word to pictures, particularly moving pictures—the ceding of text to image, which might be likened not to a defeated political candidate ceding to his opponent, but to an articulate person being rendered mute, forced to communicate via gesture and expression rather than language.

Americans love images. We love the democratizing power of technologies—such as digital cameras, video cameras, Photoshop, and PowerPoint—that give us the capability to make and manipulate images. What we are less eager to consider are the broader cultural effects of a society devoted to the image. Historians and anthropologists have explored the story of mankind’s movement from an oral-based culture to a written culture, and later to a printed one. But it is only in the past several decades that we have begun to assimilate the effects of the move from a culture based on the printed word to one based largely on images. In making images rather than texts our guide, are we opening up new vistas for understanding and expression, creating a form of communication that is “better than print,” as New York University communications professor Mitchell Stephens has argued? Or are we merely making a peculiar and unwelcome return to forms of communication once ascendant in preliterate societies—perhaps creating a world of hieroglyphics and ideograms (albeit technologically sophisticated ones)—and in the process becoming, as the late Daniel Boorstin argued, slavishly devoted to the enchanting and superficial image at the expense of the deeper truths that the written word alone can convey?

Two things in particular are at stake in our contemporary confrontation with an image-based culture: First, technology has considerably undermined our ability to trust what we see, yet we have not adequately grappled with the effects of this on our notions of truth. Second, if we are indeed moving from the era of the printed word to an era dominated by the image, what impact will this have on culture, broadly speaking, and its institutions? What will art, literature, and music look like in the age of the image? And will we, in the age of the image, become too easily accustomed to verisimilar rather than true things, preferring appearance to reality and in the process rejecting the demands of discipline and patience that true things often require of us if we are to understand their meaning and describe it with precision? The potential costs of moving from the printed word to the image are immense. We may find ourselves in a world where our ability to communicate is stunted, our understanding and acceptance of what we see questionable, and our desire to transmit culture from one generation to the next seriously compromised.

The Mirror With a Memory

The creator of one of the earliest technologies of the image named his invention, appropriately enough, for himself. Louis-Jacques-Mandé Daguerre, a Frenchman known for his elaborate and whimsical stage design in the Paris theater, began building on the work of Joseph Nicéphore Niépce to try to produce a fixed image. Daguerre called the image he created in 1837 the “daguerreotype” (acquiring a patent from the French government for the process in 1839). He made extravagant claims for his device. It is “not merely an instrument which serves to draw nature,” he wrote in 1838, it “gives her the power to reproduce herself.”

Despite its technological crudeness and often-spectral images, the daguerreotype was eerily effective at capturing glimmers of personality in its fixed portraits. The extant daguerreotypes of well-known Americans in the nineteenth century include: a young and serious Abraham Lincoln, sans beard; an affable Horace Greeley in stovepipe hat; and a dour picture of the suffragist Lucy Stone. A daguerreotype of Edgar Allan Poe, taken in 1848, depicts the writer with a baleful expression and crossed arms, and was taken not long before Poe was found delirious and near death on the streets of Baltimore.

But the daguerreotype did more than capture the posture of a poised citizenry. It also changed artists’ perceptions of human nature. Nathaniel Hawthorne’s 1851 Gothic romance, The House of the Seven Gables, has an ancient moral (“the wrong-doing of one generation lives into the successive ones”) but made use of a modern technology, daguerreotyping, to unspool its story about the unmasking of festering, latent evil. In the story, Holgrave, the strange lodger living in the gabled house, is a daguerreotypist (as well as a political radical) who says of his art: “While we give it credit only for depicting the merest surface, it actually brings out the secret character with a truth no painter would ever venture upon, even could he detect it.” It is Holgrave’s silvery daguerreotypes that eventually reveal the nefarious motives of Judge Pyncheon—and in so doing suggest that the camera could expose human character more acutely than the eye.

Oliver Wendell Holmes called the photo the “mirror with a memory,” and in 1859 predicted that the “image would become more important than the object itself and would in fact make the object disposable.” But praise for the photograph was not universal. “A revengeful God has given ear to the prayers of this multitude. Daguerre was his Messiah,” said the French poet Charles Baudelaire in an essay written in 1859. “Our squalid society rushed, Narcissus to a man, to gaze at its trivial image on a scrap of metal.” As a result, Baudelaire worried, “artistic genius” was being impoverished.

Contemporary critiques of photography have at times echoed Baudelaire’s fear. In her elegant extended essay, On Photography, the late Susan Sontag argues that images—particularly photographs—carry the risk of undermining true things and genuine experiences, as well as the danger of upending our understanding of art. “Knowing a great deal about what is in the world (art, catastrophe, the beauties of nature) through photographic images,” Sontag notes, “people are frequently disappointed, surprised, unmoved when they see the real thing.” This is not a new problem, of course; it plagued the art world when the printing process allowed the mass reproduction of great works of art, and its effects can still be seen whenever one overhears a museum-goer express disappointment that the Van Gogh he sees hanging on the wall is nowhere near as vibrant as the one on his coffee mug.

But Sontag’s point is broader, and suggests that photography has forced us to consider that exposure to images does not necessarily create understanding of the things themselves. Images do not necessarily lead to meaning; the information they convey does not always lead to knowledge. This is due in part to the fact that photographic images must constantly be refreshed if one’s attention is to continue to be drawn to them. “Photographs shock insofar as they show something novel,” Sontag argues. “Unfortunately, the ante keeps getting raised—partly through the very proliferation of such images of horror.” Images, Sontag concludes, have turned the world “into a department store or museum-without-walls,” a place where people “become customers or tourists of reality.”

Other contemporary critics, such as Roger Scruton, have also lamented this diversionary danger and worried about our potential dependence on images. “Photographic images, with their capacity for realization of fantasies, have a distracting character which requires masterly control if it is not to get out of hand,” Scruton writes. “People raised on such images ... inevitably require a need for them.” Marshall McLuhan, the Sixties media guru, offered perhaps the most blunt and apt metaphor for photography: he called it “the brothel-without-walls.” After all, he noted, the images of celebrities whose behavior we so avidly track “can be bought and hugged and thumbed more easily than public prostitutes”—and all for a greatly reduced price.

Nevertheless, photographs still retain some of the magical allure that the earliest daguerreotypes inspired. As W. J. T. Mitchell observes in What Do Pictures Want?, “When students scoff at the idea of a magical relation between a picture and what it represents, ask them to take a photograph of their mother and cut out the eyes.” As objects, our photographs have changed; they have become physically flimsier as they have become more technologically sophisticated. Daguerre produced pictures on copper plates; today many of our photographs never become tangible things, but instead remain filed away on computers and cameras, part of the digital ether that envelops the modern world. At the same time, our patience for the creation of images has also eroded. Children today are used to being tracked from birth by digital cameras and video recorders and they expect to see the results of their poses and performances instantly. “Let me see,” a child says, when you take her picture with a digital camera. And she does, immediately. The space between life as it is being lived and life as it is being displayed shrinks to a mere second. Yet, despite these technical developments, photographs remain powerful because they are reminders of the people and things we care about. They are surrogates carried into battle by a soldier or by a traveler on holiday. They exist to remind us of the absent, the beloved, and the dead. But in the new era of the digital image, they also have a greater potential for fostering falsehood and trickery, perpetuating fictions that seem so real we cannot tell the difference.

Vanishing Commissars and Bloodthirsty Presidents

Human nature being what it is, little time passed after photography’s invention before a means for altering and falsifying photographs was developed. A German photographer in the 1840s discovered a way to retouch negatives, Susan Sontag recounts, and, perversely if not unpredictably, “the news that the camera could lie made getting photographed much more popular.”

One of the most successful mass manipulators of the photographic image was Stalin. As David King recounts in his riveting book, The Commissar Vanishes: The Falsification of Photographs and Art in Stalin’s Russia, image manipulation was the extension of Stalin’s paranoiac megalomania. “The physical eradication of Stalin’s political opponents at the hands of the secret police was swiftly followed by their obliteration from all forms of pictorial existence,” King writes. Airbrush, India ink, and scalpel were all marshaled to remove enemies such as Trotsky from photographs. “There is hardly a publication from the Stalinist period that does not bear the scars of this political vandalism,” King concludes.

Even in non-authoritarian societies, early photo falsification was commonly used to dupe the masses. A new exhibit at the Metropolitan Museum of Art in New York, “The Perfect Medium: Photography and the Occult,” displays a range of photographs from the late-nineteenth- and early-twentieth-century United States and Europe that purport to show ghosts, levitating mediums, and a motley array of other emanations that were proffered as evidence of the spirit world by devotees of the spiritualism movement popular at the time. The pictures, which include images of tiny heads shrouded in smoke and hovering over the furrowed brows of mediums, and ghosts in diaphanous robes walking through gardens, are “by turns spooky, beautiful, disturbing, and hilarious,” notes the New York Times. They create “visual records of decades of fraud, cons, flimflams and gullibility.”

Stalin and the spiritualists were not the only people to manipulate images in the service of reconstructing the past—many an angry ex-lover has taken shears to photos of a once-beloved in the hope that excising the images might also excise the bad memories the images prompt. But it was the debut of a computer program called Photoshop in 1990 that allowed the masses, inexpensively and easily, to begin rewriting visual history. Photoshop and the many copycat programs that have followed in its wake allow users to manipulate digital images with great ease—resizing, changing scale, and airbrushing flaws, among other things—and they have been both denounced for facilitating the death of the old-fashioned darkroom and hailed as democratic tools for free expression. “It’s the inevitable consequence of the democratization of technology,” John Knoll, the inventor of Photoshop, has said. “You give people a tool, but you can’t really control what they do with it.”

For some people, of course, offering Photoshop as a tool is akin to giving a stick of dynamite to a toddler. Last year, The Nation published an advertisement that used Photoshop to superimpose President Bush’s head over the image of a brutal and disturbing Richard Serra sculpture (which itself borrows from Goya’s painting, “Saturn Devouring One of His Children”) so that Bush appeared to be enthusiastically devouring a naked human torso. In contrast to the sickening image, the accompanying text appears almost prim. As this and other images suggest, Photoshop has introduced a new fecklessness into our relationship with the image. We tend to lose respect for things we can manipulate. And when we can so readily manipulate images—even images of presidents or loved ones—we contribute to the decline of respect for what the image represents.

Photoshop is popular not only because it allows us visually to settle scores, but also because it appeals to our desire for the incongruous (and the ribald). “Photoshop contests” such as those found on dedicated contest websites offer people the opportunity to create wacky and fantastic images that are then judged by others in cyberspace. This is an impulse that predates software and whose most enthusiastic American purveyor was, perhaps, P. T. Barnum. In the nineteenth century, Barnum barkered an infamous “mermaid woman” that was actually the moldering head of a monkey stitched onto the body of a fish. Photoshop allows us to employ pixels rather than taxidermy to achieve such fantasies, but the motivation for creating them is the same—they are a form of wish fulfillment and, at times, a vehicle for reinforcing our existing prejudices.

Of course, Photoshop meddling is not the only tactic available for producing misleading images. Magazines routinely airbrushed and retouched photographs long before picture-editing software was invented. And of course even “authentic” pictures can be staged, like the 1960s Life magazine pictures of Muhammad Ali that showed him training underwater; in fact, Ali couldn’t even swim, and he hadn’t done any underwater training for his prizefights before stepping into the pool for that photo opportunity. More recently, in July 2005, the New York Times Magazine raised eyebrows when it failed to disclose that the Andres Serrano photographs accompanying a cover story about prisoner interrogation were in fact staged images rather than straightforward photojournalism. (Serrano was already infamous for his controversial 1989 photograph, “Piss Christ.”) The Times public editor chastised the magazine for violating the paper’s guidelines that “images in our pages that purport to depict reality must be genuine in every way.”

But while Photoshop did not invent image fraud, it has made us all potential practitioners. It enables the average computer user to become a digital prankster whose merrymaking with photographs can create more than silly images—it can spawn political and social controversy. In a well-reported article published in 2004, Farhad Manjoo explored in depth one such controversy: an image that purportedly showed an American Marine reservist in Iraq standing next to two young boys. One boy held a cardboard sign that read, “Lcpl Boudreaux killed my Dad then he knocked up my sister!” When the image found its way to the Council on American-Islamic Relations (CAIR), Manjoo reports, it seemed to prove the group’s worst fears about the behavior of American soldiers in Iraq. An angry press release soon followed. But then another image surfaced on various websites, identical to the first except for the text written on the cardboard sign, which now read, “Lcpl Boudreaux saved my Dad then he rescued my sister!” The authenticity of both photos was never satisfactorily proven, and, as Manjoo notes, the episode serves as a reminder that in today’s Photoshop world, “pictures are endlessly pliable.” (Interestingly, CAIR found itself at the center of a recent Photoshop scandal, the Weekly Standard reported, when it was shown that the organization had Photoshopped a hijab, or headscarf, onto several women in a picture taken at a CAIR event and then posted the doctored image on the organization’s website.)

Just as political campaigns in the past produced vituperative pamphlets and slogans, today Photoshop helps produce misleading images. The Bush-Cheney campaign was pilloried for using a Photoshopped image of a crowd of soldiers in the recent presidential election; the photo duplicated groups of soldiers to make the crowd appear larger than it actually was. The replicated faces of the soldiers recalled an earlier and cruder montaged crowd scene, “Stalin and the Masses,” produced in 1930, which purported to show the glowering dictator, in overcoat and cap, standing before a throng of loyal communists. (Other political campaigns—and university publicity departments—have also reportedly resorted to using Photoshop on pictures to make them seem more racially diverse.) Similarly, an image that appeared to show a Seventies-era Jane Fonda addressing an anti-war crowd with a young and raptly admiring John Kerry looking on was in fact the product of Photoshop sorcery, yet it circulated widely on the Internet during the last presidential election as evidence of Kerry’s extreme views. The doctored image fooled several news outlets before its questionable provenance was revealed. (Another image of Kerry and Fonda, showing them both sitting in the audience at a 1970 anti-war rally, was authentic.)

Photoshop, in effect, democratizes the ability to commit fraud. As a result, a few computer programmers are creating new digital detection techniques to uncover forgeries and manipulations. The Inspector Javert of digital fraud is Dartmouth computer science professor Hany Farid, who developed a software program that analyzes the pattern of pixels in digital images. Since all digital pictures are, in essence, a collection of codes, Farid’s program ferrets out “abnormal patterns of information that, while invisible to the eye, are detectable by computer” and that represent possible tampering, according to the New York Times. “It used to be that you had a photograph, and that was the end of it—that was truth,” Farid said last July. “We’re trying to bring some of that back. To put some measure of guarantee back in photography.”
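Farid’s actual software works on the statistical fingerprints left by resampling and recompression, but the underlying idea—that tampering leaves regularities a computer can find even when the eye cannot—can be illustrated with a far simpler forensic technique. The sketch below is a toy example, not Farid’s method: it flags “copy-move” forgery of the kind seen in the duplicated-soldiers campaign photo. If a region of an image has been cloned elsewhere, identical pixel blocks appear in two places, and hashing every small block exposes the collision. (The function name and the tiny synthetic “image” are invented for illustration.)

```python
# Toy copy-move forgery detector: hash every block x block pixel
# neighborhood and report any neighborhood that occurs more than once.
# Real forensic tools match blocks approximately (cloned regions are
# usually blurred or recompressed); exact matching keeps the idea clear.

from collections import defaultdict

def find_duplicate_blocks(image, block=2):
    """Return groups of top-left (row, col) coordinates whose
    block x block pixel neighborhoods are identical."""
    h, w = len(image), len(image[0])
    seen = defaultdict(list)
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            key = tuple(
                tuple(image[y + dy][x + dx] for dx in range(block))
                for dy in range(block)
            )
            seen[key].append((y, x))
    return [locs for locs in seen.values() if len(locs) > 1]

# A 4x6 grayscale "image" in which the 2x2 patch at (0, 0) has been
# cloned to (2, 4); the detector flags the repeated block.
img = [
    [10, 20,  1,  2,  3,  4],
    [30, 40,  5,  6,  7,  8],
    [ 9, 11, 12, 13, 10, 20],
    [14, 15, 16, 17, 30, 40],
]
dupes = find_duplicate_blocks(img)  # → [[(0, 0), (2, 4)]]
```

In a real photograph the blocks would be larger and compared by quantized features rather than raw bytes, but the principle is the same one Farid describes: the forgery is invisible to the eye yet leaves a pattern a machine can detect.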

But the digital manipulation of images can also be employed for far more enlightened purposes than removing models’ blemishes and attacking political opponents. Some artists use Photoshop merely to enhance photographs they take; others have made digital editing a central part of their art. The expansive images of the German photographer Andreas Gursky, whose photos of Montparnasse, the Tokyo Stock Exchange, and a 99-cent store make use of digital alteration, prompt us to look at familiar spaces in unfamiliar ways. The portraits taken and Photoshopped by artist Loretta Lux are “mesmerizing images of children who seem trapped between the nineteenth and twenty-first centuries, who don’t exist except in the magical realm of art,” according to a New York Times critic. Here the manipulation of the image does not intrude. It illuminates. In these pictures, the manipulation of the image at least serves an authentic artistic vision, a vision that relies on genuine aesthetic and critical standards. Ironically, it is these very standards that a culture devoted to the image risks compromising.

The MTV Effect

The still images of daguerreotyping and photography laid the groundwork for the moving image in film and video; as photography did before them, these technologies prompted wonder and sweeping claims about the merits of this new way of seeing. In 1915, after a screening of filmmaker D. W. Griffith’s The Birth of a Nation, Woodrow Wilson declared that it was “like writing history with lightning” (a judgment Griffith promptly began using in his promotional efforts for the film). Moving images are as powerful as photos, if not more so. Like photographs, they appeal to emotion and can be read in competing ways. Yet moving images change so rapidly and so often that they arrest our attention and task the brain’s ability to absorb what we are seeing. They are becoming a ubiquitous presence in public and private life—so much so that Camille Paglia, an astute critic of images, has called our world “a media starscape of explosive but evanescent images.”

The moving image, like the photograph, can also be marshaled to prove or disprove competing claims. During the legal and political debate surrounding the case of Terri Schiavo, for example, videotape of her movements and apparent responsiveness to loved ones became central in this family dispute-turned-national drama. Those who argued for keeping Schiavo alive used the footage as evidence that she did indeed have feelings and thoughts that rendered attempts to remove her feeding tube barbaric and immoral. Those who believed that she should be left to die (including her husband) thought the tape “grossly deceptive,” because it represented a misleading portrait of Schiavo’s real condition. Most of the time, her husband and others argued, Terri did not demonstrate awareness; she was “immobile, expressionless.” In the Schiavo case, the moving image was both alibi and accuser.

Most Americans consume moving images through the media of television and movies (and, to a lesser degree, through the Internet and video games). In recent years, in what many observers have called “the MTV effect,” those moving images have become more nimble and less demanding of our attention. Jumping quickly from image to image in hastily edited segments (in some cases as quickly as one image every one-thirtieth of a second), television and, to a lesser extent, movies offer us a constant stream of visual candy. Former Vice President Al Gore’s new for-profit public access television channel, Current TV, is the latest expression of this trend. The network’s website lists its upcoming programming in tiny time increments: “In 1 min,” “In 3 min,” “In 10 min,” and so on. Reviewing the channel’s first few broadcasts, New York Times television critic Alessandra Stanley noted the many techniques “designed to hold short attention spans,” including a “progress bar” at the bottom of the screen that counts down how much time is left for each of the segments—some of which last as little as 15 seconds.

According to enthusiasts of television, the speed and sophistication of moving images allow new and improved forms of oral storytelling that can and should replace staler vehicles like the novel. Video game and television apologist Steven Johnson, author of Everything Bad is Good for You, dreams of a world of “DVD cases lining living room shelves like so many triple-decker novels.” But if television is our new form of narrative, then our storytelling skills have declined, as anyone who has watched the new raft of sitcoms and dramas that premiere (and then quickly disappear) each fall on the major networks can attest. (Shows like The Sopranos are perhaps the rare exception.) In fact, television doesn’t really “tell stories.” It constructs fantasy worlds through a combination of images and words, relying more on our visual and aural senses and leaving less to the imagination than oral storytelling does. Writing some years ago in the journal Media & Values, J. Francis Davis noted that although television is in one sense a form of storytelling, the most important messages that emanate from the screen “are those not verbalized—the stories and myths hidden in its constant flow of images.”

It is precisely those hidden stories in the moving image that excite critics like NYU professor Mitchell Stephens. In The Rise of the Image, The Fall of the Word, Stephens argues that the moving image offers a potential cure for the “crisis of the spirit” that afflicts our society, and he is enthusiastic about the fact that “the image is replacing the word as the predominant means of mental transport.” Stephens envisions a future of learning through synecdoche, using vivid and condensed images: “A half second of the Capitol may be enough to indicate the federal government, a quick shot of a white-haired woman may represent age. The part, in other words, will be substituted for the whole so that in a given period of time it will be possible to consider a larger number of wholes.” He quotes approvingly the prediction of movie director Ridley Scott, who declares: “Film is twentieth-century theater, and it will become twenty-first-century writing.”

Perhaps it will. But Stephens, like other boosters of the image, fails to acknowledge what we will lose as well as gain if this revolution succeeds. He says, for example, “our descendants undoubtedly will still learn to read and write, but they undoubtedly will read and write less often and, therefore, less well.” Language, too, will be “less precise, less subtle,” and books “will maintain a small, elite audience.” This, then, is the future that prompts celebration: a world where, after a century’s effort to make literacy as broadly accessible as possible—to make it a tool for the masses—the ability to read and write is once again returned to the elite. Reading and writing either become what they were before widespread education—a mark of privilege—or else antiquarian preoccupations or mere hobbies, like coin collecting.

Stephens also assumes that the people who will be absorbing these images will have a store of knowledge at their disposal with which to interpret them. A quick shot of a white-haired woman might effectively be absorbed as symbolizing “age” to one person, as Stephens says, but it could also reasonably prompt ideas such as “hair dye,” “feebleness,” or “Social Security” to another. As Camille Paglia observes of her own students, “young people today are flooded with disconnected images but lack a sympathetic instrument to analyze them as well as a historical frame of reference in which to situate them.” They lack, in other words, a shared language or lexicon that would allow them to interpret images and then communicate an understanding of what they are seeing.

Such a deficit will pose a unique challenge for cultural transmission from one generation to the next. How, in Stephens’s future world of the moving image, will history, literature, and art be passed down to the next generation? He might envision classrooms where children watch the History Channel rather than pore over dull textbooks. But no matter how much one might enjoy the BBC’s televised version of Pride and Prejudice, it is no substitute for actually reading Austen’s prose, nor is a documentary about the American Constitutional Convention as effective at distilling the political ideals of the early American republic as reading The Federalist Papers. Moving images are a rich aid to learning and understanding, but their victory as the best means of forming rigorous habits of mind is by no means assured.

In addition, Stephens accepts uncritically the claim that the “old days” of written and printed culture are gone (or nearly so) and assumes that video is the language that has emerged, like some species evolving through a process of natural selection, to take its place in the culture. He does not entertain the possibility that the reason the moving image is replacing the written word is not because it is, in fact, a superior form for the communication of ideas, but because the moving image—more so than the written word—crudely but intoxicatingly satisfies our desire for stimulation and immediate gratification.

Like any good techno-enthusiast, Stephens takes the choices that we have made en masse as a culture (such as watching television rather than reading), accepts them without challenge, and then declares them inevitable. This is a form of reasoning that techno-enthusiasts often employ when they attempt to engage the concerns of skeptics. Although rhetorically useful in the short-term, this strategy avoids the real questions: Did things have to happen this way rather than that way? Does every cultural trend make a culture genuinely better? By neglecting to ask these questions, the enthusiast becomes nearly Panglossian in his hymns to his new world.

There is, of course, a long and thorough literature critical of television and the moving image, most notably the work of Neil Postman, Jerry Mander, and Marie Winn. And as with photography, from its earliest days there have been those who worried that television might undermine our appreciation for true things. “Television hangs on the questionable theory that whatever happens anywhere should be sensed everywhere,” E. B. White wrote in The New Yorker in 1948. “If everyone is going to be able to see everything, in the long run all sights may lose whatever rarity value they once possessed, and it may well turn out that people, being able to see and hear practically everything, will be specially interested in almost nothing.” Others are even blunter. As Roger Scruton writes, “Observing the products of the video culture you come to see why the Greeks insisted that actors wear masks, and that all violence take place behind the scenes.” It is possible, in other words, to see too much, and in the seeing lose our grasp on what is real. Television is the perfect vehicle for this experience, since it bombards us with shocking, stimulating, and pleasant images, all the while keeping us at a safe remove from what we are seeing.

But the power the moving image now exercises over modern American life has grown considerably in recent years. It is as if the Jumbotron television screen that looms over Times Square in New York has replicated and installed itself permanently in public space. Large screens broadcasting any number of images and advertisements can be found in most sports arenas, restaurants, and shopping malls; they even appear in a growing number of larger churches. The dentist’s and doctor’s offices are no longer safe havens from a barrage of images and sounds. A walk through an airport terminal is now a gauntlet of moving images, as televisions bolted into ceilings or walls blare vacuous segments from CNN’s dedicated “airport programming”; once on board a plane, we’re treated to nonstop displays of movies and TV options like “NBC In Flight.” The ubiquity of television sets in public space is often explained as an attempt to entertain and distract, but in fact it seems more successful at annoying or anesthetizing us. For people who wish to travel, eat, or pray in silence, there are few options beyond the deliciously subversive “TV-B-Gone” device, a universal remote control the size of a key chain that allows users to turn off televisions in public places. Considering the number of televisions currently in use, however, it would take an army of TV-B-Gone users to restore peace and quiet in public space.

One of the more startling developments in recent years is the moving image’s interjection into the classical concert hall. In 2004, the New York Philharmonic experimented with a 15-by-20-foot screen that projected enormous images of the musicians and conductor to the audience during performances of Wagner and Brahms. The orchestra trustee who encouraged the project was blunt about his motivation: “We want to increase attendance at concerts, change the demographics,” he told the New York Times. “And the younger generation is more responsive to visual stimuli.” A classical music industry consultant echoed the sentiment. “We have to recognize that this is a visual generation,” he said. “They are used to seeing things more than they are used to hearing things.” Symphonies in Vancouver, San Diego, Omaha, Atlanta, and Philadelphia have all tried using moving images during concerts, and some orchestras are resorting to gimmicks such as projecting works of art during performances of Mussorgsky’s “Pictures at an Exhibition,” or broadcasting images of space during Holst’s “The Planets.”

Among those less than pleased with the triumph of the moving image in the concert hall are the musicians themselves, who are haplessly being transformed into video stars. “I found it very distracting,” a violinist with the New York Philharmonic said. “People might as well stay home with their big-screen TVs,” said another resignedly. “It’s going the route of MTV, and I’m not sure it’s the way to go.” What these musicians are expressing is a concern for the eclipse of their music, which often requires discipline and concentration to appreciate, by imagery. The images, flashing across a large screen above their heads, demand far less of their audience’s active attention than the complicated notes and chords, rhythms and patterns, coming from their instruments. The capitulation of the concert hall to the moving image suggests that in an image-based culture, art will only be valuable insofar as it can be marketed as entertainment. The moving image redefines all other forms of expression in its image, often leaving us impoverished in the process.

Brain Candy

Concern about the long-term effects of being saturated by moving images is not merely the expression of quasi-Luddite angst or cultural conservatism. It has a basis in what the neurosciences are teaching us about the brain and how it processes images. Images can have a profound physiological impact on those who view them. Dr. Steven Most, a postdoctoral fellow at Yale University, recently found that graphic images can “blind” us by briefly impairing the brain, often for as long as one-fifth of a second. As his fellow researcher explained to Discovery News: “Brain mechanisms that help us to attend to things become tied up by the provocative image, unable to orient to other stimuli.”

Another study by researchers at the Center for Cognitive Science at Ohio State University found that, for young children, sound was actually more riveting than images—overwhelmingly so, in some cases. The research findings, which were published in Child Development, showed that “children seem to be able to process only one type of stimuli at a time” and that “for infants, sounds are preferred almost exclusively,” a preference that continues up until at least age four. In their book Imagination and Play in the Electronic Age, Dorothy and Jerome Singer argue that “the electronic media of television, film and video games now may contribute to the child’s development of an autonomous ongoing consciousness but with particular constraints. Looking and listening alone without other sensory inducements,” they write, “can be misleading guides to action.”

Research into the function of the primary visual cortex region of the brain suggests that it is not alarmist to assume that constant visual stimulation of the sort broadcast on television might have profound effects on the brains of children, whose neurological function continues to develop throughout childhood and adolescence. One study, conducted at the University of Rochester and published in the journal Nature in 2004, involved, weirdly enough, tracking the visual processing patterns of ferrets that were forced to watch the movie The Matrix. The researchers found some surprising things: The adult ferrets “had neural patterns in their visual cortex that correlated very well with images they viewed,” according to a summary of the research, “but that correlation didn’t exist at all in very young ferrets, suggesting the very basis of comprehending vision may be a very different task for young brains versus old brains.” The younger ferrets were “taking in and processing visual stimuli” just like the adult ferrets, but they were “not processing the stimuli in a way that reflects reality.”

These kinds of findings have led to warnings about the long-term negative impact of moving images on young minds. A study published in 2004 in the journal Pediatrics, for example, found a clear link between early television viewing and later problems such as attention deficit/hyperactivity disorder, and recent research has suggested troubling, near-term effects on behavior for young players of violent video games. In short: Moving images—ubiquitous in homes and public spaces—pose challenges to healthy development when they become the primary object of children’s attention. Inculcating the young into the image culture may be bad for their brains.

The Closing of the PowerPoint Mind

A culture that raises its children on the milk of the moving image should not be surprised when they prove unwilling to wean themselves from it as adults. Nowhere is the evidence of this more apparent than in the business world, which has become enamored of and obedient to a particular image technology: the computer software program PowerPoint.

PowerPoint, a program included in the popular “Microsoft Office” suite of software, allows users to create visual presentations using slide templates and graphics that can be projected from a computer onto a larger screen for an audience’s benefit. The addition of an “AutoContent Wizard,” which is less a magician than an electronic duenna, helpfully ushers the user through an array of existing templates, suggesting bullet points and summaries and images. Its ease of use has made PowerPoint a reliable and ubiquitous presence at board meetings and conferences worldwide.

In recent years, however, PowerPoint’s reach has extended beyond the business office. People have used PowerPoint slides at their wedding receptions to depict their courtship as a series of “priority points” and pictures. Elementary-school children are using the software to craft bullet-point-riddled book reports and class presentations. As a 2001 story in the New York Times reported, “69 percent of teachers who use Microsoft software use PowerPoint in their classrooms.”

Despite its widespread use, PowerPoint has spawned criticism almost from its inception, and has been called everything from a disaster to a virus. Some claim the program aids sophistry. As a chief scientist at Sun Microsystems put it: “It gives you a persuasive sheen of authenticity that can cover a complete lack of honesty.” Others have argued that it deadens discussion and allows presenters with little to say to cover up their ignorance with constantly flashing images and bullet points. Frustration with PowerPoint has grown so widespread that in 2003, the New Yorker published a cartoon that illustrated a typical job interview in hell. In it, the devil asks his applicant: “I need someone well versed in the art of torture—do you know PowerPoint?”

People subjected endlessly to PowerPoint presentations complain about its oddly chilling effect on thought and discussion and the way the constantly changing slides easily distract attention from the substance of a speaker’s presentation. These concerns prompted Scott McNealy, the chairman of Sun Microsystems, to forbid his employees from using PowerPoint in the late 1990s. But it was the exegesis of the PowerPoint mindset published by Yale emeritus professor Edward Tufte in 2003 that remains the most thorough challenge to this image-heavy, analytically weak technology. In a slim pamphlet titled The Cognitive Style of PowerPoint, Tufte argued that PowerPoint’s dizzying array of templates and slides “weaken verbal and spatial reasoning, and almost always corrupt statistical analysis.” Because PowerPoint is “presenter-oriented” rather than content or audience-oriented, Tufte wrote, it fosters a “cognitive style” characterized by “foreshortening of evidence and thought, low spatial reasoning ... rapid temporal sequencing of thin information ... conspicuous decoration ... a preoccupation with format not content, [and] an attitude of commercialism that turns everything into a sales pitch.” PowerPoint, Tufte concluded, is “faux-analytical.”

Tufte’s criticism of PowerPoint made use of a tragic but effective example: the space shuttle Columbia disaster. When NASA engineers evaluated the safety of the shuttle, which had reached orbit but faced risks upon reentry due to tiles that had been damaged by loose foam during launch, they used PowerPoint slides to illustrate their reasoning, a choice that badly muddied their technical communication. The Columbia Accident Investigation Board later cited “the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA.” Rather than simply a tool that aids thought, PowerPoint changes the way we think, forcing us to express ourselves in terms of its own functionalities and protocols. As a result, we risk concluding that only that which can be said using PowerPoint is worth saying at all.

Pseudo-Events and Pseudo-Culture

Although PowerPoint had not yet been created when he published his book, The Image, in 1961, historian Daniel Boorstin was nevertheless prescient in his warnings about the dangers of a culture that entrusted its rational decision-making to the image. Boorstin argued that by elevating image over substance and form over content, society was at risk of substituting “pseudo-events” for real life and personal image-making for real virtue. (He described in detail new efforts to create public images for the famous and not-so-famous, a process well illustrated by a Canon Camera commercial of several years ago that featured tennis star Andre Agassi insouciantly stating, “Image is everything.”)

“The pseudo-events which flood our consciousness are neither true nor false in the old familiar senses,” Boorstin wrote, but they have created a world “where fantasy is more real than reality, where the image has more dignity than its original.” The result was a culture of “synthetic heroes, prefabricated tourist attractions, [and] homogenized interchangeable forms of art and literature.” Images were wildly popular, Boorstin conceded, but they were, in fact, little different from illusions. “We risk being the first people in history to have been able to make their illusions so vivid, so persuasive, so ‘realistic’ that they can live in them,” he wrote.

Other critics followed Boorstin. In The Disappearance of Childhood, Neil Postman wrote about the way the “electronic and graphic revolutions” launched an “uncoordinated but powerful assault on language and literacy, a recasting of the world of ideas into speed-of-light icons and images.” Images, Postman worried, “ask us to feel, not to think.” French critic Roland Barthes fretted that “the image no longer illustrates the words; it is now the words which, structurally, are parasitic on the image.” In a more recent iteration of the same idea, technology critic Paul Virilio identified a “great threat to the word” in the “evocative power of the screen.” “It is real time that threatens writing,” he noted, “once the image is live, there is a conflict between deferred time and real time, and in this there is a serious threat to writing and to the author.”

Real events are now compared to those of sitcom characters; real tragedies or accidents are described as being “just like a movie” (a practice Susan Sontag first noticed in the 1970s). Even the imagination is often crippled by our image-based culture. For every creative artist (like Gursky) using Photoshop, there are many posturing, shallow artists like Damien Hirst, who once proudly told an interviewer that he spent more time “watching TV than ever I did in the galleries.”

Is it possible to find a balance between naïve techno-enthusiasm for the image culture and the “spirit of bulldog opacity,” as McLuhan described it, which fueled undue skepticism about new technologies in the past? Perhaps devotees of the written word will eventually form a dwindling guild, pensioned off by universities and governments and think tanks to live out their days in quiet obscurity as the purveyors of the image culture expand their reach. But concern about a culture of the image has a rich history, and neither side can yet claim victory. In the preface to the second edition (1843) of his book The Essence of Christianity, Feuerbach complained that his own era “prefers the image to the thing, the copy to the original, the representation to the reality, appearance to being.”

Techno-enthusiasts are fond of reminding us, as if relating a quaint tale of reason’s triumph over superstition, that new technologies have always stirred controversy. The printing press unnerved the scholastic philosophers and religious scribes whose lives were paced to the tempo of the manuscript; later, the telephone was indicted by a cadre fearful of its threat to conviviality and face-to-face communication, and so on. The laborious copiers of manuscripts did indeed fear the printing press, and some traditionalists did vigorously resist the intrusions of the telephone. But at a time of great social hierarchy, much of this was driven by an elite disdain for the democratizing influence of these technologies and their potential for overturning social conventions (which indeed many of them did). Contemporary criticism of our image-saturated culture is not criticism of the means by which we create images (cameras, television, video). No one would seriously argue for the elimination of such technologies, as those who feared Gutenberg’s invention did when they destroyed printing presses. The critique is an expression of concern about the ends of an image-based culture, and our unwillingness as yet to consider whether those ends might be what we truly want for our society.

Nor is concern about the image culture merely a fear of losing our grip on what is familiar—that known world with its long history of reliance on the printed word. Those copyists who feared the printing press were not wrong to believe that it would render them obsolete. It did. But contemporary critics who question the proliferation of images in culture and who fear that the sheer number of images will undermine the sensibility that creates readers of the written word (replacing them with clever but shallow interpreters of the image) aren’t worried about being usurped by image-makers. They are motivated largely by the hope of preserving what is left of their craft. They are more like the conservationist who has made the forest his home only to discover, to his surprise, that the animals with which he shares it are rapidly dwindling in number. What he wants to know, in his perplexed state, is not “how do I retreat deeper into the forest?” but “how might I preserve the few survivors before all record of them is lost?”

So it is with those who resist an image-based culture. As its boosters suggest, it is here to stay, and likely to grow more powerful as time goes on, making all of us virtual flâneurs strolling down boulevards filled with digital images and moving pictures. We will, of course, be enormously entertained by these images, and many of them will tell us stories in new and exciting ways. At the same time, however, we will have lost something profound: the ability to marshal words to describe the ambiguities of life and the sources of our ideas; the possibility of conveying to others, with the subtlety, precision, and poetry of the written word, why particular events or people affect us as they do; and the capacity, through language, to distill the deeper meaning of common experience. We will become a society of a million pictures without much memory, a society that looks forward every second to an immediate replication of what it has just done, but one that does not sustain the difficult labor of transmitting culture from one generation to the next.

Christine Rosen is a senior editor of The New Atlantis and resident fellow at the Ethics and Public Policy Center. Her new book My Fundamentalist Education: A Memoir of a Divine Girlhood will be published by PublicAffairs Books in January 2006.

Christine Rosen, "The Image Culture," The New Atlantis, Number 10, Fall 2005, pp. 27-46.