Friday, December 02, 2005

The 7 Habits of Highly Successful People.

Parents, back off - redux.

The problem with paranoid parents

A modern child's life is filled with unnecessary monitoring and mollycoddling from over-protective parents, says writer John O'Farrell, who sets out his opinion in the BBC Two series Backlash.

The physics of parenthood has exploded.

Once the kids were satellites orbiting around the parents; now the centre of the universe is the child.

Mothers feel guilty leaving their children to watch television on their own, so sit down and watch Pingu beside them, wasting valuable time that could be far better spent sitting in the kitchen smoking and doing Su Doku puzzles.

Parents volunteer to go in and read in the classroom, when all they really want to do is spy on the teachers and be with their precious ones during school hours as well.

When I was a child my parents did not surrender their dignity by wallowing around in ball pits.

They went to the pub and left me and my brother on our own fighting in the car.

Sitting in that pub car park taught me important lessons. I learnt what happens when you release the hand-brake on a hill. But of course I also used that time to read. I can still quote the AA Members handbook from 1968.

Lost confidence

Just as you see toddlers being restrained by those ludicrous safety reins, modern parents are wearing invisible reins that hold them back from doing what ought to come naturally.

Manuals are consulted, diet fads are imposed, each scare story in the tabloids has parents changing the regime under which their kids are being brought up.

Parents have lost the confidence to trust themselves or others. Fear has become the dominant emotion - both the fear of something happening and fear of nothing happening to them; the terror that their children might be ordinary.

And so every second of the modern child's life is time-tabled and monitored.

Children are strapped into the back of 4x4s and whisked from this tutor to that, and if there are a few minutes of mucking about in the park, the play is under the constant supervision of the Meerkat Mums.

So children are never bored, they never learn how to fill their own time, they never discover things for themselves.

I am in favour of children being bored. In fact I think we need a Boredom Tsar (I suggest my old geography teacher).

Learning responsibility

And although the children are in no danger of falling from the climbing frame (because both parents are underneath with their arms outstretched waiting to catch them), we have no idea what damage is being done inside.

Children are being denied the chance to learn initiative and independence; they are not learning to take responsibility for their own actions.

In 30 years' time the prime minister will be saying: "Mum, can you do this for me?"

We should force ourselves to set our children free. They should walk to school on their own, go to the park with their mates and kick a ball about and climb trees that do not have rubber matting underneath.

The trouble is, we have made children so paranoid that if anyone suggested this to them, the kids would run a mile.

Or rather their parents would drive them.

John O'Farrell presents Backlash: Paranoid Parents, to be shown on BBC Two on Saturday, 3 December, 2005, at 1840 GMT.

From musical darling to media mogul.

November 28, 2005

David Carr

Media Age Business Tips From U2

In pop culture, nothing lasts forever. But U2 is coming close.

On the surface, the formula U2 used to send 20,000 fans into sing-along rapture at Madison Square Garden last Tuesday night was as old as rock 'n' roll: four blokes, three instruments, a bunch of good songs. Add fans, cue monstrous sound system, light fuse and back away.

But that does not explain why, 25 years in, four million people will attend 130 sold-out shows this year and next that will gross over $300 million, or why their most recent album, "How to Dismantle an Atomic Bomb," has already sold eight million copies.

For that, you have to look at U2 less as a band than as a multimillion-dollar, multinational media company, one of the smarter ones around.

"We always said it would be pathetic to be good at the music and bad at the business," said Paul McGuinness, the band's manager since the beginning. And while U2 hasn't become a Harvard Business School case study (at least not yet) it offers an object lesson in how media can connect with their customers.

MEET THE CONSUMERS WHERE THEY LIVE For years, the U2 fanzine Propaganda was used to feed the tribe. The band's Web presence was restricted to temporary sites for specific tours. But in 2000, U2 opened an extensive Web site, with an index to every song and album, lyrics, tour news that is refreshed nightly, and subscriber features - for those die-hards willing to part with $40 - that allow them access to tickets, exclusive content and streaming downloads of every song and video the band has ever made.

APOLOGIZE, THEN MOVE ON With the Vertigo tour, it became apparent that some of those fans who had paid good money to join U2's Web site had been elbowed aside by scalpers in the scrum for tickets. The band's response was to apologize immediately and promise to do better.

"The idea that our longtime U2 fans and scalpers competed for U2 tickets through our own Web site is appalling to me," the drummer Larry Mullen wrote in a statement issued by the band as soon as the problem arose. "I want to apologize to you who have suffered that."

EMBRACE TECHNOLOGY While other big acts were scolding and threatening fans for downloading music or, in the case of Metallica, suing Napster, U2 was busy working on a new business model.

A collaboration with Apple yielded a U2 special edition iPod that was a smash hit and gave visibility to the band at a time when most radio station playlists don't extend much beyond a narrow selection of pop singers. With iTunes, U2 produced what may be the industry's first downloadable version of a box set, offering the band's entire musical history for $149.

"We thought it was an opportunity to be taken with both hands," said Mr. McGuinness. Contrast that statement with anything from Hollywood on digital technology in the last three years.

DON'T EMBARRASS YOUR FANS Sure, U2 has recorded some clunkers (1997's "Pop" comes to mind), but the band works and reworks material until it has a whole album's worth of songs, no filler. Last Tuesday, the band played at least four of the songs from the current album, giving the songs a shot at entering the pantheon and affirming U2's status as a contemporary band, not a guilty pleasure or retro musical act that covers its own earlier greatness. (Quick, what's the last Rolling Stones album?)

"Don't embarrass your fans," Bono told The New York Times last year. "They've given you a good life."

BE CAREFUL HOW YOU SELL OUT U2 has been offered as much as $25 million to allow a song to be used in a car commercial. No dice. They traded brands, not money, with Apple. Bob Dylan may wander around in a Victoria's Secret ad and The Who will rent "My Generation" to anybody with the wherewithal, but the only thing U2's music sells is U2. Just because money will fold and go in someone's pocket - The New Yorker publishing ads illustrated by its cartoonists comes to mind - does not mean taking it will be beneficial over the long haul.

EMBRACE POLITICIANS, NOT POLITICS I watched Bono, during the Republican Convention last year, hold Bill O'Reilly of Fox News rapt with a lengthy discussion of AIDS in Africa. Last summer, he posed for a photograph with President Bush, congratulating him for the work his administration had done for Africa.

"Their credibility is very strong," said Gary Bongiovanni, editor in chief of Pollstar, a trade magazine covering the concert industry. "I don't think there is anybody who doesn't believe that they are sincere in what they are doing."

(Bono came close to jumping the shark by donning a blindfold and miming a prison torture scene during "Bullet the Blue Sky," the band's fatwa against United States military intervention, and then saying at the end of the song, "This is dedicated to the brave men and women of the U.S. military." Which of these things, Bono?)

IT'S CALLED SHOW BUSINESS FOR A REASON In 1980, I was standing with my sister at the First Avenue bar in Minneapolis watching a then little-known band from Dublin take the stage. The Edge, the band's lead guitarist, kicked into a chiming, ringing salute, the opening chords of "I Will Follow." Bono ambled out, absently drinking a glass of water, and when the drummer kicked in, he tossed the water into the lights above him, a mist enshrouding him - and us - as he stepped to the mike.

Much theatrical and musical combustion ensued, on that night and in the decades since. The current show is a testament to reinvestment, with a huge lighting and stage structure that managed to make Madison Square Garden seem like a cozy church, the backdrop for a secular sacrament. The Vertigo tour included seven curtains of lights, consisting of 12,000 individual bulbs, and a heart-shaped runway that may have wiped out a few hundred prime seats, but allowed thousands more to feel engaged as The Edge and Bono strode out along it during songs.

SEIZE THE MOMENT, BUT DON'T STEAL IT For years, U2 declined invitations to play at the Super Bowl, but the first one held after the attacks of Sept. 11 had special significance. Bono, in the middle of singing "Beautiful Day," slyly opened his coat to hundreds of millions of viewers and revealed it was lined with the American flag. The band folded industrial and electronic motifs into their music in the 90's to give currency to their sound and then promptly stripped it down for the current tour. Not every gesture and instinct resonates: Let's not forget Bono's decision to go with a mullet in the mid-80's.

AIM HIGH As the central icon in the Church of the Upraised Fist - a temporary concert nation of gesturing frat boys, downloading adolescents and aging rockers reliving past glories - Bono can command his audience to do anything. During the concert last Tuesday, Bono asked the audience to send, via text message, their full names to One, an organization that fights AIDS and global poverty. They happily complied and their names were flashed on screen between encores. MTV's "Total Request Live" may attract a wider audience, but its members probably aren't made to think they are part of something bigger.

Copyright 2005 The New York Times Company

Complex interaction as a source of sustainable advantage.

Or more specifically, those interactions that require tacit expertise to simplify apparently complex situations offer the greatest opportunity to create competitive bulwarks.

---

The next revolution in interactions

Successful efforts to exploit the growing importance of complex interactions could well generate durable competitive advantages.

Bradford C. Johnson, James M. Manyika, and Lareina A. Yee

McKinsey Quarterly 2005 Number 4

---

An introductory note

Scott C. Beardsley, James M. Manyika, and Roger P. Roberts

Economists have long tended to describe the critical shifts in the European and North American labor markets over the past 200 years as movements between broad sectors—from agricultural to industrial jobs and from manufacturing to service ones. While this assessment is certainly true, the big picture obscures important nuances in what workers and professionals actually do. The finer details of the employment landscape hold important lessons for the way companies organize to manage their talent and technology, for competition within industries, and for public policy in developed nations.

In today's developed economies, the significant nuances in employment concern interactions: the searching, monitoring, and coordinating required to manage the exchange of goods and services. Since 1997, extensive McKinsey research on jobs in many industries has revealed that globalization, specialization, and new technologies are making interactions far more pervasive in developed economies. Currently, jobs that involve participating in interactions rather than extracting raw materials or making finished goods account for more than 80 percent of all employment in the United States. And jobs involving the most complex type of interactions—those requiring employees to analyze information, grapple with ambiguity, and solve problems—make up the fastest-growing segment.

This shift toward more complex interactions has dramatic implications for how companies organize and operate. In the mid-1990s, McKinsey studied the growing impact of interactions on the way people exchange ideas and information and how businesses cooperate or compete. In 1997, "A revolution in interaction" presented the findings of that research.

Over this past year, we looked closely at different kinds of interactions. Companies in many sectors are hiring additional employees for more complex interactions and fewer employees for less complex ones. For instance, frontline managers and nurses—who must exercise high levels of judgment and often draw on what economists call tacit knowledge, or experience—are in great demand. Workers who perform more routine interactions, such as clerical tasks, are less sought after. In fact, companies have been automating and outsourcing jobs that involve many of these transactional interactions.

The article that follows, "The next revolution in interactions," shows that the shift from transactional to tacit interactions requires companies to think differently about how to improve performance—and about their technology investments. Moreover, the rise of tacit occupations opens up the possibility that companies can again create capabilities and advantages that rivals can't easily duplicate.

Finally, "Mapping interactions by industry," a Web-exclusive series of interactive exhibits, examines the way tacit workers are deployed. In some industries, for instance, they create products and services, while in others they are concentrated largely in noncore areas such as administration, finance, and IT. In addition, each industry uses a different mix of tacit and transactional workers to manage its interactions with customers.

About the Authors

Scott Beardsley is a director in McKinsey's Brussels office, James Manyika is a principal in the San Francisco office, and Roger Roberts is a principal in the Silicon Valley office.

---

Like vinyl records and Volkswagen Beetles, sustainable competitive advantages are back in style—or will be as companies turn their attention to making their most talented, highly paid workers more productive. For the past 30 years, companies have boosted their labor productivity by reengineering, automating, or outsourcing production and clerical jobs. But any advantage in costs or distinctiveness that companies gained in this way was usually short lived, for their rivals adopted similar technologies and process improvements and thus quickly matched the leaders.

But advantages that companies gain by raising the productivity of their most valuable workers may well be more enduring, for their rivals will find these improvements much harder to copy. This kind of work is undertaken by, for example, managers, salespeople, and customer service reps, whose tasks are anything but routine. Such employees interact with other employees, customers, and suppliers and make complex decisions based on knowledge, judgment, experience, and instinct.

New McKinsey research reveals that these high-value decision makers are growing in number and importance throughout many companies. As businesses come to have more problem solvers and fewer doers in their ranks, the way they organize for business changes. So does the economics of labor: workers who undertake complex, interactive jobs typically command higher salaries, and their actions have a disproportionate impact on the ability of companies to woo customers, to compete, and to earn profits. Thus, the potential gains to be realized by making these employees more effective at what they do and by helping them to do it more cost effectively are huge—as is the downside of ignoring this trend.

But to improve these employees' labor performance, executives must put aside much of what they know about reengineering—and about managing technology, organizations, and talent to boost productivity. Technology can replace a checkout clerk at a supermarket but not a marketing manager. Machines can log deposits and dispense cash, but they can't choose an advertising campaign. Process cookbooks can show how to operate a modern warehouse but not what happens when managers band together to solve a crisis.

Machines can help managers make more decisions more effectively and quickly. The use of technology to complement and enhance what talented decision makers do rather than to replace them calls for a very different kind of thinking about the organizational structures that best facilitate their work, the mix of skills companies need, hiring and developing talent, and the way technology supports high-value labor. Technology and organizational strategies are inextricably conjoined in this new world of performance improvement.[1]

Raising the labor performance of professionals won't be easy, and it is uncertain whether any of the innovations and experiments that some pioneering companies are now undertaking will prove to be winning formulas. As in the early days of the Internet revolution, the direction is clear but the path isn't. That's the bad news—or, rather, the challenge (and opportunity) for innovators.

The good news concerns competitive advantage. As companies figure out how to raise the performance of their most valuable employees in a range of business activities, they will build distinctive capabilities based on a mix of talent and technology. Reducing these capabilities to a checklist of procedures and IT systems (which rivals would be able to copy) isn't going to be easy. Best practice thus won't become everyday practice quite as quickly as it has in recent years. Building sustainable advantages will again be possible—and, of course, worthwhile.

The interactions revolution

Today's most valuable workers undertake business activities that economists call "interactions": in the broadest sense, the searching, coordinating, and monitoring required to exchange goods or services. Recent studies—including landmark research McKinsey conducted in 1997[2]—show that specialization, globalization, and technology are making interactions far more pervasive in developed economies. As Adam Smith predicted, specialization tends to atomize work and to increase the need to interact. Outsourcing, like the boom in global operations and marketing, has dramatically increased the need to interact with vendors and partners. And communications technologies such as e-mail and instant messaging have made interaction easier and far less expensive.

The growth of interactions represents a broad shift in the nature of economic activity. At the turn of the last century, most nonagricultural labor in business involved extracting raw materials or converting them into finished goods. We call these activities transformational; the category covers more than just jobs in production.[3] By the turn of the 21st century, however, only 15 percent of US employees undertook transformational work such as mining coal, running heavy machinery, or operating production lines—in part because in a globalizing economy many such jobs are shifting from developed to developing nations. The rest of the workforce now consists of people who largely or wholly spend their time interacting.

Within the realm of interactions, another shift is in full swing as well, and it has dramatic implications for the way companies organize and compete. Eight years after McKinsey's 1997 study, the firm's new research on job trends in a number of sectors finds that companies are hiring more workers for complex than for less complex interactions. Recording a shipment of parts to a warehouse, for example, is a routine interaction; managing a supply chain is a complex one.

Complex interactions typically require people to deal with ambiguity—there are no rule books to follow—and to exercise high levels of judgment. These men and women (such as managers, salespeople, nurses, lawyers, judges, and mediators) must often draw on deep experience, which economists call "tacit knowledge." For the sake of clarity, we will therefore refer to the more complex interactions as tacit and to the more routine ones as transactional. Transactional interactions include not just clerical and accounting work, which companies have long been automating or eliminating, but also most of what IT specialists, auditors, biochemists, and many others do (see sidebar, "About the research").

Most jobs mix both kinds of activities—when managers fill out their expense reports, that's a transaction; leading workshops on corporate strategy with their direct reports is tacit work. But what counts in a job are its predominant and necessary activities, which determine its value added and compensation.

During the past six years, the number of US jobs that include tacit interactions as an essential component has been growing two and a half times faster than the number of transactional jobs and three times faster than employment in the entire national economy. To put it another way, 70 percent of all US jobs created since 1998—4.5 million, or roughly the combined US workforce of the 56 largest public companies by market capitalization—require judgment and experience. These jobs now make up 41 percent of the labor market in the United States (Exhibit 1). Indeed, most developed nations are experiencing this trend.

The balance is tipping toward complexity, in part because companies have been eliminating the least complex jobs by streamlining processes, outsourcing, and automating routine tasks. From 1998 to 2004, for example, insurance carriers, fund-management companies, and securities firms cut the number of transactional jobs on their books by 10 percent, 6.5 percent, and 2.7 percent a year, respectively. Likewise, a more automated check-in process at airports makes for smaller airline check-in staffs, automated replenishment systems reduce the need for supply chain bookkeepers, and outsourcing helps companies shed IT help desk workers. Manufacturers too have eliminated transactional jobs.
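
Those annual cuts compound. Here is a quick sketch of the arithmetic in Python, using the rates cited above; the six-year window (1998 to 2004) is the paragraph's, but treating each figure as a steady, compounding annual rate is my simplifying assumption:

    # Cumulative effect of the annual transactional-job cuts cited above,
    # assuming each rate applies steadily over the six years 1998-2004.
    for sector, annual_cut in [("insurance carriers", 0.10),
                               ("fund management", 0.065),
                               ("securities firms", 0.027)]:
        remaining = (1 - annual_cut) ** 6
        print(f"{sector}: about {remaining:.0%} of 1998 transactional jobs remain")
    # insurance carriers: about 53% of 1998 transactional jobs remain
    # fund management: about 67% of 1998 transactional jobs remain
    # securities firms: about 85% of 1998 transactional jobs remain

Even the mildest of those rates removes roughly a seventh of the jobs over the period; the steepest removes nearly half.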

Meanwhile, the number of jobs involving more complex interactions among skilled and educated workers who make decisions is growing at a phenomenal rate. Salaries reflect the value that companies place on these jobs, which pay 55 and 75 percent more, respectively, than those of employees who undertake routine transactions and transformations.

Demand for tacit workers varies among sectors, of course. The jobs of most employees in air transportation, retailing, utilities, and recreation are transactional. Tacit jobs dominate fields such as health care and many financial-services and software segments (Exhibit 2). But all sectors employ tacit workers, and demand for them is growing; most companies, for example, have an acute need for savvy frontline managers.

The demand for tacit employees and the high cost of employing them are a clear call to arms. Companies need to make this part of the workforce more productive, just as they have already raised the productivity of transactional and manufacturing labor. Unproductive tacit employees will be an increasingly costly disadvantage.

The point isn't how many tacit interactions occur in a company; what matters is whether they add value. This shift toward tacit interactions upends everything we know about organizations. Since the days of Alfred Sloan, corporations have resembled pyramids, with a limited number of tacit employees (managers) on top coordinating a broad span of workers engaged in production and transactional labor. Hierarchical structures and strict performance metrics that tabulate inputs and outputs therefore lie at the heart of most organizations today.

But the rise of the tacit workforce and the decline of the transformational and transactional ones demand new thinking about the organizational structures that could help companies make the best use of this shifting blend of talent. There is no road map to show them how to do so. Over time, innovations and experiments to raise the productivity of tacit employees (for instance, by helping them collaborate more effectively inside and outside their companies) and innovations involving loosely coupled teams will suggest new organizational structures.

The two critical changes that executives must take into account as they explore how to make tacit employees more productive are already clear, however. First, the way companies deploy technology to improve the performance of the tacit workforce is very different from the way they have used it to streamline transactions or improve manufacturing. Machines can't recognize uncodified patterns, solve novel problems, or sense emotional responses and react appropriately; that is, they can't substitute for tacit labor as they did for transactional labor. Instead machines will have to make tacit employees better at their jobs by complementing and extending their tacit capabilities and activities.

Second, a look back at what it took to raise labor productivity over the past ten years shows that the overall performance of sectors improves when the companies in them adopt one another's managerial best practices, usually involving technology. In retailing, for instance, Wal-Mart Stores was a pioneer in automating a number of formerly manual transactional activities, such as tracking goods, trading information with suppliers, and forecasting demand. During the 1990s, most other general-merchandise retailers adopted Wal-Mart's innovations, boosting labor productivity throughout the sector.[4]

But in the world of tacit work, it's less likely that companies will succeed in adopting best practices quite so readily. Capabilities founded on talented people who make smarter decisions about how to deploy tangible and intangible assets can't be coded in software and process diagrams and then disseminated throughout a sector.

Tacit technology

Companies have three ways of using technology to enhance and extend the work of tacit labor. First, and most obviously, they can use it to eliminate low-value-added transactional activities that keep employees from undertaking higher-value work. Pharmacies, for example, are using robots to fill prescriptions in an effort to maximize the amount of time pharmacists can interact with their customers. Meanwhile, The Home Depot is trying out automated self-checkout counters in some stores. The retailer isn't just automating and eliminating transactional tasks; its chairman and CEO, Robert Nardelli, believes that automated counters can reduce by as much as 40 percent the time customers spend waiting at cash registers. Just as important, the new counters mean that people who used to operate the old manual ones can be deployed in store aisles as sales staff—a much higher-value use of time.

Furthermore, technology can allocate activities more efficiently between tacit and transactional workers. At some companies, for example, technology support—traditionally, tacit work undertaken by staff experts on PCs and networks—has been split into tacit and transactional roles. Transactional workers armed with scripts and some automated tools handle the IT problems of business users; only when no easy solution can be found is a tacit employee brought in.
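
A minimal sketch of that routing logic, in Python; the ticket types and the script catalog are invented for illustration and are not drawn from any company in the article:

    # Route a support ticket: scripted (transactional) fixes are tried first;
    # escalation to a staff expert (tacit work) happens only when no rule applies.
    SCRIPTED_FIXES = {
        "password reset": "walk the user through self-service reset",
        "printer offline": "restart the print spooler per the runbook",
    }

    def route_ticket(issue: str) -> str:
        fix = SCRIPTED_FIXES.get(issue.lower())
        if fix is not None:
            return f"transactional tier: {fix}"
        # No scripted resolution: ambiguity and judgment required.
        return "escalate to tacit tier (staff expert)"

    print(route_ticket("Password reset"))                  # handled by script
    print(route_ticket("intermittent VPN drops under load"))  # escalated

The design choice mirrors the article's division of labor: everything rule-based stays cheap and scripted, and expensive judgment is spent only on the residue that the rules cannot handle.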

Second, technology makes it possible to boost the quality, speed, and scalability of the decisions employees make. IT, for instance, can give them easier access to filtered and structured information, thereby helping to prevent such time wasters as volumes of unproductive e-mail. Useful databases could, say, provide details about the performance of offshore suppliers or expanded lists of experts in a given field. Technology tools can also help employees to identify key trends, such as the buying behavior of a customer segment, quickly and accurately.

Kaiser Permanente is one of the organizations now pioneering the use of such technologies to improve the quality of complex interactions. The health care provider has developed not only unified digital records on its patients but also innovative decision-support tools, such as programs that track the schedules of caregivers for patients with diabetes and heart disease. Although it is hard to determine quantitatively whether physicians are making better judgments about medical care, data suggest that Kaiser has cut its patients' mortality rate for heart disease to levels well below the US national average.

Finally, new and emerging technologies will let companies extend the breadth and impact of tacit interactions. Loosely coupled systems are more likely than hard-coded systems and connections to be adapted successfully to the highly dynamic work of tacit employees. This point will be particularly critical, since tacit interactions will occur as much within companies as across them.[5] Broadband connectivity and novel applications (including collaborative software, multiple-source videoconferencing, and IP telephony) can facilitate, speed up, and progressively cut the cost of such interactions as collaboration among communities of interest and consensus building across great distances. Companies might then involve greater numbers of workers in these activities, reach rural consumers and suppliers more effectively, and connect with networks of people and specialized talent around the world.[6]

Competitive advantage redux

Technology itself can't improve patient care or customer service or make better strategic decisions. It does help talented workers to achieve these ends, but so, for example, do organizational models that motivate tacit employees and help them spot and act on ideas. These kinds of models usually involve environments that encourage tacit employees to explore new ideas, to operate in a less hierarchical (that is, more team-oriented and unstructured) way, and to organize themselves for work. Most of today's organizational models, by contrast, aim to maximize the performance of transactional or transformational workers. Tacit models are new territory.

As a result, it won't be easy for companies to identify and develop distinctive new capabilities that make the best use of tacit interactions—new ways to speed innovations to market, to make sales channels more effective, or to divine customer needs, for instance. But at least such capabilities will also be difficult for competitors to duplicate. Best practices will be hard to transplant from one company to another if they are based on talented people supported by unique organizational and leadership models and armed with a panoply of complementary technologies. If it becomes harder for performance innovations to spread through a sector and thereby to boost the performance of all players, it will once again be possible to build operating-cost advantages and distinctive capabilities sustainable for more than a brief moment.

During the past few years, advantages related to costs and distinctiveness have rarely lasted for long: they eroded quickly when companies built them from innovations in the handling of what are essentially transactional interactions. E*Trade Financial, for instance, gained tactical advantages by optimizing transactional activities to create more efficient and less expensive ways of making trades but then watched its unique position evaporate when other discount brokers and financial advisers embraced the new technology and cut their trading fees. Cheap trades were no longer a sufficient point of differentiation.

By contrast, advantages built on tacit interactions might stand. A company could, for example, focus on improving the tacit interactions among its marketing and product-development staff, customers, and suppliers to better discern what customers want and then to provide them with more effective value-added products and services. That approach would create a formidable competitive capability—and it is difficult to see how any rival could easily implement the same mix of tacit interactions within its organization and throughout its value chain.



Looking forward

As companies explore how to expand the potential of their most valuable employees, they face more than a few challenges. For one thing, they will have to understand what profile of interactions—transactional and tacit—is critical to their business success and to allocate investments for improving the performance of each. Some companies will have to redeploy talent from transactional to tacit activities, as Home Depot did. Others, following the example of companies such as Toyota Motor and Cisco Systems, may find it necessary to redeploy their available tacit capacity to transformational and transactional activities, thus bringing a new level of problem solving to many kinds of transformational jobs. At the same time, it will be necessary to guard against becoming overly reliant on a few star tacit employees and to manage critical tacit or transactional activities undertaken by partners or vendors.

On the human-resources side, companies will need a better understanding of how they can hire, develop, and manage for tacit skills rather than transactional ones—something that will increasingly determine their ability to grow. Certain organizations must therefore learn to develop their tacit skills internally, perhaps through apprenticeship programs, or to provide the right set of opportunities so that their employees can become more seasoned and knowledgeable. What's more, performance is more complex to measure and reward when tacit employees collaborate to achieve results. How, after all, do you measure the interactions of managers?[7]

Companies will also have to think differently about the way they prioritize their investments in technology. On the whole, such investments are now intended largely to boost the performance of transformational activities—manufacturing, construction, and so on—or of transactional ones. Companies invest far less to support tacit tasks (Exhibit 3).

So they must shift more of their IT dollars to tacit tools, even while they still try to get whatever additional (though declining) improvements can be had, in particular, from streamlining transactions. The performance spread[8] between the most and least productive manufacturing companies is relatively narrow. The spread widens in transaction-based sectors—meaning that investments to improve performance in this area still make sense. But the variability of company-level performance is more than 50 percent greater in tacit-based sectors than in manufacturing-based ones (Exhibit 4). Tacit activities are now a green pasture for improvement.

The next wave of performance improvements—to raise the effectiveness of tacit workers—will be far more difficult than the improvement efforts of the past. But companies that can innovate to make their complex, higher-value business activities deliver what their customers care about most will probably gain significant (and not easily duplicated) advantages in distinctiveness, quality, and cost.

About the research

We looked at the range of business activities involved in more than 800 occupations in the United States. Building on McKinsey's 1997 study, we placed every job in one of three categories: transformational (extracting raw materials or converting them into finished goods), transactional (interactions that unfold in a generally rule-based manner and can thus be scripted or automated), and tacit (more complex interactions requiring a higher level of judgment, involving ambiguity, and drawing on tacit, or experiential, knowledge). While any kind of work clearly involves activities in all three of our categories, we placed each job by determining its predominant activity. This occupational segmentation allowed us to develop a macroeconomic view of employment and wage shifts and to isolate trends in tacit interactions. We cross-checked the results with the 1997 activity-level analysis and with other economists' findings on interactions.

Then we linked the occupational analysis to the US government's industry classifications and quantified the mix of tacit, transactional, and transformational activities within and across industries. In addition, we used data from the International Labour Organization, the World Bank, and other sources to analyze these trends on a global basis. Finally, interviews with economists and with functional and industry experts throughout McKinsey helped us to identify and understand the key enablers of tacit and transactional interactions in today's companies.
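
To make the predominant-activity rule concrete, here is a minimal sketch of the segmentation described above; the occupations and the activity shares are invented for illustration and are not McKinsey's data:

    from dataclasses import dataclass

    @dataclass
    class Occupation:
        name: str
        # Estimated share of working time spent on each activity type;
        # the three shares should sum to roughly 1.0 (illustrative values).
        transformational: float
        transactional: float
        tacit: float

        def category(self) -> str:
            """Place the job in the category of its predominant activity."""
            shares = {
                "transformational": self.transformational,
                "transactional": self.transactional,
                "tacit": self.tacit,
            }
            return max(shares, key=shares.get)

    jobs = [
        Occupation("production-line operator", 0.70, 0.20, 0.10),
        Occupation("payroll clerk", 0.05, 0.80, 0.15),
        Occupation("frontline manager", 0.05, 0.25, 0.70),
    ]

    for job in jobs:
        print(f"{job.name}: {job.category()}")
    # production-line operator: transformational
    # payroll clerk: transactional
    # frontline manager: tacit

As in the study, a job that mixes all three kinds of activity still gets exactly one label, which is what makes the macro-level employment and wage counts possible.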

About the Authors

Brad Johnson is an associate principal in McKinsey's Silicon Valley office, and James Manyika is a principal in the San Francisco office, where Lareina Yee is a consultant.

The authors wish to acknowledge the contributions of their colleagues Scott Beardsley, Lowell Bryan, Luis Enriquez, Dan Ewing, Diana Farrell, Sumit Gupta, Lenny Mendonca, Navin Ramachandran, and Roger Roberts, as well as of Douglas Frosst, John Hagel, and Hal Varian.

Notes

[1] Lowell L. Bryan and Claudia Joyce, "The 21st-century organization," The McKinsey Quarterly, 2005 Number 3, pp. 24–33; and Lowell L. Bryan, "Getting bigger," The McKinsey Quarterly, 2005 Number 3, pp. 4–5.

[2] Patrick Butler, Ted W. Hall, Alistair M. Hanna, Lenny Mendonca, Byron Auguste, James Manyika, and Anupam Sahay, "A revolution in interaction," The McKinsey Quarterly, 1997 Number 1, pp. 4–23.

[3] Douglass C. North, "Institutions, Transaction Costs, and Productivity in the Long Run," Washington University in St. Louis economics working paper, economic history series, number 9309004, September 1993; Douglass C. North, "Transaction Costs Through Time," Washington University in St. Louis economics working paper, economic history series, number 9411006, November 1994; and Douglass C. North, "Institutions and Productivity in History," Washington University in St. Louis economics working paper, economic history series, number 9411003, November 1994. All are available online.

[4] Brad Johnson, James Manyika, and Lenny Mendonca, US Productivity Growth 1995–2000: Understanding the Contributions of Information Technology Relative to Other Factors, McKinsey Global Institute, October 2001; Diana Farrell, Terra Terwilliger, and Allen P. Webb, "Getting IT spending right this time," The McKinsey Quarterly, 2003 Number 2, pp. 118–29; and Diana Farrell, "The real new economy," Harvard Business Review, October 2003, Volume 81, Number 10, pp. 104–12.

[5] John Seely Brown and John Hagel III, "Flexible IT, better strategy," The McKinsey Quarterly, 2003 Number 4, pp. 50–9.

[6] Scott Beardsley, Luis Enriquez, Carsten Kipping, and Ingo Beyer von Morgenstern, "Telecommunications sector reform—A prerequisite for networked readiness," Global Information Technology Report 2001–2002: Readiness for the Networked World, World Economic Forum, Oxford University Press, June 2002, pp. 118–37.

[7] Lowell L. Bryan, "Making a market in knowledge," The McKinsey Quarterly, 2004 Number 3, pp. 100–11.

[8] As measured by revenue or EBITDA (earnings before interest, taxes, depreciation, and amortization) per employee.

Thursday, December 01, 2005

Vanity: thy name is golden.

BATTLE FOR THE FACE OF CHINA

L'Oreal, Shiseido, Estee Lauder--the world's leading cosmetics companies are vying for a piece of a booming market.

SHERIDAN PRASSO
12 December 2005
Fortune
© 2005 Time Incorporated.

At the Pacific department store in Shanghai, lanky 24-year-old makeup stylist Jin Jia is all hustle and flow. Dressed in black, safety pins stuck through his shirt, he saunters suggestively around the MAC cosmetics counter with mascara wand and powder brush to the hip-hop thump of Eminem's "Lose Yourself." The music is loud, simultaneously luring and intimidating, and the young women in the store are intrigued. They want this experience of glam, in-your-face modernity. They may not know that MAC is owned by Estee Lauder, but they know it is ku, or cool. And most of the women who sit for Jin's makeovers buy something--smoky eyeshadow for $17.50, foundation cream for $40. No, not cheap. Chic. This is the Shanghai of the roaring 2000s, three generations after it was the fashion capital of the Orient, the Paris of the East.

There aren't many Shanghainese around anymore who remember those precommunist days, when women slit their qipao dresses up their thighs and painted their lips lantern red. Their great-grandchildren are starting from scratch.

Across the aisle a clinician at Shiseido's Aupres counter is demonstrating an herb-infused face cream developed especially for Chinese skin at Shiseido's R&D center in Beijing. Dressed in medical white, she rubs Eternal Total Recharge's creamy silk onto the back of a customer's hand and tries to explain, over the music, its effectiveness in creating skin that is "dewy soft." (The literature says something about fibroblasts and collagen and resilience from an emulsion of Chinese asparagus-tuber extract.) The appeal works for the women who have bypassed the MAC counter. Here they can take refuge in a more familiar tradition, rooted in centuries of Chinese aesthetics that equate beauty with ivory skin.

Modernity and tradition--it is a raging battle among global beauty giants vying to win the face of Chinese women. There's French giant L'Oreal pitted against Japan's Shiseido, both of which are being challenged by U.S. leader Estee Lauder and a handful of Chinese companies that draw upon the desire for traditional skin beautifiers by putting herbs and animal proteins into their products. Their ads line every block of Shanghai's Nanjing Road, and almost every weekend brings a promotion for a new makeup line, with dancing girls and free makeovers, outside Shanghai department stores. Dior is in the race too. And the American companies Amway and Avon, which got hammered by a 1998 government edict prohibiting door-to-door sales that is supposed to be lifted, albeit with restrictions, by the end of the year. Sephora, the cosmetics arm of LVMH, opened its first store in Shanghai in April and plans 100 in China by 2010.

And those are just the Davids. The Goliath is Procter & Gamble, which has long held the leadership position in the broadly defined cosmetics and toiletries industry because its Olay whitening skin creams outsell every other brand, and its shampoos (Rejoice, Head & Shoulders, and Pantene) hold the top three hair-care positions in China. In March, P&G plunged into the cosmetics competition, too, launching Cover Girl and Max Factor to compete with L'Oreal's Maybelline. Along with its high-end SK-II skin-care line and its other household and food brands, P&G hauls in more than $2 billion in annual sales in China--roughly 70% of that from hair and skin products, compared with a fifty-fifty split in other countries. "We are the beauty company of the P&G company," says Daniela Riccardi, P&G's president for Greater China. "Nowhere else is beauty such an important part of the business."

All that may sound surprising, because China is a country where barely any women used cosmetics a decade ago. When the authorities stopped discouraging lipstick and other bourgeois displays of beauty in the early 1990s, Chinese women were eager to embrace the aesthetics of the modern world. As a result, the beauty industry has been growing at breathtaking speed--doubling since 1998 into a $7.9 billion market that is expected to climb to $9.6 billion by 2009, according to market research firm Access Asia. Some 90 million urban women in China spend 10% or more of their income on face cream, lipstick, mascara, and the like, particularly in fashionable Shanghai, where women spend 50 times more per capita on cosmetics than women nationwide. The makeup component of the beauty market in China is forecast to have sales of $524 million this year, rising to $705 million by 2009, Access Asia predicts. And early entrants who have had time to build market share have seen results: L'Oreal's 2004 sales of $350 million (including makeup and other beauty products) were up 58% over the previous year; Shiseido's China sales last year were $204 million, up 27%.

The beauty business is considered a bellwether of the overall consumer products market, reflecting desire and discretionary income rather than essential day-to-day living. And it's one of China's most dynamic markets, spurred by rising incomes and increasing spending power. "It's the only big market growing this fast in China," says Jacques Penhirin, a McKinsey partner who tracks retailing. "This is a market that was almost nonexistent 15 years ago, and about 70% of it remains to be developed."

Chinese tradition means that just about every woman there equates beauty with fine skin. Poems from the Tang dynasty and even a millennium earlier describe beautiful women as "jade white" and "creamy tinted." China's ancient tales have mothers giving their daughters pots of face cream to take on long journeys. Today even urban grandmothers still drink a concoction of ground pearls mixed in water in the belief that it keeps the skin white. "Chinese people ask for even whiter tone than what is selling well in Japan," says Tadakatsu Saito, chairman of China operations for Shiseido, which has the most experience of all the multinationals with whitening because of the huge market in Japan. "When we try to sell them their exact color, they say, 'Too dark. Do you have anything lighter, brighter?' "

The modern desire for whitening is sometimes mistaken as a desire to take on the trappings of Western beauty aesthetics in a country where popular surgeries for eye widening, breast implants, and a more prominent nose bridge are exactly that. Commercials advertising such cosmetic procedures are ubiquitous on Chinese TV, and it's the rare Shanghai taxi without an ad for them over the seat back. But the value of clear white skin is deeply rooted: Herbal recipes for keeping courtesans' faces translucent can be found in ancient imperial records. Women in China who can afford it think little of spending at least as much money on facial moisturizer as on clothes. That would be unusual in the U.S., where 87% of women spend less than $20 when they buy skin cream.

But at Plaza 66, an upscale mall in Shanghai, fashionable Shanghainese women toting designer purses and Starbucks coffee cups pause all day long to buy expensive moisturizer at the counter of La Mer, another Estee Lauder brand. One of them, Qiao Hong, 42, is picking out a jar of face cream ($287.50) and eye-lifting serum ($418.75) that she plans to send her husband back to purchase later in the day. She's a university professor. Her husband owns a garment factory. While they are well off, $700 is still a lot of money to spend on potions. Yet the counter girls report that Qiao isn't unusual: A typical day finds 20 to 25 women doing the same. "It's necessary for a woman to treat herself well, and my husband also agrees that it's important for a lady to look good," Qiao says. "It's really worth the money. With money, you can just make more of it, but your skin--if you lose your beauty and youth, you cannot get it back." La Mer's counter has 32 other women on a waiting list to purchase a 500-milliliter jar of face cream. Cost: $1,750. Shiseido finds similar demand for its Cle de Peau line, which, at $500 for 30 grams, is more expensive than gold. On any given Saturday afternoon, the comfy chairs at the Cle de Peau sections of department stores are filled with buyers and with women waiting to take their places.

L'Oreal's new laboratory in Pudong is a 32,000-square-foot facility stocked with pigments, waxes, and oils. Didier Saint-Leger, a biochemist, oversees the microscopes and chromometers that measure the effectiveness of skin-whitening creams, the ovens that heat emulsions to test stability, and the two-way mirrors that enable him to observe the way Chinese women apply face creams and makeup. The center opened in September with 43 Chinese researchers, most of them chemists. Next year, when L'Oreal completes construction of Phase II, currently an empty field at the back of the lab, there will be 75. Chinese herbs, roots, and flowers will be tested there, distilled and researched for their impact on skin and hair. Hua jiao, the flower of the prickly ash tree that adds tongue-scorching spice to Sichuan cuisine, is reputed to clear up acne and will be among them, as will traditional whitening agents such as ginkgo leaf, ginseng, and mulberry.

"We cannot separate the beauty and the culture," Saint-Leger says. "I am trying not to think like a Caucasian. Basically, human beings, 95% of the world, have dark hair and eyes. Caucasians could be considered mutants." (Saint-Leger also hopes to unlock such secrets as why Chinese women's skin wrinkles, on average, at a rate ten years behind that of French women.)

The R&D center is part of L'Oreal's transition from the image it currently projects in China--its Chinese name, Oulaiya, means "elegance coming from Europe," and its ads feature pinkish colors on white faces--to something more recognizably Chinese. Its recent acquisitions of Yue Sai, for a decade the most popular cosmetics brand in China, and the low-end skin-care line Mininurse are steps in that direction. But Shiseido is already ahead in the traditional-formulations game, with its "Chinese national brand" Aupres and its recently expanded three-year-old Beijing R&D center. It has already launched its first product (Eternal Total Recharge) and has more on the way.

Not to be left out, Estee Lauder opened its own 15-scientist Innovation Institute in Pudong in November, stacked with pigments with names such as "gleamer flake" and "magic mauve." As a latecomer, Estee Lauder has pursued a strategy of building a finally-this-glamorous-American-brand-is-available-in-China buzz for its prestige lines long before they go on sale. It also gives away cosmetics to China's leading makeup artists to encourage them to experiment on models for shows and movies. "We have been watching and feeling the pulse of the China market for a long time," says Carol Chen, Lauder's Taiwan-born general manager and the only ethnic-Chinese woman heading a major beauty company in China. "The market just wasn't that ready before."

Now, clearly, it is. "China changes so fast," Chen says, looking out her Shanghai office window high above the haze-covered city. "You blink, and the market is there." Chen used the same strategy to launch the Clinique and Estee Lauder lines, advertising in the Chinese edition of Elle years before they were for sale in China. She was based in Hong Kong at the time and understood the importance of China's jet-setting sophisticates, who read fashion magazines and make shopping trips to Hong Kong, Europe, and the U.S., then set trends when they get home. Being Chinese, she says, gives her an advantage over Europeans and Japanese. "They also have a deep understanding of the markets," says Chen. "But for me, it's faster."

Selling to the masses in China is something else entirely. Rural China is still largely unpenetrated by beauty products, and with retailing and distribution still badly managed in China's hinterlands, the companies that have the best strategies for reaching the women there, rather than the minority who shop for imports at department-store counters, ultimately will win the largest market share. "There is a huge opportunity to make the whole cake bigger," says Austin Lally, P&G's vice president in China. "In hair care there's potential to quintuple the size of the category." That means getting rural women who wash their hair once a week to wash daily. For those who use only shampoo, the next step is to get them to buy conditioner and ultimately coloring, gel, and mousse. For women who use only cleanser and skin moisturizer, there are toners, exfoliants, and facial masks. "Women in Japan and Korea use seven or eight steps for skin care," says Haw Diann Wai, Estee Lauder's Malaysian-Chinese product-development manager. "We're not that sophisticated yet."

Paolo Gasparrini, president of L'Oreal China, is leaning over a coffee table drawing a pyramid. He is explaining how he is trying to reach the consumers at the bottom, where household incomes average $37 to $50 a month. The small-store distribution channels he acquired with Mininurse--which has been reformulated and priced at less than $2--allow L'Oreal to reach into 280,000 stores nationwide, giving the company the potential to launch shampoos through those channels as well. The number of Chinese households with incomes above $625 a month is expected to reach 22.4 million by 2008, so there's lots of potential to expand L'Oreal's mid-range lines through China's mushrooming network of hypermarkets.

L'Oreal may also begin introducing products differentiated by region--heavier creams for China's cold northern climes and lighter ones for the tropical south--as well as by skin tone. "Segmentation is something that's becoming more and more important," says Gasparrini. "There's still huge space in the market to take." L'Oreal, like all beauty companies in China, finds there also is space to educate Chinese consumers, who know little about applying cosmetics or why they need exfoliants. "In other countries women learn how to use cosmetics from the mom," says Gasparrini. "That's not the case in China. We have to substitute the mom."

It may be Shiseido, however, that has the most comprehensive outreach strategy with its local brands--endeavoring to reach Chinese women in the thousands of smaller cities who have not had access to cosmetics for decades, if ever. Shiseido has built a chain of 25,000 stores in Japan, a country one-25th China's size. Last year it started doing the same thing in China. Apart from the 400 Chinese department stores selling Shiseido brands in major cities, the company is rolling out what it calls "voluntary chain stores" across China's hinterlands to reach lower-end consumers. It finds mom-and-pop retailers willing to allow Shiseido to install a counter, put up a Shiseido sign, and hire full-time staff. Shiseido takes care of training and checks on the store's progress weekly. The company is selling hundreds of products it has developed and manufactured through two joint ventures in China, under brands with names such as Za, Uno, and Fitit. After making many of the products available in 4,172 small retail outlets in China, Shiseido opened 700 freestanding stores in the past year and plans 5,000 by 2008.

"We are the best at this system," Shiseido's president and CEO Shinzo Maeda boasts from a comfortable, cream leather chair at Shiseido headquarters in the Ginza section of Tokyo. Ultimately, Maeda says, Shiseido expects to double its annual China sales, on operating margins of more than 20%, and make China contribute one- fourth of Shiseido's overall global revenue. "Ten years ago we thought that 1% of Chinese women would be Shiseido customers," says Masaru Miyagawa, president and CEO of Shiseido China. "Now we think that 10% of Chinese women will be Shiseido customers."

There are Chinese companies in the beauty game as well, and being Chinese, they are best poised to play the tradition card. One of those is Shanghai Herborist Cosmetics, a division of state-owned Jahwa. "We're the Body Shop of China," says Herborist's fashionable brand manager, Lily Xu. Herborist makes 130 products for use from head to toe; one of its most popular is a "whitening revitalizing mask," which uses seven herbs said to lighten skin tone in 15 minutes, producing faces "as white as a lotus seed." Herborist now has 180 freestanding boutiques in 40 cities in China and plans 100 more by next year. Xu is excited about a recent agreement with Sephora to sell Herborist products in its China stores and ongoing discussions with the company to sell Herborist abroad. "Back to nature is the cosmetics and skin-care trend in fashion now," says Xu. "Once Origins or L'Occitane come to China, then we'll have real competition. But we'll still be able to distinguish ourselves as coming from Chinese tradition and Chinese medicine."

It's all about mixing tradition and modernity to reach Chinese consumers, says Yue Sai Kan, a TV celebrity in China who founded the cosmetics line acquired by L'Oreal. Sitting on a plush sofa in her New York City townhouse, where she spends her time when not working on a new show in Shanghai, she opens a book of paintings from the Tang dynasty, when China was ruled by an empress, Wu Zetian. She points out the red lips and painted brows in the depictions of women who lived more than 1,000 years ago. "See, Chinese women have always used cosmetics," she says. "In the Tang dynasty they used as many steps of makeup as we do today. Chinese women had been discouraged from using cosmetics for 35 years. It was a world of darkness, of no color. Now it is changing. What you have to do is give them international, yes, but Chinese have a lot of pride in themselves and their traditions. The best thing you can give them is a belief in themselves."

And 37 shades of red to choose from.

Someone who could do...and teach.

The Education of Andy Grove

A Harvard historian explains how Intel's legendary chief became the best model we have for leading a business in the 21st century.

AN ESSAY BY RICHARD S. TEDLOW

12 December 2005
Fortune
© 2005 Time Incorporated.

In 1991, an instructor at Stanford's Graduate School of Business presented his class with a case study. It went like this: A CEO was scheduled to address a major industry gathering, and he could give one of three speeches. The first would publicly commit his company to incorporating a sexy, sophisticated new technology in its products. The second speech would reaffirm the company's commitment to developing its existing technology. The third speech would do neither, leaving the decision to "the market." The stakes were enormous: A wrong decision could well ruin the business. What should the CEO do? The question was more than academic, because the CEO described in the case was also the man at the front of the classroom. Dr. Andrew S. Grove, like professor Indiana Jones, was better known for his exploits as "Andy," the famous leader of Intel Corp. But unlike Indy, Grove wasn't simply biding time here between adventures.

His question was meant not just to challenge students' thinking but to advance his own. That big speech was three weeks away, and Grove had yet to make up his mind. He didn't know the answer.

It's not common for any CEO to stand before an audience and say, "I don't know what to do. What do you think?" It's even less common for that CEO to listen to the responses and take them seriously. But Grove, 69, has never lost track of the truth: that Intel has always been one wrong answer away from disaster--and that a closed mind is a trap door to the abyss.

Grove and Intel are now embedded so deeply inside our minds, our computers, and our culture--the man has been on 77 magazine covers, by one count--that with hindsight, their success seems foreordained. But the opposite is the case: By all odds, Intel should have failed. It should have been destroyed by the same brutal international competition that has killed apparel companies, tire companies, and television companies, or fallen into obscurity like Zilog and other successful chipmakers. Intel, too, should have stumbled on the terrifying treadmill of Moore's Law, which requires betting billions upon billions of dollars on ever more costly factories to make chips you're still developing for customers who've yet to demand them. It should have been eclipsed by an upstart competitor with a better mousetrap. Intel's success should never have happened--it was an anomaly, an outlier, a freak.

That's why Grove had chosen himself as the day's case study in the class he was teaching with professor Robert Burgelman, his longtime collaborator and the author of Strategy Is Destiny. In business you often don't see the cliff until you've already walked over it. Visibility on the ground is bad, and the roadmap--well, that can't be trusted either. To spot the next cliff, Andy Grove was willing to let go of his instincts--since they could be wrong--and view himself as a student might: from outside, peering down with the wide-angle, disinterested perspective of the observer. Did the man below seem aware of his surroundings? Was he choosing the correct path? Was there a 1,000-foot drop ahead?

Normally, our society observes a division of labor. Musicians don't critique, and critics don't compose. Quarterbacks decide on Sunday, and fans deride on Monday. It is the singular ability to inhabit both roles at once--subject and object, actor and audience, master and student--that sets Grove apart. And it's why, for everything that has been written by and about him, we have yet to appreciate his biggest legacy. Andy Grove is America's greatest student and teacher of business.

By analyzing the decisions he made on the road to becoming a great leader, you can learn to hone your own leadership skills. Because there's no gain in being able to recruit great employees, handle a board, dazzle Wall Street, or rally your cavalry for a glorious charge at dawn's early light if you haven't figured out which way to point the horses.

Grove's output as a teacher of management has been prodigious. He has taught from the lectern, in the op-ed pages, in his famous (sometimes feared) one-on-one sessions, and with his books, including 1983's High Output Management and 1996's Only the Paranoid Survive, whose title entered the lexicon along with its phrase "strategic inflection point," which Grove defines as "a time in the life of a business when its fundamentals are about to change." His teaching would have been an impressive career in itself. Yet it is one thing to search for truth in the ivory tower and quite another to take those lessons, however wrenching, and apply them to a living, breathing business like Intel. Grove's most powerful lessons have been in the doing.

What can others learn from Grove's odyssey? As we face a future where change is not only constant but accelerating, reality will transform itself more swiftly than most humans--or most companies--are hard-wired to handle. Even startups that overturn one reality are easily overturned by the next big change. Grove has escaped natural selection by doing the evolving himself. Forcibly adapting himself to a succession of new realities, he has left a trail of discarded assumptions in his wake. When reality has changed, he has found the will to let go and embrace the new.

It's a performance as remarkable as his life story. There will not be another CEO who survived both the Nazis and the communists before becoming a naturalized capitalist. And yet Grove is the best model we've got for doing business in the 21st century. If you hope to thrive in an environment of rapid change, it is this outlier--his strengths forged in a distant and vanished world--that you should follow. Begin your lesson in leadership the same way Andy Grove attacks a problem: by setting aside everything you know.

As a historian whose subjects have been, until now, no longer living, I found it a jolt to face a very alive Andy Grove. When he gets to a particularly intense point in a conversation, Grove leans forward and fixes you directly with his eyes, which are a startling blue. "That is not the right question," he will say, briefly taking over the duties of the interviewer. It's not personal. It's about an invisible third party: the truth. The truth is so precious and so hard to coax into view--surrounded by its bodyguard of politics and half-truths--that there is simply no time for fuzzy thinking. There are moments when you can almost experience firsthand the flow of self that went into Intel. And Grove's state-of-the-art memory can transport you from the deck of his home--where a commanding view of Silicon Valley spreads out at his feet--to vivid places in time. Like the day not long after the Stanford case study when Intel executives Craig Kinnie and Dennis Carter arrived in his cubicle to confront him.

In the run-up to the speech about technology choices, Grove had uncharacteristically wavered. He'd told Stanford's Burgelman that he was inclined to stick with Intel's mainstay chip technology known as CISC (for complex instruction set computing--don't ask). But when Intel published its annual report, the cover included a new, fashion-forward RISC chip (for reduced instruction set computing). Engineers across the industry were enamored of RISC because of its elegance: It required fewer transistors to accomplish most computing tasks. Grove had even appeared in an Intel rap video to promote RISC.

But Kinnie and Carter had trained at the Grove school of management--Grove's MO as a leader has always been to depend on "helpful Cassandras" to make sure that he doesn't win an argument he ought to lose. The two were blunt. "Andy, you can't do this," Carter said. Abandoning CISC for RISC, they argued, would truncate one of the most profitable franchises in business history for ... what? Leveling the playing field for Intel's competition? When the discussion ended, Kinnie and Carter had achieved a feat of monumental difficulty. They'd won an argument with Andy Grove.

Grove has been grateful to them ever since. He looks back at this episode with anger--at himself. "We almost wrecked the company," he told me. "We had established our technology as the industry standard. This franchise was worth millions, billions. We ... I ... almost walked away from it because the elegance of a new product seduced me into taking my eye off the market." The sun is shining, the view is stunning, and Andy Grove is berating himself for a mistake he didn't make a decade and a half ago. It's a measure of the demanding life he has lived--a life that, at critical junctures, has hung on Grove's ability to transform himself, to move from role to role as the moment required.

The Early Adapter

To be born a Hungarian Jew in 1936 was to be born on the wrong side of history. Grove was forced to adapt to a succession of threatening realities from the very beginning.

Transformations were the story of Grove's young life. When the Nazis invaded Hungary in 1944, his mother changed his name from Andras Grof to the Slavic Andras Malesevics. When the communists arrived the following year, he once again became Andras Grof. As a young man, he switched from journalism to chemistry after publishers started rejecting his articles for political reasons.

Communism nauseated him. One of his most vivid recollections is the May Day parade of 1950. Cheering was broadcast from loudspeakers around Budapest. But when Andy and his schoolmates arrived at Heroes' Square, they discovered there was no crowd at all: The cheering was recorded. Six years later, when the Hungarian Revolution briefly opened the border with Austria, Grove faced an immediate and unanticipated decision. He had never been outside Hungary. An only child, he would be leaving parents he might never see again. He had little idea of what he'd be running to. If ever there was a plunge into the unknown, that was it.

He arrived in the U.S. on Jan. 7, 1957--the same day that Time's "Man of the Year" issue featured THE HUNGARIAN FREEDOM FIGHTER on its cover. Soon he would change his name for a third and final time. At the City College of New York, where he enrolled, Andras Istvan Grof was struck from the transcript and above it was written Andrew Stephen Grove. He had left behind his home, and he needed a name people could pronounce.

The Self-Made Manager

By the late 1960s Grove had earned a Ph.D. in chemical engineering at the University of California at Berkeley and joined Fairchild Semiconductor, birthplace of the integrated circuit. When colleagues Robert Noyce and Gordon Moore quit to start Intel, Grove declared he was coming too. In 1968 they put their 32-year-old protege in charge of operations. That forced Grove into an unfamiliar role: having to lead people.

Quite suddenly Grove found himself on the shop floor of a manufacturing startup. There the human dynamics proved far more complex than the fluid dynamics he'd studied at Berkeley. The job, he quickly recognized, required something he knew nothing about: It required management. What was that, anyway? Grove decided he had to figure it out.

On July 4, 1969, he opened a school notebook and pasted in a clipping from a story in Time magazine about movie directors. "Vision to Inspire," it read. "Any director must master formidable complexity. He must be adept at sound and camera work, a soother of egos, a cajoler of the artistic talent. A great director has something more: the vision and force to make all these disparate elements fuse into an inspired whole." Above the clipping, Grove wrote with a red pen: "My job description?"

So began the self-education of Andy Grove, manager. It was a quest in which he immersed himself. His classroom would be a remarkable set of journals that he kept for years--and that have never, until now, been revealed. They're a window into the mind of an engineer grappling with the challenge of managing people. How did a company's growth rate, for instance, relate to its employees' ability to grow? In an entry from the early 1970s, Grove noted, "Three groups of people can be identified: (A) don't belong in their jobs in the first place. These are 'defective choices,' nothing to do with growth. (B) These are the previously discussed cases, people who can't grow with their jobs. (C) This is everybody else, including those that have demonstrated all kinds of growth capability before.

"The point is, there is a growth rate at which everybody fails and the whole situation results in a chaos. I feel it is my most important function (as being the highest-level manager who still has a way to judge the impending failure) to identify the maximum growth rate at which this wholesale failure phenomenon begins."

Grove succeeded where others didn't, in part, by approaching management as a discipline unto itself. There's real urgency in his efforts to school himself: He never lost his Hungarian refugee's apprehension of the risk of imminent failure.

The Change Agent

By 1983, when Grove distilled much of his thinking in his book High Output Management (still a worthwhile read), he was president of a fast-growing $1.1-billion-a-year corporation, a leading maker of memory chips, whose CEO was Gordon Moore. Could Grove and Moore save the company from an industry that was filled with ferocious competitors?

In many ways change was in Intel's DNA. It was Moore who had famously observed that the number of transistors you could cram onto a chip tended to double every couple of years (later refined to 18 months). What Moore's Law did not and could not predict was that Japanese firms, too, might master this process and turn memory chips into a commodity. That was change of a different order, and not even Intel was prepared for it.
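
The arithmetic behind that treadmill is easy to make concrete. Here is a minimal Python sketch of the doubling rule, using the 18-month period cited above; the 100,000-transistor starting point is an illustrative assumption, not an Intel figure.

    # A minimal sketch of Moore's Law as exponential doubling. The starting
    # count of 100,000 transistors is illustrative, not an Intel figure.
    def transistors(start_count: float, years: float, doubling_years: float = 1.5) -> float:
        """Project a transistor count forward under a fixed doubling period."""
        return start_count * 2 ** (years / doubling_years)

    for elapsed in (0, 3, 6, 9, 12):
        print(f"year {elapsed:2d}: ~{transistors(100_000, elapsed):,.0f} transistors")

Twelve years at that pace is a 256-fold increase, which is why each generation of chip factories costs billions more than the last, for products whose customers do not yet exist.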

The company's top executives simply could not believe the growing evidence that they were being outcompeted in a market they had created. Intel was the memory company, period. Its chips were in many of the best minicomputers and also in the new breed of machine that was then taking off, the personal computer. In the early 1980s profits from other products helped to sustain the delusion that memories were a viable future.

Intel kept denying the cliff ahead until its profits went over the edge, plummeting from $198 million in 1984 to less than $2 million in 1985. It was in the middle of this crisis, when many managers would have obsessed about specifics, that Grove stepped outside himself. He and Moore had been agonizing over their dilemma for weeks, he recounts in Only the Paranoid Survive, when something happened: "I looked out the window at the Ferris wheel of the Great America amusement park revolving in the distance when I turned back to Gordon, and I asked, 'If we got kicked out and the board brought in a new CEO, what do you think he would do?' Gordon answered without hesitation, 'He would get us out of memories.' I stared at him, numb, then said, 'Why shouldn't you and I walk out the door, come back, and do it ourselves?'"

The words "I stared at him, numb" suggest that in the crucial moment, Andy ceased to be Andy. Instead he was Dr. Grove the engineer, the teacher, looking down at his own case study. And from this realm of pure reason he could see that Intel's present course had an obvious ending: disaster. It was a cognitive tour de force, yet within moments Andy Grove the executive returned--and was dismayed by what Andy Grove the teacher had concluded. Professors overturn ideas, but they don't upend lives. "To be completely honest about it," Grove wrote, "as I started to discuss the possibility of getting out of the memory chip business, I had a hard time getting the words out of my mouth without equivocation." One of his managers even persuaded him "to continue R&D for a product that he and I both knew we had no plans to sell." Grove's devotion to reason did not mean that he was a machine. Far from it. What he found in the end was the will to do what was painful, the will to let go.

"Welcome to the new Intel," Grove said in a speech not long afterward, to rally the troops behind the decision to exit memories. Intel the memory company was dead, he explained, but there was another product on which it could stake its future: the microprocessor. Invented at Intel in 1971, it had spent the 1970s timing traffic lights and helping bacon packers slice their bacon into even strips. Not all that exciting. continued

But once IBM chose Intel's microprocessor to be the chip at the heart of its PCs, demand began to explode. Even so, the shift from memory chips was brutally hard--in 1986, Intel fired some 8,000 people and lost more than $180 million on $1.3 billion in sales--the only loss the company has ever posted since its early days as a startup.

The Reality Shifter

Grove and Moore had no way of knowing that Intel was on the verge of a remarkable ten-year run. They did know they were betting the company--and that to make the shift they had to risk angering IBM. The $60-billion-a-year giant was not only Intel's biggest customer but also its biggest shareholder--it had bought a large stake in the company to shore up its shaky supplier.

Intel did not set out to dominate the computer industry any more than humans set out to dominate the planet. In both cases the main concern was survival. Humans were so vulnerable to being eaten by larger, faster creatures that their only hope of survival was to control their environment. The "new Intel," too, was subject to forces beyond its control. Grove would later use a graphic that depicted Intel as a castle with the 386 chip in the center. The castle was under siege by rival chipmakers Sun Microsystems, Harris, Motorola, and NEC, not to mention RISC. But in the mid-1980s, before the graphic was ever made, Intel faced a more basic challenge: It was not so much a kingdom as a vassal state. Its dominant customer, IBM, had long insisted that Intel license its microprocessor designs to other chipmakers so that Big Blue could always be certain of a ready supply of chips at a pleasant price.

Grove decided that had to change. "Finally, we had a real winner of a device," Grove says of the 386 chip. But if Intel wanted a more secure future, "we not only had to win; we had to win our way." The 386 marked a genuine milestone of computer engineering. As Microsoft and other software developers figured out how to make full use of the new chip, Grove knew, the PC market would probably grow even hotter. Yet as long as Intel had to share its designs with other chipmakers, it would always face the anonymous and uncertain life of a parts supplier, subject to the whim of a customer 60 times its size.

To become its own kingdom, Grove realized, Intel had to make itself effectively the sole source of microprocessors. Getting IBM to buy the idea posed a challenge--he had no way of knowing how his giant partner would react--but he knew the status quo did not give Intel the freedom it needed to grow. So Intel moved unilaterally: In 1985, when it launched the 386, it declared the technology would not be licensed to other producers. IBM at first did not build 386s into its machines. But as archrival Compaq picked up the chip, IBM came around, cutting a deal with Intel to make some of the 386s it expected to use in its own chip factories. The gamble had paid off. "To insist on our way meant we might lose," Grove says. "But to me, that is better than losing by compromising your advantages away."

The Fallible Human

During Grove's 11-year tenure as CEO, Intel grew at a compound annual growth rate of nearly 30%. Together with Microsoft, Intel supplanted IBM as the dominant standard in computing. In 1992, Intel's profits topped $1 billion for the first time--on $5.8 billion of sales. What made such extraordinary growth possible under Grove's leadership was his continuing ability to adapt to shifting realities--but even Mr. Strategic Inflection Point could stumble.
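
That "nearly 30%" figure is a compound annual growth rate, and the check is a one-liner. A minimal sketch, using round illustrative values rather than Intel's reported revenues:

    # A minimal sketch of the compound-annual-growth arithmetic. The values
    # here are illustrative, not Intel's reported revenues.
    def cagr(start: float, end: float, years: int) -> float:
        """Compound annual growth rate between two values, as a fraction."""
        return (end / start) ** (1 / years) - 1

    # Growing at 30% a year for 11 years multiplies the starting value ~18x...
    print(f"{1.30 ** 11:.1f}x")           # -> 17.9x
    # ...and, inversely, an 18x rise over 11 years implies ~30% a year.
    print(f"{cagr(1.0, 18.0, 11):.1%}")   # -> 30.1%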

The 386 caught on, and sure enough, Microsoft used it to transform computing--its smash-hit Windows 3.0 operating system, which debuted in 1990, was designed to work on 386-based machines. Grove's breakthrough about changing the rules of the game opened the door to an epiphany about branding and marketing. In 1990 marketing chief Dennis Carter--the same Dennis Carter who had badgered Grove on RISC--came to him with a scheme to launch a large-scale consumer marketing campaign around the slogan "Intel Inside." It is hard to recapture how foreign the concept of branding was at an engineering company like Intel. According to Carter, when he pitched the idea to a roomful of Intel senior executives, "most of them thought it was nuts. But not Andy. He said, 'It's brilliant. Go make it happen.' " Improbably, it turned an internal component into one of the most recognized brands in the world. Grove so loved the idea of marketing to consumers that he selected the name Pentium himself.

There's a rate of growth, though, at which everybody fails, including Andy Grove. His biggest tumble from the learning curve began in 1994. That fall Thomas Nicely, a mathematician at Lynchburg College in Virginia, spotted "inconsistencies" in the way Intel's latest Pentium chip performed a rare, complex scientific calculation.

Intel engineers knew about the bug but deemed it too insignificant to report. By their calculations, a spreadsheet user would encounter it once every 27,000 years of spreadsheet use. But when Nicely's findings were posted on an Internet newsgroup, the discussion became a tempest, then burst into public view. Soon IBM announced it was suspending shipments of its Pentium-based computers.
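
The 27,000-year claim is mean-time-between-failures arithmetic. A minimal sketch, using the two inputs widely reported in contemporary coverage (roughly one affected result per nine billion divisions, and about 1,000 divisions a day for an average spreadsheet user); both inputs are assumptions rather than figures from this essay, and the small gap from 27,000 reflects whatever usage assumptions Intel itself used:

    # A minimal sketch of the mean-time-between-failures arithmetic behind
    # Intel's estimate. Both input rates are contemporaneous press figures,
    # assumed here rather than taken from the essay.
    failures_per_division = 1 / 9e9   # assumed FDIV error rate
    divisions_per_day = 1_000         # assumed "average spreadsheet user"

    days_between_failures = 1 / (failures_per_division * divisions_per_day)
    print(f"~{days_between_failures / 365.25:,.0f} years")   # -> ~24,641 years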

It was a moment when Grove should have switched into observer mode and asked, "What has changed here?" Instead, he kept thinking like an engineer and waded into the online mob himself, as though it were purely a technical debate. The uproar grew, though, until Grove was forced to adopt a no-questions-asked replacement policy and to apologize to customers. The apology was not very gracious. "What we view as a minor technical problem has taken on a life of its own," he declared. "We apologize. We were motivated by a belief that replacement is simply unnecessary for most people. We still feel that way." In effect he was telling consumers that they wanted something they did not need, but Intel had decided to indulge their irrationality.

A customer replied on the Internet with a poem:

When in the future we wish to deride
A CEO whose disastrous pride
Causes spokesmen to lie and sales streams to dry
We'll say he's got Intel Inside.

For a man who strives to grasp objective reality, Grove had missed a fundamental shift in the nature of his business. Intel had become a marketing company. And while a chip is built in a factory, a brand is co-created with the customer. This required a rethinking of the meaning of "objectivity." In branding, a customer's subjective reality, even if confused, becomes your objective reality. The learning experience was more expensive than most: The Pentium recall required a $475 million writedown that marred Intel's year.

The Data-Driven Patient

A few months later Grove faced crisis again: He was diagnosed with prostate cancer. In the intense period that followed, he remained on the job for all but two-and-a-half days. He handled the decision about his treatment the same way he handled decision-making at Intel: as if life depended on it.

Grove had never been one to rely on others' interpretations of reality. Hungary, in this regard, served as How-Not-To-Do-It University. Reality there was shaped by one's position in the system. At Intel he fostered a culture in which "knowledge power" would trump "position power." Anyone could challenge anyone else's idea, so long as it was about the idea and not the person--and so long as you were ready for the demand "Prove it." That required data. Without data, an idea was only a story--a representation of reality and thus subject to distortion. Hungary had been a grotesque funhouse mirror. The slim man looked fat, and the fat man slim. But when he was diagnosed with prostate cancer in 1995, Grove found himself in the position of most patients: frightened, disoriented, and entirely reliant on the advice of doctors. Their advice was straightforward: Surgery was the best option, and that was pretty much all there was to it.

Was it, though? It took very little to discover that there was much, much more to it. There were alternatives to surgery. No surgeon advised him to take them seriously. But the expert opinions, Grove soon determined, were just that--opinions, based on little if any hard data. Data did exist. What Grove found most shocking was that no one had done the hard work of pulling it together. Plainly, Grove would have to do it himself.

The patient, in effect, became his own doctor. It was a massive research undertaking whose details Grove chronicled in a 1996 story for FORTUNE. One is left with the image of Grove, awake late at night, plotting and cross-plotting the data in his own methodically constructed charts. What did the data tell him? That he would be better off with an alternative procedure known as radiation seeding. That was the treatment he selected.

What Grove found most appalling, in the end, was the utter fixity of belief among doctors who failed to separate knowledge from conventional wisdom. Even the doctor who carried out Grove's procedure was captive to it. "If you had what I have, what would you do?" Grove asked him at one point. The doctor said he'd probably have surgery. Confounded, Grove later asked why. The doctor thought about it. "You know," Grove remembers him saying, "all through medical training, they drummed into us that the gold standard for prostate cancer is surgery. I guess that still shapes my thinking."

"Let's Think for Ourselves"

Grove stepped down as CEO in spring 1998 to become Intel's chairman. The betting at Intel was that he'd never really let go of the reins, but Andy surprised everyone. He dug into his new assignment as he has every other--setting out to examine and improve the way the board governed Intel and thereby to set an example for corporate boards everywhere (see "Inside Andy Grove's Latest Crusade" on fortune.com).

Last May, when Paul Otellini succeeded Craig Barrett as CEO, Grove officially became "senior advisor" to the company. The title didn't matter. Grove was still teaching.

On a Monday last month, Grove stood before 400 or so Intel employees, the advance troops of the company's health-care initiative. (Intel wants to make its chips the basic building blocks of 21st-century health-care and medical technology.)

Many had never seen Grove in person before, and he got a standing ovation before he said a word. His speech was a strong statement about strategy. Understanding comes from action. So "be quick and dirty," he said. "Engage and then plan. And get it better. Revolutions in our industry in our lifetime have taken place using exactly this formula. The best example is the IBM PC"--created on the fly by a team in Boca Raton.

Then he took questions. A European software engineer stood up with microphone in hand. He asked about handling health-care information. "How can we address the problem of privacy protection and data protection?"

"Stay with me for a minute," Grove said quickly. "Can I ask you a question? Why do you care?"

"Because health-care information might find its way to insurance companies and might result in higher insurance rates," the engineer replied.

"Explain to me why," said Grove, almost before the engineer could finish speaking.

"Many people have said it would be a bad thing if insurers knew all about the health history of everyone in the population," he replied.

Intel's senior advisor sized up the engineer's comments this way: "I think we have a tendency toward adding imaginary complexities to a problem which is already unimaginably complicated." He added, "Let's think for ourselves. Let's not repeat mindlessly ... excuse me, automatically ... suppositions that are true merely because somebody else says they are."

Did the engineer mind being cross-examined and momentarily called mindless in the presence of 400 co-workers by his legendarily blunt leader? He smiled at Grove's choice of words. "Go ahead," he told Grove. "I was prepared."

RICHARD S. TEDLOW is a historian at the Harvard Business School. His book, The American: The Life and Times of Andy Grove, will be published next fall. For this essay he has drawn on his research, which includes interviews with Grove and many other executives, unpublished documents, and published reports.

GROVE OF ACADEME

A photo shoot and a chance encounter spark a $26 million gift to his alma mater.

Grove graduated from the City College of New York with a degree in engineering in 1960. He returned to campus for FORTUNE's photo shoot in August. Though no longer tuition-free, CCNY still offers an excellent low-cost education that draws many first- and second-generation Americans. Inspired by the visit, Grove gave the school $26 million--the largest gift in its history. The bulk will go to the undergraduate engineering school, which will be named after him. He spoke with reporter Kate Bonamici.

ON BEING ON CAMPUS AGAIN

We shot in a number of locations, and there was a lot of hurry up and wait. My daughter was with me, so I went around and showed her where my lab used to be, and as I was doing that, I was kind of warming up on the inside. It was like visiting the village where you were born or something. Somebody dragged up [freshman] David Bauer, the Intel Science Talent Search winner. So my heartstrings are pinging more and more. Other than the fact that he has no accent, it's me. He's from the Bronx; he takes the same subway I used to take to class. He's studying chemistry. It just struck me--the machinery of the American dream machine is turning. Almost 50 years later it's starting again, and David Bauer is there.

ON WHY HE DECIDED TO GIVE SUCH A BIG GIFT

They had asked that I make a substantial gift to the school.... I don't like giving money to an unspecified cause; I prefer to be more hands-on, so I never considered it. But the day of the shoot, I started to think, "Oh, maybe I should do it. Where else would I do it? Mumble, mumble, mumble." I repeated the visit--I poked my nose into labs and got lost and got directions from students. The students are different, and some of the buildings didn't exist when I was there. But the place is the same, which is what I needed to discern for myself.

ON HOW HE ENTERED CCNY IN 1957

I asked for the admissions office, and somebody sits me down, and I tell them my story. I was wondering what shoe was going to hit me in the head this time, but they accepted me with respect, without condescension. They gave me a start, and they gave it in a classy way. It's an institution that is crucial to the workings of America, and America should be proud of it. I am.

"That is not the right question," Grove will say--and push you to dig for the truth."I almost wrecked the company because a new product seduced me.""There is a growth rate at which everybody fails, [resulting] in a chaos."Grove's hard lesson about selling to consumers cost Intel $475 million.

Data to back up the long-held argument for hypermobility?

December 1, 2005

Economic Scene

In Silicon Valley, Job Hopping Contributes to Innovation

By VIRGINIA POSTREL

FOR four decades, through booms, busts and bubbles, Silicon Valley has maintained an amazingly innovative business environment.

Companies and technologies rise and fall. Hot start-ups morph into giant corporations. Cutting-edge products become mature commodities. Business models change. Through it all, the area remains creative and resilient - and more successful than other technology centers, notably the Route 128 area around Boston.

What makes Silicon Valley special? Thanks to some new data, economists have finally been able to test statistically some popular explanations.

In her influential 1994 book "Regional Advantage: Culture and Competition in Silicon Valley and Route 128" (Harvard University Press), AnnaLee Saxenian, an economic development scholar at the University of California, Berkeley, argued that Silicon Valley's innovative edge comes from two unusual characteristics.

First, talented employees move easily and often to new employers, far more so than people elsewhere. "The joke is that you can change jobs and not change parking lots," one of her interview subjects said.

Second, instead of vertically integrating, Silicon Valley computer makers rely on networks of suppliers. They also design open systems that can flexibly accommodate all sorts of new components.

"The system's decentralization encourages the pursuit of multiple technical opportunities through spontaneous regroupings of skill, technology and capital," she wrote.

Many people, especially in Silicon Valley, found Professor Saxenian's argument convincing. But while her research was careful, it depended on interviews and had no large-scale statistical backing. Perhaps her subjects' impressions were unreliable.

After all, the argument that Silicon Valley's job hopping fosters innovation contradicts economists' common assumptions. "It didn't feel right to me," James B. Rebitzer, an economist at Case Western Reserve University, said in an interview.

When employees jump from company to company, they take their knowledge with them. "The innovation from one firm will tend to bleed over into other firms," Professor Rebitzer explained. For a given company, "it's hard to capture the returns on your innovation," he went on. "From an economics perspective, that should hamper innovation."

He found a possible answer to the puzzle in the work of two management scholars, Carliss Y. Baldwin and Kim B. Clark. In their book "Design Rules: The Power of Modularity" (MIT Press, 2000), they argued that when there is a lot of technological uncertainty, the fastest way to find the best solution is to permit lots of independent experiments. That requires modular designs rather than tightly integrated systems.

"By having a lot of modular experimenters, you can take the best, which will be a lot better than the average," Professor Rebitzer said. Employee mobility may encourage productive innovation, as people quickly move to whichever company comes up with the best new technology.

But you would not expect to find people moving around all the time in every industry, only those where technical uncertainty justifies spending lots of resources on experiments - including many that will not pan out. "In most other settings," said Professor Rebitzer, "it's going to be easier simply to design things with special purpose parts that fit in."

In a forthcoming article in The Review of Economics and Statistics, he and two economists at the Federal Reserve Board, Bruce C. Fallick and Charles A. Fleischman, empirically test the claim that Silicon Valley employees move more often than computer industry employees in other places. (The article, "Job Hopping in Silicon Valley," is available at www.federalreserve.gov/research/staff/fallickbrucex.htm.)

The two Fed economists, who are interested in the macroeconomic patterns created when people move from employer to employer, use data from the Current Population Survey. Until recently, economists almost entirely ignored such voluntary job hopping because they did not have good data to track it.

Since 1994, however, the Current Population Survey has asked respondents whether they have changed jobs in the last month. This question was added simply to cut down on repetition. Respondents who are still at the same workplace do not fill out questions about their employer. But the change created the first large database that tracks job changes. Since the survey is geographically based, it is ideal for examining job changes within the same area.

To Professor Rebitzer's surprise (though not his co-authors'), it turns out that Silicon Valley employees really do move around more often than other people. The researchers looked at job changes by male college graduates from 1994 to 2001. During that period, an average of 2.41 percent of respondents changed jobs in any given month.

But, they write, "living in Silicon Valley increases the rate of employer-to-employer job change by 0.8 percentage point."

"This effect is both statistically and behaviorally significant - suggesting employer-to-employer mobility rates are 40 percent higher than the sample average."

Computer industry employees in other California technology clusters also seem to switch jobs more often than those in other states. This result supports an argument made by Ronald J. Gilson, a law professor at Stanford and Columbia. In a 1999 article, he suggested that a 19th-century California law helped create Silicon Valley's hypermobility by prohibiting the enforcement of noncompete agreements.

In other states, businesses use these agreements to keep employees from easily hopping to other companies in the same industry. (That article is available at papers.ssrn.com/sol3/papers.cfm?abstract_id=124508.)

Finally, the economists test whether computer industry employees are more likely to move than employees in other industries, as the modularity hypothesis would predict. Again, statistical tests suggest that the theory is right.

Looking at cities within California, they write: "We find no evidence that outside the computer industry, job changes are more likely within Silicon Valley. Indeed, rates of job hopping appear to be lower in Los Angeles and San Diego than elsewhere in the nation."

Virginia Postrel (dynamist.com) is the author of "The Substance of Style: How the Rise of Aesthetic Value Is Remaking Commerce, Culture and Consciousness."

Copyright 2005 The New York Times Company

Wednesday, November 30, 2005

We have the technology. Now to find the willingness.

Nature 438, 548-549 (1 December 2005) | doi:10.1038/438548a

Science in the web age: Joint efforts

Declan Butler

Declan Butler is a senior reporter at Nature.

At its best, academia is a marketplace of ideas. But many scientists are reluctant to embrace the latest web tools that would allow them to communicate their ideas in new ways, says Declan Butler.

When Tim Berners-Lee invented the World Wide Web in 1989, he saw it as a collaborative workspace for his fellow scientists at CERN, the European particle-physics lab near Geneva, and beyond. His creation went on to surpass his prediction that "the usefulness of the scheme would in turn encourage its increased use". But in the rush to develop the web as a flexible way to find information, the original concept of users interacting in real time was largely forgotten. Fifteen years later, the web seems to be returning to its roots.

For most users, the web in its first decade was like a big online library, where they mainly searched for information. Today it is undergoing a subtle but profound shift, dubbed Web 2.0, to become more of a social web, not unlike Berners-Lee's original vision. Yet scientists are largely being left behind in this second revolution, as they are proving slow to adopt many of the latest technologies that could help them communicate online more rapidly and collaboratively than they do now.

"I find it ironic that science is about the adoption, discovery and exploitation of new knowledge and techniques, yet the biggest revolution on the web is passing us by," says Greg Tyrelle, a bioinformatician at Chang Guan University in Taiwan. He has been experimenting with blog (short for web log) software for five years to interact with a growing audience of his peers and the wider public.

The emerging web is largely being shaped by dynamic interactions between users in real time. But many researchers still see publications in the formal scientific literature as 'the' means of scientific communication. Although the traditional published paper is accepted as the undisputed information of record, younger researchers, in particular, are concerned that scientists are missing out on new ways to communicate with each other and the public.

They recommend the use of collaborative technologies such as blogs and wikis, websites that any visitor can add to and edit. Supporters say these offer a forum for broader and more timely discussion, to complement the existing system of peer-reviewed journals. This could enhance science communication, both before publication, when generating ideas, and after publication, when discussing results (see 'Open house').

Blogs are just one example of new social technologies that are allowing more people to publish more easily and in more diverse ways on the web. By allowing reader feedback and syndication feeds, blogs create an instant online community. "Blogs can offer any kind of content — from peer-reviewed articles to sheer speculation to rants, and everything in between," says Amy Gahran, an expert in new media and editor of Contentious.com.
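
Concretely, a syndication feed is just a machine-readable RSS or Atom file that each blog publishes and readers poll. A minimal sketch of following a few blogs this way, using the third-party feedparser library; the feed URLs are placeholders:

    # A minimal sketch of reading syndication feeds. The URLs are
    # placeholders, and feedparser is a third-party library
    # (pip install feedparser).
    import feedparser

    feeds = [
        "https://example.org/science-blog/rss.xml",
        "https://example.net/lab-notebook/atom.xml",
    ]

    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries[:3]:   # the three newest posts per feed
            print(entry.get("title"), "->", entry.get("link"))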

The write stuff

The best-known wiki is the online encyclopaedia, Wikipedia, which has grown to almost a million entries since its launch in 2001. Scientists at Harvard and the Massachusetts Institute of Technology (MIT) recently started their own wiki, OpenWetWare, to apply the same approach to sharing lab protocols and data among biology groups worldwide.

Outside academia, blogs are taking off in a big way. A study published in October by the Guidewire Group, a research firm in new media, says that 90% of marketing communication companies have either launched, or intend to launch, internal blogs. There are now some 20 million blogs, permeating almost every sector of society. But science is a glaring exception, and today there are still only a few dozen scientific bloggers.

Scientists who blog see their activities as a useful adjunct to formal journals, not a replacement. "The standard scientific paper is irreplaceable as a fixed, archivable document that defines a checkpoint in a body of work, but it's static, it's very limited," says Paul Myers, a biologist at the University of Minnesota, who blogs at Pharyngula.

"Put a description of your paper on a weblog, though, and something very different happens," says Myers. "People who are very far afield from your usual circle start thinking about the subject. They bring up interesting perspectives." By sharing ideas online, you get feedback and new research ideas, he says.

A senior US epidemiologist who blogs once or twice a day under the pseudonym 'Revere' on his public-health blog Effect Measure has attracted a diverse readership. "About 1,500 people visit each day," he says. "If someone told me that I could show up at a lecture hall every day and deliver a short opinion, and that 1,500 people would show up to hear me, I'd be pretty satisfied — 1,500 is twice the subscription of many speciality journals."

But for most scientists and academics, blogs and wikis remain unattractive distractions from their real work. Many consider them an online version of coffee-room chatter, background noise that goes against the very ethos of heavily filtered scholarly information.

Opinion pieces

Scientists who frequent the 'blogosphere' see it differently. The dynamic hierarchy of links and recommendations generated by blogs creates powerful collaborative filtering, they argue. Blogs may create noise, but they are a great way of keeping up with what's hot in your field, says Tyrelle, who blogs at Nodalpoint.org. He believes that the more bloggers there are in a particular community, the more efficient this filtering becomes, so — counter-intuitively — reducing information overload.

Tyrelle suggests that this is not so different from BioMed Central's Faculty of 1,000, a popular fee-based service that highlights biology papers according to recommendations from a subset of 1,000 scientists. But in the blogosphere, this service is free and could marshal input from a subset of 10,000 scientists or more.

Yet even the most web-savvy scientists remain unconvinced that blogs have any useful role in science. "I have my doubts that blogging reduces information overload, but blogging will survive as it appeals to all the exhibitionists," quips Rolf Apweiler, a bioinformatician at the European Bioinformatics Institute in Hinxton, UK, and head of the UniProtKB/Swiss-Prot protein-sequence database.

Others disagree. "Science is too hung up on the notion of 'the paper' as the exclusive means of scientific communication," says Leigh Dodds, a web expert at the publisher Ingenta. Publication and research assessments are more geared to measuring a researcher's standing than communicating science, he claims.

Jennifer Hallinan, a biologist at the University of Queensland, Australia, who runs the blog Cancer Dynamics, agrees with him. The web is providing a hierarchy of sources, she says, including useful blogs and wikis. "Each level of the hierarchy has its own sources of error, its own strengths and weaknesses," she explains, "but these are known and can be taken into account when using them."

Blogs associated with traditional journals may help bridge the gap between the literature and blogs, says Glenn McGee, editor-in-chief of The American Journal of Bioethics. The leading journal in its field, it was the first to create a companion blog, Blog.Bioethics.Net.

The bioethics blog allows the journal to respond faster and in different ways to public controversies, says McGee. The blog has high impact, he adds, often influencing reporting on ethical issues by the mainstream media.

Print journals cannot keep up with developments in certain fields, adds Gavin Schmidt, a researcher at NASA's Goddard Institute for Space Studies in New York, who blogs at RealClimate.org with other climate scientists. The blog helps to reduce noise by setting the record straight, says Michael Mann, another RealClimate blogger and director of Pennsylvania State University's Earth System Science Center, citing as an example a recent post on whether hurricanes are linked to global warming (see http://www.realclimate.org/index.php?p=181).

McGee and Schmidt have permanent jobs, and both agree that many scientists don't blog because they fear it has a poor image and could damage their careers. Most younger biologists blog anonymously, says Roland Krause, a researcher at the Max Planck Institute for Molecular Genetics in Berlin and a bioinformatics blogger. "Many fear that their superiors consider it a waste of time, or even dangerous," he says. Schmidt agrees: "Until blogging is seen as normal, this will continue to be a problem."

Others fear being scooped by rivals. "In many institutes it's just way too dangerous to discuss work in progress with the people across the floor," regrets Krause — let alone on a blog.

Such fears are dated, argues Jason Kelly, an MIT graduate student involved in OpenWetWare. The upcoming generation, he says, believes that excessive competition can harm science; they see the benefits of brainstorming their research ideas on blogs as far outweighing the risks.

Kelly admits some may regard this view as naive. But Schmidt suggests that once scientists come up with some sort of peer-review mechanism for blogs that increases their credibility without diminishing their spontaneity, blogs will take off.