Saturday, September 24, 2005

More on what now matters.

Technology can do one of two things: enable us to focus on what really counts, or drive us to the point of distraction. Our systems of education had better be geared towards priming our population for the former.

Rote memorization always did manage to teach one thing: discipline.

---

From ape to 'Homo digitas'?

By Stefanie Olsen

Jonathan Zittrain has seen a new species emerging in recent years. He calls it "Homo digitas."

"The vision (is) of someone glued to a chair, focused on a screen, interacting as an object, a person whose main identification is as a digital creature, who doesn't know what to do without a signal," said Zittrain, co-founder of the Berkman Center for Internet and Society at Harvard Law School.

But for all the knowledge available on the Internet, it's not so clear that the modern, computer-using Homo digitas is any more intelligent than the good, old-fashioned Homo sapiens.

Still, there are tantalizing signs of what could be. Communities of software developers, connected through the Internet, for example, have managed to create in a matter of months and at little cost what used to take big companies years and billions of dollars to develop. That collective intelligence of open-source projects shows how the world could get a lot smarter, thanks to the Net.

"Collectively and collaboratively, this is the most promising potential for really developing our collective ability to learn and think," said Doug Engelbart, a pioneer of personal-computing technology in the 1960s who conceived of the computer mouse.

But it's not so easy to say how or whether individuals are getting any smarter. Truth is, getting along in this world as a Homo digitas isn't easy. People must cultivate the ability to navigate dynamic, virtual environments for information, then be able to evaluate and analyze that information critically. On the Internet, it isn't always easy ferreting out fantasy from reality and truth from fabrication.

Just 10 years ago, if you wanted specific information you'd go to the library to check out a book. The fact that the book was in the library's collection meant that someone had vetted the work for credibility or value to society. The Web, on the other hand, holds few rules of selectivity or standards. Anyone can publish books, blogs, zines, videos or podcasts.

"The skill is moving around in a knowledge repository to...find out and learn things," Engelbart said. "It's one thing to ask a search engine a question. But it's another thing to go through and evaluate things that are relevant and tie them together."

Being able to organize all that data is also an important survival skill. No self-respecting Net denizen can get along without knowing all the advanced settings on at least three major search engines. And the ability to categorize e-mails in nifty folders while simultaneously tracking the windows of several instant-messaging sessions on the fly is pretty helpful too.

Shopping for ideas or products on the Net means people must process, compare and analyze much more information. Buying a pair of shoes off-line, a consumer might visit one or two stores. Online, people have the ability to compare product attributes in large numbers. The resulting glut of information puts a higher demand on them to put the information in their working memory--and process it, psychologists say.

Children must also become more skeptical judges of information, and in a sense, they must grow out of the sandbox more quickly. Years ago, kids were expected to tune into an authoritative voice in the classroom or elsewhere and paraphrase it to get an "A." Now children must apply critical thinking skills to sort through the vast amounts of junk on the Web.

"There is a shift that happens: If you start to lose attachment to facts in your head, if you always have to reach outside, then putting ideas together in novel ways will become impossible," said Brewster Kahle, executive director of the Internet Archive, a nonprofit organization that is keeping a record of the changes of the Net. "And that's the essence of thinking. We have to train and expect critical thinking--not just Web surfing."

Lost in a world of GPS

As a result, what kids do all day in school--indeed, how adults educate themselves, as well--may have to change. Until it does, many believe it'll be a long time before society becomes more intelligent as a result of the Net.

"Educating our kids and having them learn to educate themselves is completely changing, and I don't think we're ready for that in our schools," Zittrain said.

That's not to say people aren't getting a little smarter. A New Zealand researcher named Jim Flynn discovered in the 1980s that average IQ test scores had been ticking up by about three points--one-fifth of a standard deviation--every decade since the beginning of the 1900s. It's known as the Flynn Effect.
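The figures cited here compound quickly: a minimal back-of-the-envelope sketch (assuming a steady three points per decade, which simplifies Flynn's actual data) shows what that trend implies over the century:

```python
# Rough arithmetic behind the Flynn Effect trend described above.
# Assumption: a constant gain of 3 IQ points per decade since 1900,
# a simplification of Flynn's published findings.

points_per_decade = 3
decades = (2005 - 1900) / 10          # ~10.5 decades to the article's date
total_gain = points_per_decade * decades
print(round(total_gain, 1))           # about 31.5 points

# For scale: IQ tests are normed to a standard deviation of 15 points,
# so the century-long gain works out to roughly two standard deviations.
print(round(total_gain / 15, 1))
```

Of course, tests are periodically re-normed so the average stays pegged at 100, which is why the gain shows up only when old and new cohorts take the same test.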

People are advancing in the ability to process information and solve problems, but they're not improving in verbal or critical thinking skills, according to several professors tracking intelligence. No one knows exactly why the average is trending up, but researchers suspect any number of factors, including nutrition, experience with test-taking, and cultural attitudes.

On an individual level, however, it's much easier for people reliant on their computers to feel dumber when they're not online. Without a Google window open for quick answers, someone might be stumped in conversation, and that phenomenon may become more common with the arrival of IP-connected gadgetry like interactive computer eyeglasses.

Someone with a Global Positioning System in their car, for example, could rely on its directions to get to a friend's house repeatedly, without ever having to develop a theory for how to get there. Without it, he or she might be lost.

"Some people think (intelligence is) a single thing and it's the same forever and ever. And of course, it's not. Intelligence changes with time and place," said Robert Sternberg, dean of arts and sciences at Tufts University and a professor of psychology.

The good news is that the increasing popularity of blogs and wikis shows people are talking, arguing and forcing one another to think.

"People are not just idly sitting in front of the TV screen, but through some of these new technologies, (they) are asking questions of the world at large, and having the world respond and change because of the question," Zittrain said.

In fact, until computers can think for us, or thread ideas together, we will still need to rely on our own brains to do the work. The Internet may be vast, but it can't do the critical thinking for us.

"The Internet is information-rich, but it is flat," said John Davidson, a partner at venture capital firm Mohr Davidow who has specialized in investments in artificial intelligence. "The notion of technology taking over the world is false. It may be frustrating when the power goes out, but there are not going to be smart computers taking it over; it might (be) dumb computers. The ubiquity of stupid computers might be more dangerous."

Jeff Hawkins, the co-founder of Palm Computing, is working on that problem. He has started a new company called Numenta, in Menlo Park, Calif., in an effort to build intelligent machines that can replicate the brain's neocortex, the source of human intelligence.

In his book, "On Intelligence," Hawkins presents a theory of the brain that argues that intelligence is measured by the ability to make predictions by seeing patterns in the world. He's attempting to make computers intelligent by teaching them to find and use patterns in specific trades. For example, by programming a computer to "think" by watching patterns of visual images on a security monitor, a company might save on paying several night watchmen.
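A toy sketch can illustrate the "intelligence as prediction" idea Hawkins describes: learn which event tends to follow which, then predict the next one. This is a simple first-order frequency model for illustration only, not Numenta's neocortex-inspired algorithm.

```python
from collections import Counter, defaultdict

# Illustration of prediction-by-pattern: count which symbol follows
# which in an observed sequence, then predict the likeliest successor.

def train(sequence):
    model = defaultdict(Counter)
    for prev, nxt in zip(sequence, sequence[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, symbol):
    """Return the most frequently observed successor, or None if unseen."""
    if symbol not in model:
        return None
    return model[symbol].most_common(1)[0][0]

# A mostly repeating pattern with one anomaly at the end.
history = "abcabcabx"
model = train(history)
print(predict(model, "a"))  # 'b' has always followed 'a'
```

The security-monitor example in the article works the same way in spirit: a system trained on normal patterns of images can flag a frame that its learned model fails to predict.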

What if the power goes off?

"A real inflection point that's going to happen in the next three or four years will be when humans aren't the only ones exhibiting intelligence," Hawkins said.

Still, neuroscientists believe that humans are already smarter today because of technology and that as our culture evolves, our brains will continuously change and evolve. Mike Merzenich, a neuroscientist and co-founder of San Francisco-based Posit Science, a company that develops programs for brain fitness, has studied what's known as brain plasticity, the brain's ability to adapt and change physically and functionally throughout life.

"Our brains are different from those of all humans before us. Our brain is modified on a substantial scale...each time we learn a new skill or develop a new ability," Merzenich wrote in an e-mail interview. Still, technologists must be careful about developing computers that outstrip our own ability to think abstractly, thereby making people redundant.

But what happens if the power goes off?

E.M. Forster's "The Machine Stops," published in 1909, is about a society that's heavily dependent on a machine, which among other things, cleans house and provides the food. One day, the machine stops, and the society must reconstruct itself by relying on only a few people who remember what to do.

"The moment it gets switched off is echoed (in today's society) when the lights go down and we don't know how to fix the car or light the fire," Zittrain said.

Of course, the same could be said if phone lines go down, or the electricity goes out. The real danger is not being cut off from the Internet; it's that some people never get to use it and are at risk of falling perilously behind those who take Net access for granted.

"With the Internet and contemporary technology evolving at lightning pace over the past 40 years," Posit Science's Merzenich said, "the demands of uploading from our cultural history are incredible, and we're seeing more and more people falling off the boat."
