Brave-ish New World

Winter 2012

By Richard Barbieri

The broadly shared assumption about education today is that “the world” — by which we really mean “human culture” — is changing, and that schools must somehow change if we want to prepare our students for the life they’ll find beyond school. But enter the room in which we discuss the details of how we teach students for our brave new world and the agreement comes to a grinding halt — especially regarding the role of technology in schools and life.

In my reading, I have found experts in every relevant field — from computer science to neurobiology, psychology to anthropology, philosophy to the arts — who take diametrically opposite views on technology. The disputants divide along these lines: 

View A

Technology benefits democracy. The Internet builds community. Technology makes us smarter. Technology enhances our humanity. 

View B

Technology benefits tyranny. The Internet destroys community. Technology makes us stupider. Technology diminishes our humanity. 

The techno-optimists see enormous positive effects on community, democracy, and knowledge. Clay Shirky’s Cognitive Surplus: Creativity and Generosity in a Connected Age argues that “the harnessing of our cognitive surplus [the time we have to spend on interactive technology] allows people to behave in increasingly generous, public, and social ways, relative to their old status as consumers and couch potatoes.” To Shirky, the Internet can bring us out of the 50-year dark age, illuminated only by the TV screen, when, “in the space of a generation, watching television became a part-time job for every citizen in the developed world.” The isolation and passivity of television are now being replaced by the connectivity and creativity of the Internet. Already large efforts (such as Wikipedia) and small ones (such as Canada’s PickupPal.com, which matches drivers with passengers for carpooling) are showing the way. In Shirky’s future, our students will have more opportunities to do good than ever before, tapping into the natural human traits of generosity, fairness, and personal agency.

Shirky’s optimism is exceeded by David Eagleman’s in Why the Net Matters. Eagleman believes that we live in a “fortuitous moment in history,” in which we can avoid the ills that led to the collapse of past civilizations because “we have developed a technology no one else possessed: a rapid, growing communications network that finds its highest expression in the Internet.” He reinforces his case by presenting it as an iPad app. In an interactive format — half verbal, half visual — that allows readers to choose topics from a wheel instead of progressing linearly, he offers topics like “Outpacing Disaster,” “Saving Energy,” and “Mitigating Tyranny.” Because we can communicate almost instantaneously, share data, and tackle problems with enormous human and computer resources, we can quarantine epidemics, work for democracy, and preserve all that has ever been thought and said forever. Again, good news for the citizens of the future, both near and remote.

But in the political realm, Evgeny Morozov, a former citizen of the Soviet tyranny, provides a radically different perspective. The Net Delusion argues that recent claims about Twitter and Facebook’s roles in democratic movements are wishful thinking more than demonstrable fact, and that tyrannies possess and are using the power of technology to their own advantage with greater and greater skill. What use are the Internet and cell phones if the government — say, in China and some Middle East nations — can shut down service, or worse yet, eavesdrop on dissenters and entrap them by knowing their plans and identities in advance? For observers of politics, Morozov provides startling examples of activists’ use of technology and regimes’ fast-growing ability to thwart them.

Morozov’s grim outlook may not apply to future Americans, but Eli Pariser’s, in The Filter Bubble, already applies to us. Pariser maintains that the Internet has become a vast commercial system that knows more about each of us than we ever dreamed possible. One firm alone, he claims, “has accumulated an average of 1,500 pieces of data on each person in its database — which includes 96 percent of all Americans — along with everything from their credit scores to whether they’ve bought medication for incontinence.” This information allows websites to tailor your Internet to your interests and beliefs, so you will visit sites often, stay longer, and thereby help these sites reap more advertising dollars. Google, for example, now returns searches based not only on a site’s popularity, but also on its knowledge of similar sites that you have visited in the past, so that “the query ‘stem cells’ might produce diametrically opposed results for scientists who support stem cell research and activists who oppose it.” Quoting the famous New Yorker cartoon in which one canine says to another, “On the Internet nobody knows you’re a dog,” Pariser retorts, “The new Internet doesn’t just know you’re a dog; it knows your breed and wants to sell you a bowl of premium kibble.”

The problem, therefore, is not simply commercialization, but the restriction of knowledge. “In an age when shared information is the bedrock of shared experience,” he writes, “the filter bubble is a centrifugal force, pulling us apart.” (Pariser’s argument may explain why, for example, in an age of accessible information, more people now believe that President Obama is a Muslim than did during his 2008 presidential campaign.) 

Pariser ends with an exhortation and a list of strategies to diminish this hegemony of the search engines and social networking sites. One strategy he does not mention, however, is for schools to work even harder at teaching students to read the biases in what they are told, and even in what they tell themselves — a task that makes earlier efforts to help students avoid the pitfalls of print and TV advertising seem primitive.

Promoting Internet self-defense is complicated by the possibility that the medium itself may be weakening our reasoning and concentration skills. That is the theme of perhaps the best-known Internet critique, Nicholas Carr’s The Shallows: What the Internet Is Doing to Our Brains. Carr begins by observing, “Over the past few years, I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory.” Noting the challenge he now finds in doing deep reading, he suggests “what the Net seems to be doing is chipping away at my capacity for concentration and contemplation.” He provides data on brain plasticity, and on the effects of computer use on focus and attention. Working with numerous studies and analyses, he makes a strong case that technology is diminishing our creativity, increasing our stress, and leading us to accept the fast and the superficial in place of deep understanding. For example, “research continues to show that people who read linear text comprehend more, remember more, and learn more than those who read text peppered with links.” Even serious scholarship has suffered: other studies show that despite the ease of finding research materials online, scholars now cite fewer sources, and limit themselves to more recent ones, than in the days of library research. Our other writers focused on the macro effects of the Internet; Carr gives us a very thorough look at the micro effects on us as individuals.

Our last two books strike an even deeper cautionary note. Jaron Lanier, the only author among these who has made his career in technology, has written you are not a gadget: A MANIFESTO (the style is Lanier’s) to explain that “certain specific, popular Internet designs of the moment… tend to pull us into life patterns that gradually degrade the ways in which each of us exists as an individual.” He believes that the Twitters and Facebooks and IMs cause us to fragment and collate information and ideas into mindless mash-ups, rather than allow them to be “weighed, judged, and filtered by someone’s brain.” He further contends that when we constantly interact with computers instead of people, it is easy for us to begin thinking of ourselves as computing devices. Arguing against recent enthusiasm for “the hive mind” and “the wisdom of crowds,” he asserts, “Collectives can be just as stupid as individuals — and, in important cases, stupider.” Lanier’s style is challenging, filled with philosophical and technological examples, but it is exciting to see a highly respected practitioner stating the case against (some) technology. And his exhortations are pithy enough to be remembered even by the unfocused: “Stop calling yourself a user. You are being used.” “You are not a victim…. Think about the world you want and how to get there.”

Our last author, Sherry Turkle, has studied the effects of technology for 30 years, gradually moving from being “full of hope and optimism” (The Second Self, 1984) to “on balance a positive view” (Life on the Screen, 1995) to a deep concern in this year’s Alone Together: Why We Expect More from Technology and Less from Each Other. Like Lanier, Turkle focuses on the unique nature of personhood, and sees both our social media and recent “advances” in robotics as blurring the line between people and their machines. On the one hand, interactive robots are being used even today to provide “companionship” for the elderly and for children. (Turkle focuses mostly on school-aged children in the book.) Other visionaries propose that, in the future, “Love with robots will be as normal as love with other humans.” Already, one of Turkle’s interviewees “confided that she would trade in her boyfriend ‘for a sophisticated Japanese robot’ if the robot would produce what she called ‘caring behavior.’” As a psychologist, Turkle reacts with dismay: “A machine taken as a friend demeans what we mean by friendship.”

At the same time, our new social media create distance and impersonality between real persons. Many of Turkle’s young people say they feel safer typing IMs than talking either in person or over the phone. One college student texts her suitemate because knocking on her door would be too intrusive. Even adults have begun connecting electronically instead of personally: a sister objects that her brother posted his wife’s pregnancy on his blog long before calling family members; another man describes a guest at his dinner party blogging about the meal on her BlackBerry in real time. Social networking, meanwhile, leads us to “friend” a store in order to be entered in a contest, and to accumulate more “friends” than we can attend to, so that, as one responder put it, “I’m processing my friends as though they were items of inventory… or clients.”

These are, of course, matters of concern, but Turkle offers hopeful signs, particularly from young people who themselves point out that humans should care for children and the elderly, that you shouldn’t break up with your girlfriend on e-mail, and that they wish mom and dad would put away the PDAs for just a little personal contact. 

In the end, I have to give the last word to Larry Sanger, co-founder of Wikipedia and the final voice in The Edge’s 2011 volume, Is the Internet Changing the Way You Think?, a compendium of over 150 points of view: “It is very hard for me to take the ‘Woe is us, we’re growing stupid and collectivized like sheep’ narrative seriously. If you feel yourself growing ovine, bleat for yourself.” 

Better yet, teach your students to be shepherds of their own minds and hearts.

Richard Barbieri

Richard Barbieri spent 40 years as teacher and administrator in independent schools. He can be reached at [email protected].