Recent years have seen a flood of media attention devoted to the relationship between the digital age and the human brain (in these pages and elsewhere; The New York Times has a whole series on the topic). New Yorker writer Adam Gopnik divides commentators on the subject into three camps: "the Never-Betters, the Better-Nevers, and the Ever-Wasers" -- that is, roughly speaking, those who see the Internet and all that goes along with it as unambiguously good for humanity; those who think just the opposite; and those who "insist that at any moment in modernity something like this is going on," and the upheavals of our own time aren't so different from those of any other.

Cathy N. Davidson, author of the forthcoming book Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn (Viking), could safely be deemed a Never-Better, with perhaps a dash of the Ever-Waser. The major technological changes of the past decade and a half present an array of "exciting opportunities," Davidson argues -- opportunities to promote efficiency, satisfaction, and success at every stage from kindergarten through career. If we are inclined to side with the Better-Nevers, worrying that our brains never evolved for shifts of such magnitude -- if kids attend to text messages and video games with alacrity but fall behind in school, while adults feel swamped by information overload and spread too thin by multitasking -- the trouble, in Davidson's view, lies not with all our new technologies but with our failure thus far to adapt and restructure ourselves and our institutions.

In Now You See It, Davidson gathers data and anecdotes on a wide array of topics -- attention, learning, the American school system and its history, the modern workplace and how it came about -- to argue that the human brain is perfectly well-suited to the digital world, if only we are willing to rethink the classroom, the workplace, and how we measure success.

Davidson, who is Ruth F. DeVarney Professor of English and John Hope Franklin Humanities Institute Professor of Interdisciplinary Studies at Duke University, discussed her book with Inside Higher Ed via e-mail.

Q: Your doctorate is in English, while your faculty positions are in English and "interdisciplinary studies." How did you end up writing a book on "the brain science of attention"?

A: I was a math geek as a kid, and was diverted from my first love of AI (Artificial Intelligence) and quantificational logic (my primary undergraduate major) when one of my mentors clued me in to the fact that there were virtually no women professors in the field at the time. I needed to go into something where I could find a job someday, so I decided to get a Ph.D. in English. I know that seems comical, but it turned out to be a great choice, not only intellectually but because even now in the U.S. only about 10 percent of undergraduate degrees in computer science are awarded to women. That’s appalling, especially when you consider that 51 percent of the B.S. degrees in science and engineering go to women. It’s a cultural matter of national urgency, as is the underrepresentation of minorities in computer science and engineering.

Now, that may seem like a personal digression, but the kind of humanities scholarship I have practiced throughout my career focuses on the social and cultural rearrangements occasioned by and influencing the development of new technologies. That includes racial, gender and economic disparities. I entered the discipline of English at an expansive time, when British cultural studies, materialist history, American studies, and the history of the book were key to my training. My book Revolution and the Word: The Rise of the Novel in America analyzes the impact of the new technologies of printing, machine-made paper and ink on the dissemination of books to middle- and working-class readers for the first time in human history. I looked at the late 18th- and early 19th-century creation of libraries and public schools, and the condemnations of pundits and Founding Fathers who worried that the literary entertainments suddenly so popular among an uneducated populace could lead to shallowness, distraction, licentiousness, violence, mob rule, laziness, and an inability to prepare for an adulthood of productive labor. The novel was the video game of the 18th century. And that, perhaps, is the trajectory from “English” to new digital technologies.

The neuroscience interest was developed during my eight years as vice provost for interdisciplinary studies (1998-2006), when I was part of the team helping to create such programs as the Center for Cognitive Neuroscience, as well as ISIS (Information Science + Information Studies); the Center for Interdisciplinary Engineering, Medicine, and Applied Science; and the John Hope Franklin Humanities Institute, dedicated “to the idea that knowledge should be shared.”

Q: Can you briefly explain the "gorilla experiment" and its key implications?

A: Originating in the 1970s with one of the founders of perceptual and cognitive psychology, Ulric Neisser, and then reprised with modern digital cameras in 1999 by Daniel Simons and Christopher Chabris, this famous experiment centers on a video of six people passing a basketball, half of them in white shirts and half in black. Subjects are asked to count how many times the ball is passed only to and from those wearing black, not white, and then are quizzed on the number of passes they counted. What over half of subjects in a normal testing situation miss is a woman in a gorilla suit who walks in among the tossers for a full nine seconds, stares into the camera, and walks away. The experiment is designed to show us what we normally cannot see about ourselves: how paying attention in a focused way requires us to shut out everything else -- even a gorilla.

For neuroscientists, this is “inattentional blindness” or, more simply, “attention blindness,” a structural principle of the human brain. Newborn infants don’t know what to pay attention to. They learn from those around them what is or is not worth paying attention to through example, encouragement, reward or its withholding, as well as through actual instruction and language-learning. We learn how to pay attention so seamlessly and continuously that we are not even aware that we are always experiencing only a small part of what is going on around us. The more focused, concentrated or specialized (i.e., expert) we are, the more we miss. In Now You See It, I extend and address that basic principle of attention. Since some people do see the gorilla, we can construct collaborative situations among explicitly diverse, heterogeneous team members with different abilities, experiences, cultural biases, expertise, and intentions, and then, together, enhance our ability to see things we normally miss. This happens to be the principle of heterogeneity used in open source web development, what developers call needing many “eyeballs” to find bugs in the system. It’s a method I call “collaboration by difference”: “I’ll count -- you take care of that gorilla.” With the right tools, the right partners, and with a dedication to open processes and diverse peer contribution, we can see better and more than we can on our own.

Q: You're critical of recent studies on the impact of multitasking. Why is that, and how do you think the topic could more fruitfully be studied?

A: In many of the hand-wringing and fear-mongering studies reported in the popular press, “multitasking” as a category can be ill-defined, vague, imprecise and subjective. I find it exasperating and belated. We now know there is no such thing as monotasking on a neurological level. Neurons are always firing and the brain is constantly chattering to itself, calling upon different areas at once to respond in ways we are only now beginning to understand. Lately, neuroscientists have begun to take ancient Eastern practices of meditation seriously, since those are founded in the principle that a quiet, resting, contemplating mind is ever-susceptible to distraction: that’s why meditation is a lifelong pursuit, not something you can do just by closing your eyes. The studies that try to make it seem as if multiple external distractions (new e-mail or digital environments) are somehow, in themselves, harming our very brains or shrinking our capacity to pay attention miss the point.

Of course these new practices change our brains in some ways -- that’s what learning is. And of course shifts in cultural tastes and habits (novels, movies, T.V., video games, etc.) happen all the time. Yet the gloom-and-doom studies make it seem as if we’ve never experienced multiple tasks at once before. They often measure the dire results either through self-reporting (which the gorilla experiment shows us is notoriously inaccurate) or through controlled laboratory experiments that have little to do with how we live our lives. Ask an insurance adjuster and you’ll hear that, of course, teens texting while driving can have accidents, but if you really want to protect your kid, don’t let him drive with friends in the car. Similarly, the distractions that most often lead to accidents are the fight with your lover, a tenure meeting, hearing a frightening medical diagnosis, or anticipating a job interview. We tend not to think of those things as “multitasking.” Yet physical and emotional distractions -- heartburn or heartache -- are far more distracting than anything the modern office can throw at us.

History is also useful here. Legislators wanted to prevent Motorola from putting radios in dashboards in 1930 because they feared car radios would lead to highway catastrophe. Now we know the distraction of the radio helps truckers counteract the monotony of long-distance driving.

When we say “multitasking is bad,” what we are really saying is that certain things are stressing us out and making us suddenly aware of behaviors that used to be so reflexive we didn’t even pay attention to them. We see the gorilla, as it were. That’s not always a bad thing. On the other hand, if the issue is that Americans are working too hard at our jobs -- and we’re now working more hours per year than our parents or their parents did, and more than anyone in the world except South Koreans -- then we should be addressing that real problem. Work speed-up and overload have social, economic, political and indeed cognitive consequences. In this situation, multitasking is the smokescreen for a much larger societal problem.

Q: Why is it, in your view, that "our education system is slipping in comparison to our needs"?

A: Everything about our institutions of school -- from kindergarten to graduate and professional schools -- has been systematized, regularized, and standardized to maximize the form of productivity prized by industrialization. Whether you are a worker fulfilling one function on an assembly line or a middle-manager in a corporation, you have a job defined by others. From the mid-19th century onward, school has been increasingly designed to train us for a hierarchical Fordist model of efficient productivity based on expertise and position. Laws require us to start school at age 6 (whether ready earlier or later), schools start each day at the same time (that school bell!) and kids sit and even walk in neat rows, with learning divided into discrete subjects. With our national educational policy of No Child Left Behind, kids even take the same end-of-grade item-response multiple-choice tests, a form of testing for “lower order thinking” developed in 1914 to mimic the efficiency of the assembly line then producing Model Ts.

Around 1995, the Internet became available to the general public, and suddenly any kid with access to a computer could find information from anywhere on the World Wide Web, including a lot of information offered up by amateurs. Some of it is reliable, some ridiculous, and there’s no teacher or librarian in sight to dictate which is which. Google doesn’t yield A, B, C, or D, but all of the above. Kids need to learn the skill of assessing which of thousands of plausible answers to self-defined questions might be credible, but we haven’t restructured formal education around this new way of learning. They also need to learn how to contribute reliably (and, of course, safely) online. Currently, we have a mismatch between our institutions of learning and the exciting informal ways kids learn online and, for that matter, all the new ways that, as adults, we all work, communicate, and learn together online -- distributed, process-oriented, collaborative, decentralized, peer-driven, crowdsourced. To put the matter in its most general terms: we’re educating youth for the last century, not the one we live in.

Q: "The question," you write, "isn't which is better, the past or the present. The question is, given the present possibilities, how can we imagine and work toward a better future?" Your book gestures toward massive overhauls of both the classroom and the workplace. In the face of such ambitious goals, how can we begin to work toward a better future -- i.e., what do you see as some achievable first steps?

A: First, it’s not the future. It’s the present that has changed. We have all gone through massive transformations of our work and social life in less than two decades, and we’ve done an amazingly good job of it at one of humankind’s most dramatic moments. So the first thing we have to do is stop worrying so much. We’ve changed, and we’re not Martians. Sociologists tell us that the cohort of students entering college this year is the least alienated, most family- and socially conscious, most politically engaged, friendliest, least drug- and alcohol-addicted, and least violent generation since World War II.

Once we can breathe in and relax about that, we can then see our way to some really positive potentials. Kids love games. Let’s work on new and better ones and use some of the inspiring, challenging game methods to encourage learning, innovation, problem solving, collaboration, and the other skills that help us navigate the 21st century. In the workplace, let’s demand software that helps us sort out our personal life from our work life. The physical separation of leisure and labor is a hallmark of industrialism; now we need routines to sort the integrated information and workflow arriving on our laptops and mobile devices in one undifferentiated, simultaneous stream, 24/7. We are 15 years in, so we’re right on time to begin thinking about how to make useful changes that will help us in our everyday life and work. Once we develop the tools, we can then work for equitable global work conditions, safety standards, saner ways of accounting for our (over)working hours, and other safeguards that workers fought for in the 20th century and that have pretty much been thrown overboard in the 21st.

Q: The book expresses some skepticism about the way that learning disabilities such as ADD are diagnosed and treated. Can you talk a little bit about that, and what you think a better approach might be?

A: If a quarter of the students coming into our finest universities this year have been tested for, diagnosed with, or even medicated for a “learning disability,” then it is long past time we thought about what we mean by that term. Very few people have an attention deficit in all subjects. The same kid who can’t pay attention in math class might be up playing video games all night. So we need to think about how to make learning more enticing to more kids. When we narrow the curriculum, we also shrink the realm in which kids can achieve, meaning fewer kids with diverse talents are likely to succeed. That daydreamer who draws like a young Picasso? Without art in school, she’s just a loser. The brilliant electronics student who can rewire the whole family house is “slow” in a school without a shop class or a computer lab where he can shine. There is also an increasing mismatch between the skills we measure as “achievement” in school and the skills kids learn at home online. So boredom and cynicism enter in. And as college costs more and more, we have the fatalism of those who know they will never be able to afford tuition anyway, so they give up before they can be disappointed. Finally, there is the issue of “economic disability,” which to my mind is far more a national crisis than learning disabilities are. The gap between rich and poor maps onto educational achievement and failure with diabolical accuracy.

I’m not saying there aren’t physiological factors in addition to narrower definitions of achievement, the mismatch with the real ways kids learn outside of school, and economic disparities. Some kids are having a hard time sitting still, but that’s not a surprise given how much we police and restrict kids, and how little exercise they are allowed -- whether walking to school or on the playground (not to mention those who are given T.V. or the computer as a babysitter). Boys especially are falling behind, and zero-tolerance policies contribute to that. We’re also requiring them to spend more and more time in their seats and assigning more hours of tedious homework. It’s a recipe for inattention and therefore failure.

Q: What are some of the ways that you've applied ideas and research about attention and learning in your own classroom?

A: I rarely lecture anymore. I structure my classes now with each unit led by two students, who are responsible for researching and assigning texts and writing assignments and who are then charged with grading those assignments. The next week, two other students become our peer leaders. Students learn the fine art of giving and receiving feedback and of learning from one another. I structure midterms as collaborative “innovation challenges,” an incredibly difficult exercise that is also the best way of intellectually reviewing the course material I’ve ever come up with. In other words, more and more I insist on students’ taking responsibility for their learning and communicating their ideas to the general public using social media. If you want to learn more, you can find syllabuses and blogs on both the HASTAC and DMLCentral sites. I posted about “This Is Your Brain on the Internet” and “Twenty-First Century Literacies.” I also led a forum on interactive pedagogy in large lecture classes.

Q: What strategies would you recommend for other faculty members who might be interested in doing so?

A: In doing the research for Now You See It, I found really creative teachers and just ordinary people who had discovered inspiring new ways of learning together, and I pass those stories on to my readers. There’s Dennis Quaintance, a developer in Greensboro, North Carolina, who decided to “go green” and who gathered all his traditional, local contractors and crew together and set them all a challenge to learn together and teach one another about sustainable development. His Proximity Hotel ended up winning the only Platinum LEED Award given to a hotel in the U.S. “The crazy thing?” he said when I asked him how he had accomplished this amazing result. “It wasn’t even that hard.”

We’re 15 years into a transformation in how we communicate and interact that historian Robert Darnton insists is the fourth great Information Age in all human history, beginning with the invention of writing in ancient Mesopotamia. We’re all learning how to do this together. The crazy, amazing lesson I learned over and over in writing this book? It isn’t even that hard.
