Kids these days! If the technologies students use -- and sometimes abuse -- add up to an overwhelming jumble for some of the professors who teach them, John Palfrey and Urs Gasser have written a book that they hope will bridge the generation gap, at least when it comes to understanding the different habits, learning styles, and ideas about privacy attributed to so-called "digital natives." Their book, Born Digital: Understanding the First Generation of Digital Natives (Basic Books, 2008), covers much of the territory mined at Harvard University's Berkman Center for Internet & Society, where Palfrey is a faculty director, and is part of its ongoing Digital Natives project. Palfrey, a professor and vice dean at Harvard Law School, and Gasser, a professor of law at the University of St. Gallen, in Switzerland, and a Berkman fellow, answered questions via e-mail on whether professors should ban Internet access in the classroom, the ongoing evolution of libraries, and whether students are learning differently thanks to new technologies.

Q: Whom are you writing this book for?

JP: The target audience for this book is parents and teachers. We think that these two groups have a great deal to offer young people in the way of guidance, but too often are not giving it to them. One big impediment is that many parents and teachers perceive that they do not "understand" what their children and students are up to. We hope that this book may serve as a practical guide -- grounded in the best research we could find -- for parents and teachers of those Born Digital. Today, that means pretty much everyone from grade school through higher ed in wealthy countries.

Q: Professors at some law schools have tried to ban Internet access during class, hoping students would stop multitasking and pay closer attention. Are efforts like this part of a losing battle? Are there ways for professors to "take control" of technologies their students already use for the benefit of the classroom experience?

JP: We are big fans of technology in the classroom, but we also believe that there are times when law teachers should say "laptops down." As with any highly disruptive technology, there are good uses and there are high costs associated with misuse. At Harvard Law School, if one of the extraordinary Socratic teachers -- I am thinking of Elizabeth Warren or Bruce Mann, for instance -- wants no computer use in a first-year course, that makes all the sense in the world. If a cyberlaw professor -- at HLS, Jonathan Zittrain or Charles Nesson -- wants to experiment with technology in the classroom, that's great, too. There are hard questions associated with when and how to use technology in support of pedagogy in the law school curriculum. There's huge opportunity associated with digital technologies, but there are also equally big risks. Our argument is that we need to get in front of both.

UG: Judging from our own teaching experiences, bans are usually not a good way to improve students' learning experience. To be sure, we agree that there are moments -- or even whole sessions -- when it's much better for learning to go offline, close our laptops, and engage in an intense Socratic-style dialogue with the professor or a structured discussion with peers about -- say -- an important precedent. But a ban can be counterproductive: students can easily switch from using their laptops on the table to their iPhones under the table. Instead, educators might want to discuss openly with their students the tricky question of when and how information technology is good for learning -- and when it is not. We usually ask our students about their own learning experiences, what they think about the use of the Internet in the classroom, and where they see benefits and challenges. Often, this sort of conversation is the first step toward a joint decision about when to go offline in the classroom. But we should spend the same time and energy thinking hard about good and productive ways in which the Internet can be used as a teaching and learning tool. Schools and universities should encourage their faculty to experiment with online tools in and outside the classroom, and to exchange their experiences -- the successes and the failures -- with colleagues and students alike. In this way, combined with the findings of psychological studies, among others, we'll develop over time some sort of best practice for the use of online technologies in the classroom.

Q: The intersection of education and technology is riddled with gimmicks that never worked and unfulfilled promises to revolutionize the way students learn. What are realistic expectations for the application of technologies to learning, and what are the potential advantages and disadvantages?

JP: Technology is never a panacea. And technology on its own can do nothing; it's just a tool for teachers and students to put to work in support of how they want to teach and to learn. A realistic expectation is that technology may be able to help support your pedagogical goals, but it's not going to (nor should it) do anything on its own.

A key advantage of using technology in education is that, through its use, we can give young people the digital media learning skills that they need. Right now, we are not teaching young people to sort credible information from less credible information online, despite the proliferation of sources and the extent to which we know young people are relying on such sources. Technology can also be very engaging and interactive and -- truly -- fun for young people to use as they learn.

The disadvantages could be many: over-reliance on the tools to do the teaching, the potential for distraction, and use of technology at the expense of sometimes better forms of learning (such as reading an entire book).

Q: Amid all the talk of students' loss of attention spans and the "sound bite" culture, there's a real discussion about whether digital natives are being raised to learn -- even think -- in ways that are different from previous generations. How much has this been studied, and should we worry?

UG: From what we've seen, there is little evidence that the Internet fundamentally alters the basics of "learning" as such. Remember, the Internet is a relatively new phenomenon compared to the time it took to build out our brains as the basic human apparatus devoted to learning. While it would be surprising to see short-term changes in how learning happens in our brains, it's also quite obvious that the Internet has an impact on what we learn, how we engage in learning activities, and in what communicative contexts. For example, digital natives gather information -- a building block in any learning process -- through a multistep process that involves grazing, a deep dive, and a feedback loop. Digital natives are good at grazing through the vast ocean of information online. While browsing the Web, a digital native might decide to go beyond the headlines of a story and take a deep dive, for example by following a hypertext link, listening to a commentary, or downloading a video clip on the topic of interest. In this way, she is searching for what's behind the bit of information that got her attention in the first place. The feedback loop, finally, involves some sort of enhanced interactivity with the content she's interested in. She may decide to share the information with friends and family, post a comment to her blog to critique the story she just learned about, or share thoughts on a mailing list. The form of a digital native's feedback loop varies, but her level of engagement with information and the world she lives in tends to be higher than that of previous generations. We're optimistic that these features are generally good for learning.

Of course, there are also dark sides that we need to take into account. No doubt, Internet addiction and information overload, for instance, are bad for learning. On balance, it's probably too early to make a final statement about whether all of the changes associated with the digital revolution are good or bad for learning. There are many researchers out there -- neuroscientists, psychologists, pedagogues, and so forth -- studying these implications in greater detail. But it seems fair to say that there is plenty of reason to be optimistic -- and not to focus too narrowly on the worst-case scenarios that so easily make for good headlines.

Q: Does being exposed to many and varying media, including multiple sources on the Internet, make students think more critically about the information they consume? Or are digital natives increasingly used to trusting what they see online, so much so that a "cut-and-paste" culture is becoming a threat to educational ethics?

UG: From our interviews, we've learned that many digital natives place high trust in the pieces of information they find online. Only a very small number of the kids we've talked to, for instance, are aware that the hugely popular online encyclopedia Wikipedia can in fact be edited by any Internet user. By and large, the kids we've spoken to assume that Wikipedia is a credible source of information and that all entries are accurate. Younger kids in particular rely on quite surprising clues to make quality assessments online. The color of a Web site, for example, or the amount of text it displays is frequently used as an indicator of the quality of its information. It's rather obvious that these features are not necessarily reliable proxies. However, among the digital natives who spend a lot of time online, we see a different pattern emerge: They tend to be more skeptical when it comes to online information, and they usually visit more than one site to check whether the information found on a given Web site is credible. It's somewhat counterintuitive: The more time kids spend online, the better their skills at making sound quality judgments. But it's also important to understand that it's generally challenging for children to assess the quality of information, regardless of whether it's on- or offline. This has much to do with the ways in which children are hard-wired. Depending on their stage of development, their brains are not yet equipped to make the same sort of careful evaluations that adults can make.

With regard to the "cut-and-paste" question: We can indeed observe an increased level of interactivity between digital natives and content compared to older generations. Recipients are no longer passive receivers of information, but increasingly active users. The level of interactivity -- of what kids do with content -- ranges from simple cut-and-paste on one end of the spectrum to much more creative uses on the other, including the making of mash-ups, where, for instance, video footage is combined with a song from a different source. While only a small percentage of digital natives use digital technologies in the most creative ways, we believe that the Internet has enormous potential for creative expression that should be embraced and can lead to a participatory culture. To be sure, many of these forms of "doing things with content" have legal and/or ethical implications. It's therefore important to educate children about the basic dos and don'ts when they use online content for their own purposes. Educators and parents have to work together to engage our children in a conversation about information ethics and teach them the principles of copyright law. It's important, however, that we teach our kids not only what they are not allowed to do, but also what they can do with content in ethically sound and lawful ways. At the Berkman Center, we're currently developing such a balanced educational tool for children and teachers.

Q: How should libraries adapt to the changing ways that students (and faculty) do research?

JP: Libraries are adapting every day to changes in research methods. At the Harvard Law School Library, we're updating our Web site in response to extensive focus groups that the reference staff conducted with students. The site is oriented toward those research tasks that we know start in the digital world, much as a Google search is the first stop for many young people on their way to finding information.

There's much more to be done, of course. One key is to figure out how best to acquire, catalog, and make e-resources accessible to users. Right now, most libraries are set up to do a great job of acquiring, cataloging, and offering books for use by students and faculty members, but are not organized to handle e-resources. We need to teach students and faculty how to make use of both rivers and oceans of information. A lot of good, innovative work is going into solving these issues. I'm sure libraries will adapt.

Another key area of adaptation has to do with the growing interdisciplinary nature of research and learning. More fields are becoming interdisciplinary, but libraries at universities are often stove-piped, much as the schools themselves are. So we need to offer not only research materials but also support for research methods, such as empirical work in law schools.

Q: What role, if any, should colleges play in educating their students in basic digital literacy?

JP: Colleges have a major role to play in teaching digital literacy, but I'm not sure there's anything basic about it. I think the things we need to teach are very hard. Some of them are about new research and learning methods. Others are about how people relate to one another and to information. We also should teach accountability when it comes to copyrighted materials, as well as the rights that young people have to remake digital content in creative ways. This kind of education should not be a standalone "computer class," though -- it needs to be well integrated into what young people learn as they go through the education system and into college.
