
The telltale rapid clicks of laptop arrow keys gave away my distracted student from 30 feet away.

So engrossed was he in a 1980s role-playing game that he barely noticed when I leaned in to whisper how entirely inappropriate his behavior was during my digital humanities class at Dartmouth College. While a noted visiting speaker on technology and culture held forth on participatory culture and Wikipedia, topics in which my students had expressed an avid interest, I was shocked to see him and many others openly engaged with their Facebook pages.

Why would he play a game in a class he insisted he enjoyed? He had been playing the game before he went to bed, so when he opened his computer in class, "the game was just there, so I started playing," he explained to me during office hours. He didn't intend to check out from a class he liked.

But he did.

The incident was particularly ironic because in the previous class we had discussed the myth of multitasking and the work of the Stanford University professor Clifford Nass. Some class members can manage themselves with their technology; about a third cannot. Studies show that multitasking makes us poor learners. It not only hurts the multitasker by splitting focus and attention, but it also distracts those sitting nearby and lowers everyone's performance on every task. While millennials may think they have better multitasking chops than older generations, the data show that assumption to be false. Unfortunately, science tells us the human brain is not built to multitask and succeed, even when people truly believe they are good at it.

As a digital humanities professor, I spend many waking hours communicating and creating on a computer. I deeply understand the exciting possibilities offered by digital tools because I spend a great deal of time designing them: in fact, I have designed a class that minimizes lecture time in favor of engaging activities involving mapping, making diagrams and hosting debates about virtual experiences. I also know there is a time and place to engage with actual people. There is rising concern among faculty members across the country about the need to examine classroom culture around technology so that students might actually benefit.

Clearly, the lure of the laptop is too compelling to resist. 

Some people might say we should use technology for activities that “flip” the traditional lecture class. In line with this thinking, I have run hands-on activities one to two times a week with great success. During those activities, students are focused and use technology to further learning.

But most days, there comes a time when faculty members or guest speakers actually speak, dialogue happens or provocative points are raised. It is then that students with technology-control issues immediately check out and check into Facebook, online games or shoe shopping. Unless they are directly involved in a hands-on activity for which they will be publicly accountable by the end of class, it is much easier to give in to the presence of technology and lose the experience of direct engagement.

Is this chronic narcissism? Or is this phenomenon a desire to escape the confines and taxing nature of concentration? Does this "checking out to check in" represent an insatiable need for immediate news in the "fear of missing out"? In 2013, an international team of researchers designed a way to measure fear of missing out and found people under age 30 were more affected, as were those who reported low levels of connectedness, competence and self-autonomy. This research supports earlier findings that the lonely and bored were more apt to rely on social media and feel left out if they miss out. 

Yet students also miss out if they cannot listen and engage with the world in front of them. The presence of technology in a class may not work unless the active learning is active all of the time or, I would suggest, our culture changes such that the live, in-person exchange is more valuable than whatever the student is doing with his or her own technologies.

In higher education, themes of dialogue, listening and presence are a core part of the college experience. We know from research that employing "embodied cognition," that is, learning through all of the senses, is a more holistic and effective way to learn. So much of the human experience is reflected in a face-to-face meeting, where body language, expression, tone of voice and the presence of others create a whole-body experience. If higher education is to continue its mission of supporting experiential learning, we may need to reestablish forms of learning centered in bodily experience and lean a little less on technology to transform ourselves.

We need a culture change to manage our use of technology, to connect when we want to and not because we psychologically depend on it. Enough is enough. We need strategies for unplugging when appropriate to create a culture of listening and of dialogue. Otherwise, $20,000 to $60,000 a year is a hefty entrance fee to an arcade.
