
A year ago, on April 30, Google announced:

“Today, we’re taking additional steps to enhance the educational experience for Apps for Education customers:

  • We’ve permanently removed the “enable/disable” toggle for ads in the Apps for Education Administrator console. This means ads in Apps for Education services are turned off and administrators no longer have the option or ability to turn ads in these services on.
  • We’ve permanently removed all ads scanning in Gmail for Apps for Education, which means Google cannot collect or use student data in Apps for Education services for advertising purposes.” 

This announcement presumably came about to forestall attention from the May 1, 2014, publication of the White House Big Data Report, which warned of potential abuses of student data privacy. For some years, and with growing momentum in the preceding months, Congress had held hearings on the subject, parents had been raising critical questions about how school districts were managing their children’s privacy, and research was emerging that suggested significant gaps between the Family Educational Rights and Privacy Act (FERPA) and the practices of technology companies in the education space.

I did not feel assuaged by Google’s promise then, and I do not now. First, their “don’t ask permission, beg forgiveness” approach had already become hackneyed to anyone who observed how they navigated F.T.C. investigations into Street View, Buzz, and the Safari by-pass. Second, their clever-by-half tactic of talking about ads when the most pressing issues are data-mining and profiling fell flat with this audience of one. Finally, Google still has not presented any verification that on that day, or at any time since, it stopped data-mining and profiling in Google Apps for Education (GAFE).

Consequently, I have written a formal paper on this subject entitled “Student Data and Blurred Lines: A Closer Look at Google’s Aim to Organize U.S. Youth and Student Information.” I will present it at a Berkman Center forum on student data privacy on May 20 and eventually work it into a book on the culture, law, and politics of the Internet in higher education that I am publishing through Cornell University Press.

It is important for people to understand that Google is an advertising company: approximately 95 percent of its revenue comes from ads. Advertising is predicated on targeted marketing, and targeted marketing requires finely tuned, precise information about the individual to whom it is directed in order to maximize the probability of a response. Apart from monitoring behavior in real life (for which we now have wearables), what greater means is there of knowing who a person is and what they want or like than mining their email and online activities? Given the broad array of apps that Google sponsors to complement its monopoly on search, is it any wonder that its commercial success rides on these fundamentals of its business model?

I remember clearly the day, some years back, when operations personnel, legal counsel, and I sat in a room discussing Cornell University’s decision to adopt Gmail for students. We puzzled over how Google could provide it “for free.” Naïve, and too busy to think it all through, we settled on the notion that the company sought out higher education to get youthful eyes on its site. As Google gobbled up more and more of the higher ed market, I grew more curious about what else might be up the sleeve of that aggressive, competitive approach. I now know what motivates it: the extraordinary profit garnered from information gathering, personal profiling, and advertising.

What remains a curiosity is what, if anything, the federal government is going to do about it. Two weeks ago, the European Commission swung into action with a Statement of Objections regarding Google’s search business practices; it also opened an investigation of Google’s Android products. Lingering questions about privacy remain, in Europe and in the United States, underscored by the fact that the F.T.C. voted against an investigation notwithstanding a strong recommendation from its staff to pursue one.

In higher education, and in the education sector overall, FERPA’s legal obligations and the ethical considerations raised by the particular vulnerability of young people in learning environments heighten concern and should sharpen our focus. As teachers, administrators, and leaders, we have a responsibility to treat this matter with the utmost seriousness and critical scrutiny. That responsibility means asking hard questions about what Google, as well as every other education technology service, is doing with information garnered from education records and student information. That line of inquiry demands greater transparency about business models and accountability for contractual promises. In short, it boils down to genuine, not rhetorical, informed consent. And it requires that any “mistakes” of the past, such as profiles constructed on current or former students, be destroyed in a verifiable process.

In a year of tremendous growth, we can celebrate Google’s innovations along with advancements in education technology. In the meantime, with one year gone by since Google announced that it would no longer scan student data for advertising purposes, we are left with unanswered questions at best and deception at worst. Until those questions are addressed, educational institutions have reason to proceed with caution.
