The Institute of Electrical and Electronics Engineers, the world’s largest technical professional organization for the advancement of technology, took flak Monday when one of its staff historians offered a crude critique of a book he later admitted he hadn’t read.

Such comments “insulted your scholarship, & embarrassed IEEE & my colleagues as well as me,” the outreach historian, Alexander Magoun, later tweeted at the book’s author, Safiya Umoja Noble, an assistant professor of communications at the University of Southern California. “Please forgive me. I will learn from this experience, & your book when it arrives.”

Noble, who researches digital media platform designs and their impact on society, recently wrote a book on how major search engines such as Google can exhibit the same kind of racism seen in society at large. It’s a topic she’s been writing and lecturing about for years, and her forthcoming book from New York University Press, Algorithms of Oppression: How Search Engines Reinforce Racism, has generated buzz among scholars of information science, machine learning and technology in general, and among sociologists.

Algorithms of Oppression doesn’t argue that search engines decide to be racist, but that they are programmed by and learn from people who pass on their own biases. And, perhaps most importantly, those learned biases are then passed on to people who use the internet.

“Run a Google search for ‘black girls’ -- what will you find? ‘Big Booty’ and other sexually explicit terms are likely to come up as top search terms,” reads a blurb about Noble’s book on Amazon. “But, if you type in ‘white girls,’ the results are radically different. The suggested porn sites and unmoderated discussions about ‘why black women are so sassy’ or ‘why black women are so angry’ presents a disturbing portrait of black womanhood in modern society.”

The blurb details one example of many discussed in Noble’s book. But Magoun, of IEEE, apparently took it as a literal invitation to run the experiment and googled “black girls” and “white girls.” Sharing his findings on Twitter from the @IEEEhistory account, he said he was not immediately convinced of Noble’s argument about embedded biases.

The response was fast and furious. A number of scholars suggested that Magoun seemed to be “mansplaining” Noble’s own argument to her, or at least to feel comfortable enough to publicly challenge years of research -- hers and many other scholars' -- by googling a few things. Others pointed out that search engine algorithms change all the time, and that what Magoun saw in one search might not be representative of broader trends. Some said his judgment was off, or even racist. And several followers said Magoun had inadvertently alerted them to the book -- and that they’d preordered a copy.

“Wow! Umm you do realize that Dr. Noble's work for the past 6+ years has likely impacted how Google approaches their search algorithms. Way to immediately disregard years of scholarly work by demonstrating the white patriarchal frame of your professional org,” read one tweet, for example.

Magoun, who did not respond to a request for comment, offered a few follow-up arguments, including that he was questioning the book's marketing rather than the research itself. But a day later he apologized. He also said he'd ordered a copy of Algorithms of Oppression.

A number of critics have faulted not just Magoun but IEEE, a respected scholarly association of engineers and scientists, for allowing such a tweet to pass through a formal organizational account.

Monicka Stickel, IEEE spokesperson, said Monday that the message was “an unauthorized use of an IEEE account," which “is being addressed internally.”

Noble said via email that in a field where black women “are profoundly underrepresented, the tweet was an unfortunate incident. I accepted the apology with the hope that we can redirect our energies to the content of the book and its potential for impact.”

The more important issue, Noble said, “is the overwhelming evidence that platforms like Google, Twitter and Facebook are deeply impacting democracy. This is hardly controversial. What I offer is a study of technology on people who are already dealing with marginalization.”

Kevin Seeber, a foundational experiences librarian at Auraria Library in Denver who works with first-year community college and university students in the area, was among those who defended Noble’s scholarship on social media. Seeber said Monday that he still uses a 2012 article of Noble’s to teach graduate students in library and information sciences about algorithmic bias, and that he’s heard her present dozens of examples of such bias at academic conferences.

Saying that he and his colleagues in library and information sciences have been awaiting Noble’s book, Seeber drew a parallel between her work and the recent incident.

Noble's research looks at how search algorithms often “reflect the views, values, and biases of the people who create and use them," he said. Google’s search algorithm in particular incorporates user behavior into its relevancy ranking, “so what people think and do on the internet reflects what appears in the results. Consequently, the algorithm dismisses, objectifies or otherwise erases women, especially women of color.” And the comment from the official @IEEEhistory account calling into question Noble’s research methods and central findings, he said, “was a textbook example of dismissal.”

What made it so “galling,” Seeber added, is that the @IEEEhistory account “reflexively dismissed a scholarly monograph written by [a woman] of color when the book is exactly about that very topic -- the devaluation of women of color on the internet. The fact that the tweet's author evidently based this criticism on an Amazon blurb only further demonstrates the need for this book to exist.”

Everyone needs to be more aware of how the internet can “impact us differently, and that people who are already marginalized by society experience that same marginalization online, even more acutely,” he said.
