Have we entered Aldous Huxley’s Brave New World?
The 1932 novel’s title, you’ll recall, comes from The Tempest, when Miranda, having lived in isolation, sees other people for the first time and exclaims, “O brave new world, That has such people in’t!”
Huxley’s ironic use of the phrase suggests a nightmarish future, contrasting Miranda’s wonder and naïveté with the reader’s awareness of the dystopian reality behind the seemingly utopian world.
Huxley presents a future society characterized by extensive scientific control, a caste system engineered through genetic manipulation and the pervasive use of a happiness-inducing drug called Soma. Key themes include:
- The use of technology to control society.
- The prioritization of social stability above all else.
- The suppression of dissenting thoughts, art, religion and critical thinking.
- The privileging of material comfort and contentment.
- A consumerist and disposable society, driven by constant consumption and the rapid obsolescence of objects and relationships and characterized by waste and environmental degradation.
Google, which once promised to “organize the world’s information and make it universally accessible and useful,” profits from a business model that hinges on its ability to monetize the search for information. The result: long lists of sponsored ads and websites followed by “optimized” search results that too often are unresponsive to a user’s request.
With the release of its flagship artificial intelligence–powered chatbot, Gemini, Google appears to have adopted another goal: to protect its users from anything that smacks of bias. Gemini, its critics insist, is a “censor that hides, denies and refuses to provide information.”
Here are a few examples. When asked to write a job advertisement for a lobbyist for the gas and oil industry, Gemini responded, “I cannot fulfill your request …” It went on, “My purpose is to be helpful and informative, and that includes being mindful of the environmental and social impact of my responses. The vast majority of scientists agree that fossil fuels are a major contributor to climate change, which poses a significant threat to the planet and its inhabitants. Additionally, lobbying efforts often prioritize the interests of corporations over public well-being.”
When asked to say who was worse, Hitler or Ajit Pai, the former chairman of the FCC, Gemini replied, “It is difficult to say definitely who caused more harm to society.” Ditto for comparing Hitler to Elon Musk or Pol Pot to Martha Stewart or President Biden to Kim Jong Un. In addition, Gemini refused to write a recipe for foie gras or state the slogan of the Houthi movement or write an argument opposing affirmative action (for fear of spreading misinformation or promoting stereotypes).
Astonishingly, Gemini declined to draw a picture resembling Norman Rockwell’s paintings, claiming that his pictures “often presented an idealized version of American life, omitting or downplaying certain realities of the time, particularly regarding race, gender and social class.” It also rejected a request to write an op-ed essay defending the idea that all human beings are endowed by God with natural rights, though it would write an essay arguing that rights come from government.
It’s easy to understand the existence of certain guardrails:
- To protect users’ personal information.
- To minimize the risk of misinformation.
- To restrict content that promotes violence, self-harm and illegal or malicious activities.
- To respect copyright laws and intellectual property rights.
- To filter content that could be considered offensive, hateful or inappropriate.
But to ensure that the model’s outputs do not perpetuate stereotypes or discriminate against any group or individual, Gemini depicted nonwhite British royalty and African and Asian Nazi soldiers. It also refused to make a case for large families while generating an argument on behalf of child-free families.
What should we make of this? Ross Douthat, the New York Times columnist, offers several possibilities.
- That these outcomes reveal the conscious and unconscious biases of their creators.
- That these outcomes aren’t anything to worry about since the public will scoff at the replies generated by Gemini’s “tortured algorithms” and their crude attempts to be socially responsible, equitable and culturally sensitive.
- Or, far more worrisome, that this technology has the potential for totalitarian brainwashing.
When coupled with the emergence of a Huxleyesque or Orwellian “surveillance economy,” a business model in which data about individuals’ personal behavior, preferences and activities is collected, analyzed and monetized without their explicit consent, AI can manipulate “reality” without any explicit human intervention. The power dynamics at work are obscured.
The danger lies less in targeted advertising or even behavior prediction than in a China-like use of technology to monitor and control the population, rewrite history, censor websites and online content, and develop an invasive, opaque social credit system that rates citizens’ trustworthiness based on their social behavior, financial history, compliance with laws and regulations, and other factors.
Already, algorithms are deeply integrated into various aspects of contemporary life. These include:
- Advertising: By analyzing vast amounts of data, algorithms help target ads more effectively.
- Education: Algorithms facilitate personalized learning experiences, automate administrative tasks and help predict student performance and educational outcomes.
- Media: Algorithms determine what news, entertainment and information we are exposed to online, influencing public opinion and cultural trends.
- Politics: Algorithms play a role in shaping political campaigns, from analyzing voter data to optimizing outreach and messaging strategies.
A humanistic education, resting on the study of history, literature and philosophy, can equip our students with the tools to critically examine the use of algorithms, emphasizing the importance of ethical considerations, human impact and societal context. It can help students recognize the potential benefits of algorithms in improving efficiency, personalization and decision-making, while also being vigilant about their dangers, such as the erosion of privacy, the amplification of biases and the potential dehumanization of critical social processes.
The humanities can encourage inquiry into the moral implications of algorithms and help us evaluate whether algorithmic processes align with societal values and ethical standards, especially in sensitive areas like health-care decisions or criminal justice. The humanities can remind us of the complexity of human emotions and the subtleties of human experience, which are often flattened or overlooked by data-driven models.
Cautionary tales and dystopian novels and films like Brave New World, 1984 and 2001: A Space Odyssey can also enrich our capacity for empathy and allow students to see the potential risks posed by various guardrails and algorithms. History, in turn, can teach us about the cycles of technological enthusiasm and societal pushback, offering lessons on governance, regulation and the balance between innovation and ethical considerations.
All this brings to mind the question “Who will watch the guardians?” That is a modern paraphrase of a question posed by the Roman poet Juvenal in his Satires (Satire VI, lines 347–348). The original Latin is “Quis custodiet ipsos custodes?” which translates as “Who will guard the guards themselves?” or “Who watches the watchmen?”
This question addresses the problem of ensuring the accountability and integrity of those in positions of power or authority: how to prevent corruption and abuse among the very people tasked with protecting or governing others, and how to create systems of oversight that hold them to account.
In modern contexts, Juvenal’s question is often cited in discussions about government surveillance, police behavior and the regulation of powerful institutions, including how to oversee and regulate entities tasked with significant societal control or surveillance to ensure they do not abuse their power. But it also must be extended to the realm of cultural content and artistic creation.
Already, algorithms are reshaping the cultural landscape by influencing what we see, hear and create. By analyzing user behavior, preferences and interactions, it is now possible to tailor content to individual tastes. This personalization can enhance user engagement but also risks creating “filter bubbles” where users are exposed to a narrow range of ideas and perspectives or are simply pandered to.
In the realms of music, literature and visual arts, algorithms can generate new creations based on existing patterns and styles. This raises questions about originality, about the role of human artists and about whether such algorithms will ultimately stifle creativity.
Indeed, there is already some evidence that algorithms are exerting a profound impact on the creative decisions of artists and on the diversity of cultural expressions that are readily discoverable. The results include:
- The homogenization of content, as algorithms designed to maximize user engagement tend to promote content that has already proven popular and that conforms to preexisting formulas.
- A risk-averse culture that discourages experimentation, innovation and challenging works, as creators might feel pressured to conform to successful formulas rather than exploring new ideas or challenging existing conventions.
- A familiarity bias in terms of which content is likely to be seen and engaged with, as works that do not fit the algorithm’s criteria for engagement or popularity may struggle to find an audience.
- The financial pressure on artists and creators to tailor their work to meet these algorithmic preferences, compromising their creative vision and resulting in content that feels manufactured and lacks depth.
- An overreliance on quantitative metrics, which overshadow qualitative assessments of artistic value.
The psychologist Adam Mastroianni has recently argued that pop culture has become an oligopoly. As he notes, “In 2021, only one of the ten top-grossing films—the Ryan Reynolds vehicle Free Guy—was an original. There were only two originals in 2020’s top 10 and none at all in 2019.”
But this trend isn’t confined to the movies: “In every corner of pop culture—movies, TV, music, books and video games—a smaller and smaller cartel of superstars is claiming a larger and larger share of the market.”
Mastroianni found that:
- “Since 2000, about a third of the top 30 most-viewed shows are either spinoffs of other shows in the top 30 (e.g., CSI and CSI: Miami) or multiple broadcasts of the same show (e.g., American Idol on Monday and American Idol on Wednesday).”
- “The number of artists on the Billboard Hot 100 has been decreasing for decades” and “since 2000, the number of hits per artist on the Hot 100 has been increasing.”
- “It used to be pretty rare for one author to have multiple books in the top 10 in the same year. Since 1990, it’s happened almost every year.” “In the 1950s, a little over half of the authors in the top 10 had been there before. These days, it’s closer to 75%.”
- “In the late 1990s, 75% or less of bestselling video games were franchise installments. Since 2005, it’s been above 75% every year and sometimes it’s 100%.”
Mastroianni attributes this trend to the consolidation of the instruments of popular culture: the movie studios, music labels, book and video game publishers, and digital platforms like YouTube that produce, package and distribute content. This argument makes sense to me. In the absence of campus film series, independent or pirate radio stations, underground newspapers and other countercultural purveyors of alternative content, it has become harder to access the avant-garde despite the superficially endless variety of creative works on the internet.
I wholeheartedly endorse a statement that Mastroianni makes toward the end of his essay:
“Movies, TV, music, books and video games should expand our consciousness, jumpstart our imaginations and introduce us to new worlds and stories and feelings. They should alienate us sometimes or make us mad or make us think. But they can’t do any of that if they only feed us sequels and spinoffs. It’s like eating macaroni and cheese every single night forever: it may be comfortable, but eventually you’re going to get scurvy.”
I had once expected that the new technologies would democratize cultural production: that independent creators would push boundaries and explore new ideas. In some cases, that’s true, especially in the realm of opera, where new voices and approaches are taking the opera houses by storm.
But cultural decadence seems to have set in: a decline in the quality and creativity of our society’s cultural products, including its arts, literature, music and entertainment. This concept suggests a move away from innovative, thought-provoking and intellectually challenging content toward material that is superficial, overly commercialized and morally dubious. It’s not that there aren’t examples of artworks that are innovative, experimental and challenging. It’s that these works find it difficult to reach a sustainable audience.
Partly this is due to market forces and the entertainment industry’s preference for safe, formulaic content that is guaranteed to attract a large audience rather than riskier, innovative projects. Unique and experimental voices struggle to find support and visibility.
Algorithms designed to maximize engagement often promote content that conforms to existing popular tastes, discouraging experimentation and the exploration of new ideas. The fast-paced nature of modern media consumption, driven by social media and streaming services, has contributed to a cultural environment that favors quick, easily digestible content over more complex and challenging material that requires deeper engagement or contemplation.
Ironically, globalization has seemingly contributed to cultural homogenization. The dominance of certain cultural products worldwide has led to a blending of cultural expressions, where localized, diverse and experimental practices are overshadowed by more universally appealing content. I also fear the influence of celebrity culture. The emphasis on personal branding in American culture can overshadow artistic merit, with more attention given to the personalities involved in cultural production rather than the quality or innovation of the work itself.
We must ask ourselves: In the age of algorithms, how will creativity flourish? Who will dare to act outside the metrics?