
Outrageous. Offensive. Outlandish.  Mendacious, malevolent, misleading. Dishonest, disingenuous, deceitful.  Skewed, slanted, shoddy. Those are just a few of the words that two Fox News critics used in 2007 to describe Rupert Murdoch’s agenda-driven right-wing propaganda machine in their riotously funny, profanity-laced takedown, Fair and Balanced, My Ass!

A decade later, when the network dropped its 21-year-old iconic tagline, “fair and balanced,” in a bid to distance itself from the network’s sex-scandal embroiled architect, Roger Ailes, observers noted that while the network had shed its venerable slogan, it hadn’t forsaken its partisan editorial tilt, its biased, “blustering commentary masked as analysis,” or its “phony patriotism and piety, its dishonest crusades, its well-defined agenda, and ratings-driven techniques” that failed to meet even minimal standards of news gathering or commentary.

Today, in contrast, defenses of news media objectivity – reporting undistorted by personal beliefs, biases, feelings or prejudices – are much harder to find.

Consider an opinion piece, entitled “Newsrooms that move beyond ‘objectivity’ can build trust,” that just appeared in the Washington Post.  Leonard Downie Jr., the newspaper’s former executive editor and now a professor of journalism at Arizona State, agrees with those who maintain that the principle of journalistic objectivity was always “a distortion of reality” that resulted in “false balance or misleading ‘bothsidesism’ in covering stories about race, the treatment of women, LGBTQ+ rights, income inequality, climate change and many other subjects.”  He argues “that truth-seeking news media must move beyond whatever ‘objectivity’ once meant to produce more trustworthy news.”

Among those who agree with him is Kathleen Carroll, the former executive editor of the Associated Press, who said this about the idea of journalistic objectivity:  “It’s objective by whose standard? … That standard seems to be White, educated, fairly wealthy.”  This idea – that objectivity is an outmoded, unrealistic, dangerous, and misleading illusion – has gained more and more traction, and, not surprisingly, has prompted intense blowback.

Downie acknowledges that “allowing journalists to express opinions on controversial social and political issues” can erode “the perception of their news organizations’ fairness and open-mindedness.”  But the thrust of his argument is that at a time when democratic institutions “are under attack,” reporters’ identity, values, and lived experience should be allowed to inform their coverage.

Yet Downie’s piece ignores a telling fact.  The United States today ranks last in public trust in newspapers and television news, with distrust in the news media at an all-time high, and television news now the second-least-trusted institution in the country, following Congress.

Whether distrust is cause or effect, Americans increasingly inhabit a fragmented news media environment, replete with biased or partial coverage, misinformation, disinformation, fake news, and “alternative facts,” in which the boundary between opinion and objective fact is often blurry.

I would argue that the true divide within the news media is not between objective and activist journalism, but between writing that is truly analytic – like The New York Times columns by Thomas B. Edsall and David Wallace-Wells – and pieces that aren’t.  Interpretation isn’t at odds with voicing strong opinions, as Edsall and Wallace-Wells do, but it is grounded in fact and takes serious account of conflicting points of view, including the perspectives offered by academic experts.

Like journalism, our scholarly disciplines, too, are fractured into a dizzying array of subfields, while impassioned debate rages over whether activist scholarship is eroding quality and public trust.

Let’s briefly situate those developments in historical context.

As Michael Schudson argued in his classic 1978 social history Discovering the News, the idea of journalistic objectivity and a firm division between reporting and opinion is of surprisingly recent vintage, dating to the late 19th century, when The New York Times embraced an “information” model that contrasted sharply with the sensationalistic, highly partisan, story-driven model of the Hearst and Pulitzer tabloids.

This was, of course, roughly the same time that academic historians embraced the Rankean ideal of an objective, positivist, evidence-rooted scientific history as they sought to make the field a legitimate academic discipline.  As Peter Novick has shown in That Noble Dream, his authoritative 1988 study of the objectivity question and the US history profession, these scholars embraced the ideal of objectivity as a way to distinguish their discipline from the newly emergent social sciences, which, even then, placed a much higher premium on theory, rigorous methodologies, and generalizable systems, frameworks, laws, and principles of explanation. 

Novick likens the profession’s subsequent 20th-century history to a pendulum swinging between an ideal of history as scientific and objective and a contrasting conception of a usable past – a relevant history that seeks to serve the needs of the present – a call first voiced by the literary critic Van Wyck Brooks in 1918.

Novick’s overarching argument is that fact-based objectivity was never a plausible, attainable, or especially desirable scholarly goal.  The civil rights movement, the antiwar movement, and the women’s movement (followed by the gay rights, environmental, and subsequent movements) exposed the claims of post-World War II consensus historians to objectivity and balance as a sham that disguised the true dynamics, forces, interests, and ideologies that drove American history.

Novick is no doubt right when he maintains that the tensions between objectivity and relevance, between science and usability, will never be resolved to anyone’s satisfaction.  Indeed, I’d argue that this tension is in fact generative:  It ensures that the humanities and social sciences maintain their critical edge, interpretive bite, and evidentiary base.  

But I would assert that the inverse of objectivity in humanities scholarship isn’t active engagement; it’s reinterpretation – a willingness to engage vigorously with earlier and alternative arguments and interpretations.  No scholarly work is ever de novo.  A failure to recognize and grapple with earlier scholarly debates inevitably results in false claims of originality and in the neglect of counterarguments.  I worry a lot that misleading assertions of novelty and ignorance of earlier scholarship are all too common today.

What Novick’s book reveals is that academic history is a never-ending debate, not so much over objectivity as over interpretations that must, ultimately, rest on evidence.  The fact is that all serious historians strive – whatever they say – to produce a history that is usable and relevant, that speaks to the present-day public’s needs and concerns, and that connects past and present.  But the field disagrees passionately about how effectively that history is grounded in evidence and how well it explains the past.

History is not, of course, the only discipline in the humanities and social sciences that is torn between its scientific aspirations and its quest for contemporary relevance and impact.  Nor is the discipline distinctive in its hyperspecialization, its division into a bewildering number of disconnected subfields that barely speak to one another, or its fears of a loss of influence and scholarly standing.

Our big challenge, in today’s fraught academic context, is not to reassert the idea of the university as an ivory tower, insulated from today’s politics, nor to subordinate balance to the partisan needs of the moment.  Instead, we must ensure that the university – certainly the arts, the humanities, and the social sciences – provides an environment in which it’s possible to have compelling, open, meaningful conversations where the goal isn’t to convince students of anything in particular, but to help them think through contentious, highly charged issues.

There is scarcely a topic that I address in my classes that isn’t deeply divisive. Every issue I grapple with, every image I display, every primary source I introduce, is a potential trigger moment.  But my task, and yours, is to subject those topics, issues, and images to critical analysis that is somehow fun, engaging, and exciting.  

Our only agenda as classroom instructors is to animate discussion, provoke critical thinking, and leave students feeling that the academy is a space where the clash of ideas and viewpoints matters.

I’m sure you remember those ah-ha moments when you came across an insight or a different way of looking at something that transformed your outlook and sensibilities.  For me, key books often played that role:  Freud’s Interpretation of Dreams, Melville’s Moby-Dick, Nietzsche’s Genealogy of Morals, George Eliot’s Middlemarch, Keith Thomas’s Religion and the Decline of Magic, Barrington Moore’s Social Origins of Dictatorship and Democracy, and Eugene Genovese’s Political Economy of Slavery.  Suddenly, the world looked different and I was a changed person.

But most of those transformational moments occurred in a high school or a college classroom, when a new idea or argument or perspective exposed how narrow, blinkered, and impoverished my earlier thinking had been, when I suddenly realized that one could, for example, look at history through an economic or ideological or feminist or psychoanalytic lens, discover that supposed heroes’ motivations were mixed and complicated, that earlier histories had rendered whole groups of people invisible, and that the narrative of progress that I was brought up on wasn’t the whole story.

That’s education as it should be: eye-opening, mind-altering, unnerving, and transformational, but also broadminded, tolerant, and free-thinking.  That’s an education that balances engagement and passion, on the one side, and respect for nuance, complexity, and contradiction, on the other.  That balance isn’t easily achieved, but it’s absolutely essential if college is to cultivate students’ intellectual development and independent thinking.

In some respects, what we expect students to get out of college is too much. We aspire to produce future workers, engaged, knowledgeable citizens, culturally and scientifically literate graduates, social justice warriors, and more.  All well and good.

But there’s another message we should convey: that the real point of a college education, that four-year moratorium from the real world, is to challenge, inspire, and help students grow up, to prompt them to reconsider earlier ideas and assumptions and depart as more reflective adults. As a Harvard professor told his students at the start of every semester:  Slow down.  Enjoy your time here.

Steven Mintz is professor of history at the University of Texas.
