
As I mentioned last week, Mike Caulfield has written a handy (and free!) classroom-ready book about fact-checking that provides useful case studies for students and for anyone who wants to fine-tune their bullshit detector. He has also explained why simply studying a document for clues (a checklist approach) doesn’t work, and the four moves you can make instead: corroborate, trace the story’s origin, read laterally to confirm, and don’t get stuck in a rabbit hole (“circle back”). I also have to tip my hat to Marc Meola, who made a very similar point back in 2004, though we didn’t need it quite so badly back then.

The catch is that some folks have calibrated their bullshit detectors to click loudly if the source of information appears to be afflicted with MSM syndrome. It’s mainstream, so it must be liberal. Or, as the president likes to call it, “fake news.” These are news organizations that use gatekeepers, also known as news editors, and their job limits your freedom to make up your own mind, so that makes them all automatically the enemy of the people. The real people, not the people who are the enemy and so don’t count as people. Fact-checking sites like Snopes also set off these calibrated detectors, because too often they find the facts aren’t fake and the fakes aren’t fact.

Caulfield has since addressed this problem in “Media Literacy Is About Where To Spend Your Trust. But You Have To Spend It Somewhere.” It’s not that hard to get students to doubt. In fact, quite a lot of higher education is training in doubt. We critique, we take apart, we don’t take things for granted. We test our hypotheses. We reverse engineer ideas until bits of them are all over the floor. This means we’re better at distrust than trust. I’ve certainly experienced this problem when having students look at radically different conclusions drawn from primary sources, or when pointing out how peer review processes are hardly fool-proof. Great, so now we can’t trust anything? Well, you have to trust some things, and luckily some people are more likely than others to use ethical and well-tested methods to arrive at conclusions, whether in journalism, science, policy-setting, or simply making a decision.

If William Perry’s 1968 study that developed a model of intellectual development still holds water, college students in their late adolescence/early adulthood are at about the right age to be a living, breathing shrug emoji. (Maha Bali also makes this connection in a comment on Caulfield’s blog post.) They’re past thinking things are clearly right or wrong. They’ve moved into a confusing time when everything is suspect. If nothing can be absolutely right, how are you supposed to trust anything? Isn’t it all a matter of opinion? Eventually, if things go well, they are able to commit to some things being more right than others. This is an epistemic shift: there are ways we can come to believe what we think is most likely right, which is not exactly the same as being able to think critically. It’s actually gaining trust in particular ways of knowing, and caring enough to weigh options fairly.

This passage from the 1968 report jumped out at me, describing students in the relativistic phase.

Under stress (of fear, anger, extreme moral arousal, or simple overburden of complexity) it is possible to take refuge in the all-or-none forms of early dualism. At this point reactive adherence to Authority (the "reactionary") requires violent repudiation of otherness and of complexity. Similarly, reactive opposition to Authority (the "dogmatic rebel") requires an equally absolutistic rejection of any "establishment." Threatened by a proximate challenge, this entrenchment can call forth in its defense hate, projection, and denial of all distinctions but one. In this structure of extreme proprietary "rightness," others may be perceived as so wrong and bad as to have no "rights," and violence is justified against them.

Retreat is rare in our records [of the research project] and where it occurs it cannot be illustrated by concise excerpts. In recent years its structure is exemplified vividly in the forms of thought of the extreme "radical" left in student revolt. These forms may be examined in the statements of the "radical" as opposed to the "liberal" students . . . The forms are of course identical with those employed by persons and groups of the extreme radical right.

Certainly young folks in 1968 were facing stress and questioning authority, but retreat into dualism isn’t rare anymore, and it’s not just a youth thing. What’s different today is that our engines of epistemic reasoning are engineered by advertising technology. We Google it. We heard about it on Facebook or Twitter. We saw it on Instagram or YouTube. Every single one of those sources of information has an algorithmic editor, but this editor, unlike a traditional news editor, has no interest in news value, and no wall separates him from the business side. In fact, he works only for the business side and will deny he’s in the news business at all. He’s only interested in ad placement, and he doesn’t even see what news is being shared because it’s happening so fast and, hey, it’s not his problem. He only cares about the code; it’s not his content, just his assembly of it.

The effect of this editorial work, though, has been to create an alternative news universe full of alternative facts and links to like-minded alternative "news" outlets, and it’s all very profitable and all very ripe for mischief in the service of power and money.

We need to teach how to trust as much as we need to teach fact-checking. We can’t simplify things by saying “it’s peer reviewed” or “it’s from a major national newspaper” or “it’s from a university press” or “that’s from a scholarly society, so it’s good.” Peer review fails. National newspapers of quality get it spectacularly wrong. The recent shenanigans of the IEEE show why trusting quality brands is problematic. First, in a Twitter post, an "outreach historian" for the professional organization casually and sloppily discredited an African-American scholar’s work without reading it. Then the organization denied the historian (who apologized) was authorized to speak (though that was actually his job) and deleted the record of the dispute (not a good look for institutional history). Then the society published a plagiarized article and removed it for review rather than acknowledge the problem . . . aaand it doesn’t help that the organization that insulted the work of two women scholars has a 90 percent male membership. There are any number of examples of highly respected organizations and publishers behaving badly. Information isn’t trustworthy simply because of its brand. (An aside: this is why using Journal Impact Factor to evaluate the quality of an article or an author is so bogus.)

Teaching trust means teaching the cultures and practices of good scholarship and good journalism, and providing students with opportunities to practice honest, ethical listening and thinking about questions that don’t have singular right answers but do have answers that are demonstrably wrong. We also need to explain how our adtech-driven information systems actually work and how they are corrupting our common knowledge. I don’t see any way around it.

You can’t administer a trust vaccine in a fifty-minute library session. Where should this work happen? And how do we make sure it does happen?
