

It’s mildly perplexing to start reading Alex Edmans’s May Contain Lies (University of California Press) only to find that it says little at all about lying. When it does, “lying” is construed in terms that are wholly unsatisfactory—not, I should make clear, misleading, but simply wrong.

Before going any further, it’s only fair to acknowledge that the book’s subtitle, “How Stories, Statistics, and Studies Exploit Our Biases—and What We Can Do About It,” is a fair indication of what’s inside. Edmans, a professor of finance at London Business School, takes a practical-minded approach to mangled logic, appeals to bogus authority, and other commonplace forms of cognitive distortion. Studies show that most people will believe, by default, any statement prefaced with the words “studies show that …” (I made that up, but suspect it’s true-ish). The human vulnerability to misinformation is perennial, but we keep finding ways to create new forms of it.

Besides identifying the problem, the author offers clearly formulated approaches to countering it. Every high school freshman should take a one-semester course with this as its textbook. Of course, that will never happen, and the book will be kept out of many libraries; in any case, there needs to be a second edition that fixes the one real problem with it.

And what is that problem? The author brings up the topic of lying just once. He writes,

“‘Lie’ is simply the opposite of ‘truth.’ Someone can lie to us by hiding contradictory information, not gathering it in the first place, or drawing invalid conclusions from valid data … Lies also have many causes—some are willful and self-interested; others are a careless or accidental result of someone’s biases; and yet more arise from well-intentioned but excessive enthusiasm to further a cause they deem worthy.”

This will not do. The author acknowledges that the word “is typically reserved for an outright falsehood made deliberately” but suggests his “wider definition” is somehow preferable, for reasons never made clear. A quick consultation with Etymonline shows that “lie” derives from an Old English word meaning “deceive, belie, betray.” The judgment of intent is not an afterthought. It is always worrisome to find oneself agreeing with George Costanza, but on this point he got things quite right: “It’s not a lie if you believe it.”

As it turns out, the author coins a perfectly sound and useful neologism covering the varieties of epistemic malpractice at issue: misinference. While May Contain Misinference would admittedly be a dud of a title, the book is built around a “Ladder of Misinference” described early on:

“We accept a statement as fact, even if it’s not accurate—the information behind it may be unreliable and may even be misquoted in the first place. We accept a fact as data, even if it’s not representative but a hand-picked example—an exception that doesn’t prove the rule. We accept data as evidence, even if it’s not conclusive and many other interpretations exist. We accept evidence as proof, even if it’s not universal and doesn’t apply in other settings.”

(All italics are the author’s, to which I have added boldface upon the first reference to a “rung” of Edmans’s ladder.)

Naturally the largest portion of the book consists of elaboration of these distinctions and illustration of their distortion through examples, including ones drawn from the author’s experience in the worlds of finance and academia. In some, his own misinferences were a factor, but the most memorable anecdote recalls his presentation at a conference as a graduate student.

The professor assigned to discuss his paper was a leading figure in their discipline who later served as president of its professional society. Things started well, Edmans writes, with the discussant “calling my idea ‘intuitively plausible,’ but then he said a key ingredient in my study ‘makes no sense.’”

The author nursed the wound to his ego as the esteemed figure went on to respond to the paper of a senior scholar—whose work was judged to have an endogeneity problem, which Edmans says he could recognize once it was pointed out.

“Virtually every other discussion” at the conference, he writes, “started off commending the question the researchers were exploring, but then explained why they hadn’t yet fully nailed the answer due to alternative explanations or other quibbles.”

In effect, researchers were challenging each other to determine what step on the Ladder of Misinference they reached before falling into error: “Doing so helps colleagues refine their ideas, rather than stabbing them in the back.”

The story is exemplary both of the universal susceptibility to misinference and of the ideal that a community of inquiry might rectify its mistakes. It’s unreasonable to suppose that public discourse in general could adopt such an ethos. But the author has produced a useful handbook on mental course correction.

Scott McLemee is Inside Higher Ed’s “Intellectual Affairs” columnist. He was a contributing editor at Lingua Franca magazine and a senior writer at The Chronicle of Higher Education before joining Inside Higher Ed in 2005.
