Living in the Uncanny Valley

A review of James Bridle’s compelling new book, New Dark Age: Technology and the End of the Future

July 9, 2018
 
 

I can’t recall how I first heard about James Bridle’s new book, but somehow I became aware of it and wrote to its publisher, Verso, which obliged me with an advance reader copy. I thought I might make a start on it during a short-haul plane trip, but by the time I got home the next day I had very nearly finished it. It’s dense, demanding, and totally compelling, though if I had only the title to go by, I might have passed it up. I’m wary of books whose titles invoke “the future” or “the end of” something, and I have read too many dyspeptic critiques of technology that are full of sound and fury but offer more incensed nostalgia for a fabled golden past than insightful critique. So if you’re like me, don’t be put off by the title, New Dark Age: Technology and the End of the Future. It’s very good.

Bridle is both a visual artist and a writer who grapples with the meaning of technology and how its reins seem to have slipped from our grasp, taking us to unexpected and eerie places. In this book he explores those places: the ways computation has made us doubt our senses; the ways we use it to forecast the weather (but increasingly can’t, because our predictions depend on a past that was, uh, predictable); how high-frequency trading can obscure the workings of markets and create high-speed, senseless crashes; the rise of machine learning; mass surveillance; and the conditions that make conspiracy theories thrive. Above all, it’s about a present in which we’ve simply lost any way of knowing for sure, because we have too much information. “The cloud” is full of data yet totally obscure.

My copy of this book is full of underlining and notes that climb around the margins. I feel like one of those overwhelmed organic chemistry students who highlight everything because it all matters and it all connects, and yet ultimately none of the information world he’s describing makes sense. Which is kind of his point: “it is ultimately impossible to tell who is doing what, or what their motives and intentions are . . . Nobody decided that this is how the world should evolve – nobody wanted the new dark age – but we built it anyway, and now we are going to have to live in it” (p. 239).

(That bit of the title, by the way, isn’t some technophobe’s invocation of the fall of the Roman empire; it’s the last words of the first paragraph of H. P. Lovecraft’s “The Call of Cthulhu,” published in the pulp magazine Weird Tales in 1928. Seems appropriate.)

Among the underlined tidbits, I noted this: “Data is the new oil” was apparently coined a little over a decade ago by Clive Humby, the data scientist behind the supermarket rewards card of Tesco, a ubiquitous chain store in the UK, when describing the value of that card – give up a bit of information about your shopping habits in exchange for a discount and, if enough people do it, Tesco gets masses of valuable information about buying patterns. But people tend to forget what he was really saying: data isn’t useful until it’s refined and turned into something. Now it’s just one wildcat strike after another, and we’re struggling through a vast slick of raw information that churns up things we don’t even recognize – we have used technology to turn information into something that is the opposite of knowledge.

Some try to rescue themselves from the constant barrage of information by grabbing hold of stories that seem to make sense – conspiracy theories are one response to having a great deal of information about events and little sense that we can do anything in response: “The resulting sense of helplessness, rather than giving us pause to reconsider our assumptions, seems to be driving us deeper and deeper into paranoia and social disintegration” (p. 186). Or it drives us to capitalize on the network by creating nonsense content – YouTube videos that mash up keywords and brands to take advantage of out-of-control algorithmic exploitation in “industrialized nightmare production” (p. 228). Yeah, think twice before you pacify your toddler with your phone. There’s an ocean of deeply disturbing content out there turning nursery rhymes into ad-generating spookiness. And YouTube doesn’t have a clue what to do about it. None of us do.

Our thirst for data, like our thirst for oil, is historically imperialist and colonialist, and tightly tied to capitalist networks of exploitation . . . Empire has mostly rescinded territory, only to continue its operation at the level of infrastructure, maintaining its power in the form of the network. Data-driven regimes repeat the racist, sexist, and oppressive policies of their antecedents because these biases and attitudes have been encoded into them at the root.

In the present, the extraction, refinement, and use of data/oil poisons the ground and air. It leaches into everything. It gets into the ground water of our social relationships and it poisons them. It enforces computational thinking upon us, driving the deep divisions in society caused by misbegotten classification, fundamentalism and populism, and accelerating inequality. It sustains and nourishes uneven power relationships: in most of our interactions with power, data is not something freely given but forcibly extracted – or impelled in moments of panic, like a stressed cuttlefish attempting to protect itself from a predator. (pp. 246-7)

Is there anything to be done? Well, we can try to see the shape of things that are all around us, yet somehow hidden – not just how these technologies work but “how things came to be, and how they continue to function in the world in ways that are often invisible and interwoven. What is required is not understanding, but literacy” (p. 3) – and by that he doesn’t mean increasing digital skills, though that doesn’t hurt, but being able to think about where systems came from and what their consequences are. He urges us to consider “guardianship” that is “based on the principles of doing the least harm in the present and of our responsibility to future generations. Our understanding of systems and their ramifications, and of the conscious choices we make in their design, in the here and now, remain entirely within our capabilities” (pp. 251-2). Well, that’s a relief. It’s also a tall order. Reading this book is a good start.

 
