
The folks at the Pew Research Center's Internet & Technology project and Elon University’s Imagining the Internet Center have a new report out, the eighth in a series of reports analyzing trends. They asked a cadre of experts a fairly simple question: when it comes to the spread of false narratives and unreliable information online, will our information systems get better or worse? The responses from over 1,000 participants were almost evenly divided: slightly over half said things would get worse, and slightly under half said they would improve. Combing through responses, the researchers pulled out different rationales for these two positions, expressed in five themes.

  • People are the problem. They don’t want good information. They’re tribal, and they’ll gravitate to information that fits their preconceived notions.

  • Technology is the problem. When the profit motive trumps the public good, our tech giants’ solutions won’t scale or will simply create new problems.

  • Technology is the solution. With more sophisticated algorithms, perhaps incentivized by a new regulatory framework, bad information will be labeled or suppressed.

  • People are the solution. They will develop their own filters and will crowdsource trust as they adapt to new conditions.

  • Society will solve the problem by coming up with new funding models for good information created by professionals who care about the truth and by training people to know good information when they see it.

Each of these themes is thoroughly developed in the report (and, full disclosure, I haven’t had time to read it all yet – the PDF version runs to 92 pages).

It pairs nicely with last month’s report on how people approach facts and information. In that report, which is based on a fall 2016 survey involving over 3,000 US adults, nearly half of respondents were wary or distrustful of information sources and weren’t particularly interested in keeping up with news or learning how to evaluate information. Those most distrustful were more likely to be older, whiter, and less connected to the internet.

The slightly more than half of respondents who were interested in news sorted into three groups, in decreasing order of size: those eager to find out and learn more (more than half of whom are minorities); those confident that they can find and use information (well educated, trending young and white); and those who were “cautious and curious” – interested in news and in learning how to find good information, but distrustful of the media and the government, with little time or money to invest in finding things out.

One other interesting finding of this survey: Social media was reportedly the least trusted source of information overall and libraries the most trusted, but when asked what changes would be most helpful, respondents put better library services at the bottom and unlimited data and a more reliable internet connection at the top. The group that wanted both most, in almost equal measure, was the majority-minority one, thirsty for access to information however it’s delivered. There’s probably a bit of self-deception in this finding: I suspect we know we’re not supposed to trust social media – but that’s still where we spend much of our time and get a lot of our news.

Because I am an optimist by nature (or perhaps I selfishly avoid dystopian visions of the future because I don’t want to get depressed and stressed), I think we will arrive at solutions to the glut of divisive disinformation that has blossomed in these anxious times. I don’t have a lot of faith in algorithmic solutions, though I have to admit I get less spam email than I once did – spam was overwhelming email systems, so tech folks figured out how to take out the garbage for us. I suspect algorithmic solutions will work better at removing information that is similarly spammy – junk broadly blasted out to make money – than at tackling information warped for political gain. People will have to take out that garbage themselves, though I sense the companies that enable an overwhelming avalanche of it are beginning to realize they have a branding problem. Eventually toxicity will have real costs. Twitter will have to be a less hostile and volatile place if it wants to keep or attract members. Facebook finally seems to be acknowledging that the public wants this giant to take some responsibility for its enormous public reach. Google wants to provide easy answers, but its brand takes a hit when embarrassingly bad information is promoted. The earth is not flat. The Holocaust happened. Selling ads to Russian troll farms that set out to inflame American opinion is bad for business.

It could well be that a large percentage of Americans isn’t inclined to trust or seek out information, and this is probably nothing new. But the half that wants to follow news and learn more seems interested in solving the trust problem. Perhaps we’re all getting tired of the amplification of outrage that media, both social and traditional, profited from during a long and divisive election season. I don’t think rage will reliably attract our attention forever.

I suspect even the biggest monopolistic corporations that we rely on, like it or not, want to solve this problem if it can be seen as an interesting technical challenge that could boost their brand and fend off regulation. What they don’t have is the century or more that news reporting, scholarly inquiry, and libraries have had to develop standards that are human, not technical. But in time I think we’ll get there, one way or another.

However it plays out, I'm happy that the Pew Research Center produces such interesting and useful reports on so many topics, so frequently, and at no cost to readers. When we’re thinking about the future of reliable information, they provide a terrific example of how to do it right.
