InfoWars host and conspiracy theorist Alex Jones made headlines recently when technology companies, including Facebook and YouTube, decided to remove his content from their online platforms.
These platforms deemed Jones in violation of their policies, such as community standards that prohibit content that glorifies violence.
Jones received renewed attention amid the flurry of media coverage of his “de-platforming,” and his InfoWars app subsequently became one of the most downloaded on Google’s and Apple’s app stores. (Jones’ app remains on the Apple App Store, though his content has been removed from iTunes. His content was also removed from Spotify. Twitter did not remove Jones from its platform.)
Cameron Hickey, a researcher investigating disinformation at Harvard’s Shorenstein Center on Media, Politics and Public Policy, thinks journalists need to pay attention to this cycle. Journalistic inquiries into the place of Jones’ misleading content on sites like Facebook and YouTube spurred his removal from some platforms; coverage of that removal was then followed by spikes in traffic to the platforms he still had.
Hickey came to the Shorenstein Center from PBS NewsHour, where he worked on the development of NewsTracker, software that collects and categorizes mis- and disinformation on social media. At the Information Disorder Project, he continues to track the spread of mis- and disinformation and helps develop strategies for responding to it.
“Yes, news organizations should be covering Alex Jones, an influential public figure with a large following who does a lot of things that certain people consider to be awful,” he said in an interview with Journalist’s Resource. “The platforms never would have de-platformed him if we [news organizations] weren’t covering and exposing that kind of stuff.”
But, Hickey said journalists should be thoughtful in how they cover Jones’ removal: “When you do have to cover someone like this in the context of them being de-platformed, it is worth considering whether or not you want to drive traffic on whatever platforms they still have left.
“The platforms came to a point where they said, ‘we’ve made a decision about what kind of content we want to reach our audience, and this content crosses a line, and we don’t want it to reach our audience,’” he continued. “I think… the news industry didn’t reckon with what our own role was in exacerbating the problem that the platforms were trying to solve.”
Hickey offered four suggestions for journalists to consider when covering the de-platforming of figures who spread misinformation:
- Pursue reporting projects that hold platforms accountable.
“Twitter gave Alex Jones a time out … and that was in response to an investigation by CNN that exposed specific things that Alex Jones had said or done on their platform [Twitter] in the past that violated Twitter’s community standards. … CNN demonstrated that they could [hold Twitter accountable], not just by telling the public about it, but by shining a light on a mistake the platform was making in a way that forces the platform to act.”
- Consider the consequences of your reporting. Be mindful of when you decide to run a story and the ways in which you’re reporting on de-platforming.
“Holding the platforms to account, that’s really effective, but then as soon as you got them to make the change, thinking about whether or not you’re going to counteract that with your reporting as well.
“There are many newsworthy events where we decide the impact of reporting on them has more negative consequences than the value of our reporting on them … also the way in which the method of our reporting can be changed to increase or decrease those negative consequences.”
- Consider how references to questionable content, including images and links, are displayed.
“I saw various bits of coverage where they played clips of Alex Jones’ video stream, which clearly showed the full URL to his Twitter page. … Because their story was really about the fact that he was taken off this or that platform today, is it responsible to show that platform? I’d argue no.
“There are different approaches to thinking about how accessible you make content. In some cases, I’d argue you don’t provide direct links at all. If, in the nature of your reporting, it’s necessary, you can actually change the structure of a link so that search engines don’t follow them or don’t include them.” In practice, this means adding the rel="nofollow" attribute to a link’s HTML, which tells search engines not to follow the link or count it toward the linked page’s ranking. Because search engines don’t follow these links, the pages they point to won’t get a ranking boost from your mention.
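As a concrete illustration of the attribute Hickey alludes to, here is a minimal sketch (the function name and the regex-based approach are illustrative, not from the interview) that adds rel="nofollow" to plain anchor tags in an HTML fragment:

```python
import re

def add_nofollow(html: str) -> str:
    """Insert rel="nofollow" into <a> tags that lack a rel attribute,
    so search engines do not count the link toward the target's ranking."""
    def _rewrite(match: re.Match) -> str:
        tag = match.group(0)
        if 'rel=' in tag:  # leave tags that already declare a rel attribute
            return tag
        return tag[:-1].rstrip() + ' rel="nofollow">'

    # Match opening <a ...> tags. This simple pattern is fine for a sketch;
    # a real HTML parser would be more robust in a production CMS.
    return re.sub(r'<a\s[^>]*>', _rewrite, html)

print(add_nofollow('See <a href="https://example.com/page">the source</a>.'))
# → See <a href="https://example.com/page" rel="nofollow">the source</a>.
```

Most publishing systems let editors set this attribute directly in the link dialog, so no code is required day to day; the point is simply that the markup, not the visible text, controls whether a mention passes search-engine value to the linked site.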
Hickey added that embedding the text of misinformation within an image is another strategy worth considering: search engines index plain text, but they do not generally extract and index text that appears only inside an image.
- But, take care when repeating misinformation in your coverage, even when debunking it.
“There’s research about how repeating a piece of misinformation even when debunking it has the impact of reinforcing a belief in that misinformation,” Hickey said. Many studies back his point, showing that attempts to debunk misinformation in the media are correlated with people continuing to believe those falsehoods, and that fake news headlines labeled as contested become more believable through repeated exposure.
“They all go back to the same spot,” Hickey said. “If this story is about reducing the reach of a voice that’s corrosive in our society, then your coverage needs to respect that that’s the point of this.”
For more on mis- and disinformation, see our roundup of recent research on how it spreads. We’ve also published a glossary of key terms relating to information disorder.