As a content moderator for a major social media platform, I quickly realized just how deep a crisis our information ecosystems are in. I saw up close the sea of endless junk that platform algorithms push in front of users, who then skim, skip, swipe, or scroll past without stopping to think deeply about much of any of it. And who could blame them? There’s just too much out there.
At the same time, I’ve watched as journalists and fact-checkers – the very front line of information integrity – are laid off across the news and tech industries, only deepening the crisis. If these trained and trusted human professionals are replaced at all, it is often by inscrutable black-box automation, as when MSN laid off its remaining journalists in 2020 in favor of proprietary AI. While staff reductions might save money in the short term, they come at a heavy price downstream for users, who find their information ecosystems increasingly polluted, and themselves dispirited by the sea of trash they have to swim through daily.
Seeing it all, I felt I had to do something, however small. That’s why I started a media literacy initiative unlike any other: the Lost Art of Reading Project (or LARP, for short). Few researchers want to admit it, but most web users automatically tune out lectures, no matter how well-intentioned – even if they include a cool infographic. Simply explaining how misinformation works is never enough. Nothing changes. That’s why LARP takes a completely different approach from what you usually see in this space: we fight fire with fire.
The Lost Art of Reading Project is a media literacy initiative creating AI-generated conspiracy theories to highlight the importance of human fact-checkers and journalists in online media. The project’s work has previously been recognized by Reuters.
One way we do this is by using AI technology ourselves to generate (harmless) conspiracy content. We then share this content to show people how easy it is for machines to produce seemingly credible information that is actually completely false. By demonstrating the limitations and pitfalls of AI content generation, we hope to encourage people to be more skeptical of the information they encounter online.
Our aim is to pique people’s curiosity, and to engage their critical faculties directly in the moment, where they are needed most. We want to get readers to slow down and really look closely at what is being presented to them. We want to trigger a kind of “immune response” in readers, so they will say: “Look how fake this is!” The AI content we create becomes a puzzle people get to feel good about solving – which, as we have seen, is half the draw of conspiracy content in the first place.
It’s plain to see that the rise of AI is a double-edged sword. On the one hand, it has the potential to revolutionize the way we create and consume information. On the other hand, it can be easily abused to create false information and conspiracy theories, which can be devastatingly effective.
Journalists and fact-checkers are soon going to be needed more than ever before. And every web user needs to become a fact-checker in their own right merely to survive being online at all. What’s worse, as AI content really takes off, machines will be able to create bad information faster than humans will be able to fact-check it.
As a result, there is a very real risk that we will then be tempted to build other AI systems to tell us what is true or false on the web: AIs acting as content moderators of other AIs. In most cases, the resulting systems are also likely to be closed, proprietary tools built in service of for-profit companies, their investors, and governments – not all of whose interests will always be aligned with those of the average web user.
If we are to survive the societal changes that widespread adoption of AI will bring, we will need more humans in the loop – not fewer – to ensure the integrity of our information ecosystems, and the well-being of those whose lives depend on it.
For examples of LARP’s AI-generated conspiracies, visit Lost Books.