
I Believe Artificial Intelligence Will Not Accelerate the Dissemination of False Information

Will Artificial Intelligence Really Amplify Disinformation? A Thoughtful Perspective

In ongoing discussions about the future of technology, many express concern that AI will significantly worsen the spread of false information. The argument is straightforward: as AI becomes more capable, it will generate vast amounts of disinformation at scale, flooding social media and other platforms with fabricated content.

The Concern: More AI-Generated Junk, More Disinformation
People worry that AI’s ability to produce content cheaply and rapidly will lead to an overwhelming proliferation of misleading narratives. Given the sheer volume of AI-produced “slop” or low-quality data points circulating today—particularly across social media—it seems intuitive to assume that disinformation will only become more prevalent as AI tools are adopted more widely.

Challenging the Assumption: Will It Really Increase Our Exposure?
However, I believe this perspective overestimates the impact of AI-generated disinformation. Consider an analogy: if you spend time scrolling through TikTok or a similar platform, you watch a roughly fixed number of videos per session, often around 100 to 150 short clips. Whether those clips were created by AI or by humans doesn't change how many you personally consume.

Existing Disinformation Is Already Widespread
The amount of disinformation generated by humans over many years is already vast and difficult to fully consume. From political misinformation to conspiracy theories, much of our media environment is flooded with false narratives, regardless of AI. Introducing additional AI-crafted content into this mix doesn’t substantially increase the total exposure; it simply adds more data to an already overflowing pool.

Content Consumption Is Driven by Engagement, Not Quantity
My attention tends to be drawn to content that entertains or provokes. So the mix of what I see might be roughly one-third cat videos, with the rest split among humorous falls, emotionally charged political clips, and miscellaneous content, regardless of whether AI produced any of it. Over time, my exposure to disinformation stays relatively stable because I selectively engage with content that resonates with my interests, not with everything that exists.

Subtle Forms of Disinformation Are More Common Than Outright Lies
Disinformation doesn’t always come as blatant falsehoods. More often, it shows up through editing techniques, misleading snippets, or manipulated footage: a celebrity or politician presented in a context that misrepresents what they actually said. For example, a clip where someone exclaims “Holy shit” can be cut and recontextualized so the reaction appears to respond to something entirely different.
