AI Slop, Shadow Content and Digital Safety
Merriam-Webster’s word of the year for 2025 was slop, which the oldest dictionary publisher in the United States defines as “digital content of low quality that is produced usually in quantity by means of artificial intelligence.”
The dictionary also defines two related but distinct terms: misinformation, "incorrect or misleading information," and disinformation, "false information deliberately and often covertly spread (as by the planting of rumors) in order to influence public opinion or obscure the truth." While both terms can apply to information shared by individuals and organizations, governments and related political communicators are widely considered the primary practitioners.
Shadow content is different. In our definition, it is "manipulated, deceptive or fake content issued around corporate, earnings or M&A announcements." The term can also be applied more broadly to "any content that shadows authentic content produced by corporations, media organizations or other parties with the goal of deceiving audiences into believing it is real and acting upon it." Fake press releases, media articles, images or deepfake videos issued in the wake of an official announcement or the publication of a story all constitute shadow content.
Slop can include anything from AI-generated images or articles to satirical political videos. Generally considered an evolution of spam, it is an amorphous term. The fact that AI was used does not automatically make content bad, and if we define "low quality" literally, the label could apply to a great deal of content, AI-generated or not. In the same way that spam is an annoyance to some and a marketing tool (under the label email marketing) to others, slop evokes concern in some quarters and a "meh" in others.
What matters most is that generative AI is enabling the creation, duplication, scaling and distribution of content, some of it fake and designed to defraud, at unprecedented levels. In this context, shadow content is an important term because it narrows the focus to AI-generated content that is intentionally malicious.
As communicators are well aware, words matter. The difference between language that frames something as relatively benign and language that frames it as an immediate threat to reputation and the bottom line is the difference between inaction and action.
Communications leaders do not start discussions by saying PR is nice to have. PR and communications are discussed in terms of driving business results and protecting against reputational risks. In a crisis, the call to action is clear and direct: the company's business, and hence its future, is at stake. A core Page principle underscores this: "Conduct public relations as if the whole enterprise depends on it."
This is why the distinctions between shadow content, AI slop, misinformation and disinformation are consequential. With companies facing a growing threat from fake content designed to deceive clients, analysts and consumers, the lexicon communicators use should drive action. If AI slop is seen as a "ho-hum" issue, or misinformation and disinformation as something only governments do, don't expect corporate clients to act.
Shadow content is a precise term for specific reputational and financial risks, and that precision makes it the term most likely to drive action.
Problems need solutions. Content authentication provides a path to addressing the most fundamental problem shadow content creates: the difficulty of distinguishing authentic content from fake. This goes to the heart of the broader challenge of declining trust in digital content; if audiences don't know what they can trust, trust overall erodes.
Building provenance into each piece of digital content so audiences know the who, what and where of its production represents a fundamental step forward in the digital landscape. It allows clients and consumers to determine what is trustworthy and what is not before they open or share it.
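To make the idea concrete, here is a minimal, conceptual sketch of how provenance can be bound to a piece of content. It is not an implementation of any specific standard such as C2PA Content Credentials; the publisher key, field names and example content are all hypothetical, and an HMAC stands in for the certificate-based digital signatures real provenance systems rely on.

```python
# A minimal, conceptual sketch of content provenance, not an implementation of
# any real standard (e.g., C2PA / Content Credentials). The publisher key,
# field names and example content are hypothetical; real systems use
# certificate-based digital signatures rather than a shared-secret HMAC.
import hashlib
import hmac
import json

PUBLISHER_KEY = b"hypothetical-publisher-signing-key"

def create_provenance_record(content: bytes, who: str, what: str, where: str) -> dict:
    """Bundle a hash of the content with who/what/where metadata and sign the bundle."""
    manifest = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "who": who,      # issuing organization
        "what": what,    # kind of content, e.g. "press release"
        "where": where,  # publication channel
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(PUBLISHER_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_provenance(content: bytes, record: dict) -> bool:
    """Recompute the hash and signature; tampering with content or metadata breaks the match."""
    claimed = dict(record)
    signature = claimed.pop("signature", "")
    if hashlib.sha256(content).hexdigest() != claimed.get("content_sha256"):
        return False  # the content itself was altered
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(PUBLISHER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

# Hypothetical example: an authentic release verifies, a "shadow" version does not.
release = b"Example Corp. announces its quarterly results."
record = create_provenance_record(
    release, who="Example Corp.", what="press release", where="newsroom.example.com"
)
print(verify_provenance(release, record))                                        # True
print(verify_provenance(b"Example Corp. announces a surprise merger.", record))  # False
```

The point of the sketch is the workflow, not the cryptography: the issuer publishes a signed record alongside the content, and anyone downstream can check that both the content and its who, what and where claims are intact before opening or sharing it.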
The tools for creating AI slop, shadow content, misinformation and disinformation are in the hands of the "good, the bad and the ugly." That genie is not going back into the bottle.
Addressing the new and emerging challenges of the AI era is both a strategic imperative and a business opportunity. Using precise language to frame the challenge, language that leads to the implementation of solutions, is critical. If you want to advise clients on how to make their content safer and more secure, you need to add shadow content to your lexicon.

