Algorithms That Erase Our Stories

As algorithms increasingly shape what is seen, shared, and remembered, communicators are confronting a critical question about whose stories are being surfaced and whose are quietly erased.

Imagine me, a Black woman, in a roomful of communications executives discussing their AI strategies. I'm troubled by the same question: Who gets to be represented in the stories we're telling?

In 2026, as we celebrate Black History Month, we're witnessing an unprecedented moment in which artificial intelligence is redefining the narrative itself. Yet, the algorithms shaping Black stories weren't built to see us. They were designed to replicate historical patterns that have long marginalized Black voices.

The Crisis We Can't Ignore

Many AI systems used in media monitoring will flag a story about a Black community leader's achievements as "less relevant" than coverage of crime in the same neighborhood. The system has learned historical biases so well that it has automated them. This is a communications crisis. When we allow algorithms to determine whose stories matter, we outsource our moral responsibility to systems that can't distinguish between relevance and representation.

The Stories We've Never Told

I've watched Black girls and women internalize the message that their stories aren't worth telling. In communications, we understand the power of being seen. But visibility without belonging is simply surveillance. We need communicators who understand that representation is governance.

Three Questions Every Communications Professional Should Ask

  1. Whose voices trained your AI tools? If your content recommendation systems weren't built by people who understand Black cultural nuance, they're probably missing Black cultural truth.

  2. What stories are being erased? Audit your AI systems like you audit your stories. The algorithm that decides what gets amplified is often the same one that's been trained on biased data.

  3. Are you building belonging or just building a campaign? Inclusive campaigns mean nothing if they don't create space for authentic Black voices to shape your narrative strategy.

The Future We Can Build Together

I'm calling on communications professionals to recognize that in 2026, storytelling isn't just about what we say. It’s about whose voices are amplified, whose perspectives shape algorithms, and whose futures are being imagined in our content strategies. Let’s move beyond recognition to redesign. The communicators who will thrive in this AI-driven era are those who are discovering that when they center Black voices in their AI strategies, they don't just create more ethical systems; they create more accurate, resonant, and profitable content. 

As you craft your next campaign strategy, ask yourself: "Will this story create a sense of belonging? Will it show that all voices matter?" In 2026, our greatest challenge isn't artificial intelligence. It's artificial inclusion. The communicators who build belonging by design, not by intention, will be the ones who shape not just narratives, but history itself.

Dr. Misty D. Freeman

Dr. Misty D. Freeman transforms belonging into action, using science-based strategies to inspire empathy, disrupt bias, and create inclusive cultures. Through coaching, consulting, and education, she equips leaders to break barriers and foster environments where all can thrive. A leading advocate for equity in technology, she tackles bias in AI, particularly its impact on Black Gen Z women, in her book Unconscious Algorithms. Her work redefines ethical AI and inclusive innovation. More than a thought leader, she is an innovator and disruptor, reshaping education, technology, and workplaces to ensure belonging is a lived experience, not just an idea.

https://www.drmistydfreeman.com