After losing 211,000 followers to platform suppression, this Baltimore publisher turned his pain into doctoral research—and now a Congresswoman is fighting the same battle in Washington

(BALTIMORE – December 23, 2025) – When Pittsburgh Congresswoman Summer Lee declared this week that “discrimination is illegal when it’s done by a person, and it should be just as illegal when it’s done by an algorithm,” I felt something I haven’t experienced in years of fighting platform suppression: hope backed by power.

Lee’s reintroduction of the Artificial Intelligence Civil Rights Act isn’t just another piece of legislation. For those of us running independent Black-owned media outlets, it’s a potential lifeline against what I’ve come to call “algorithmic precarity”—the systematic ways social media platforms deploy carceral tactics against publishers serving Black communities.

I know this terrain intimately. Not just as a journalist who has covered Baltimore and Maryland for 23 years through BMORENews.com, but as a victim of it, and now, as a doctoral candidate at the University of Maryland's Robert H. Smith School of Business, as a researcher documenting it.

When Algorithms Attack

In one stroke, Facebook disappeared a page I'd built to 211,000 followers. Gone. No adequate warning, no real recourse, no human being willing to explain why years of community-building vanished because an algorithm decided our content violated some opaque standard. TikTok followed with restrictions that choked our reach just as we were gaining traction with younger audiences.

Shadowbanning. Throttling. What platforms euphemistically call “content moderation” but what feels—and functions—remarkably like the discriminatory practices of America’s criminal justice system. The same communities overpoliced in our streets find themselves overpoliced online, our voices systematically suppressed by code that somehow learned society’s oldest biases.

This isn’t paranoia. It’s pattern recognition. And it’s exactly what drove me from the newsroom to the doctoral program.

The Research Behind the Rage

My dissertation focuses on platform precarity and algorithmic discrimination against media publishers serving underrepresented communities. The academic language makes it sound abstract, but the lived reality is concrete: reduced reach means reduced revenue. Suppressed content means suppressed community voice. Algorithmic bias doesn’t just limit what people see—it limits who gets to shape the narrative about their own communities.

The data supports what Black media operators have known intuitively. Senator Ed Markey, who will introduce Lee’s companion bill in the Senate, cited research showing mortgage algorithms were 80% more likely to reject Black applicants. Another study revealed AI missed liver disease in women twice as often as in men. If algorithms discriminate in lending and healthcare, why would anyone believe they treat media content neutrally?

They don’t. And now, finally, Congress might do something about it.

What Lee’s Bill Would Actually Do

The AI Civil Rights Act tackles algorithmic discrimination head-on with three critical provisions:

First, it bans decisions that unfairly impact people based on protected characteristics like race, gender, or disability. This means platforms couldn’t hide behind “the algorithm decided” when their systems systematically suppress Black voices.

Second, it requires companies to conduct independent audits of their AI systems and take “reasonable measures” to prevent algorithmic harm. Imagine that—actual transparency about why your content suddenly stops reaching the audience you spent years building.

Third, and perhaps most importantly, it grants Americans the right to choose whether a human or an algorithm makes sensitive decisions affecting them. For content creators facing account restrictions or demonetization, this could mean actual human review instead of appealing to more algorithms.

The Federal Trade Commission would oversee enforcement, giving Lee’s bill real teeth. “We can’t allow companies to hide behind code when their products discriminate,” she said. “People deserve transparency, real oversight, and a human being who is beholden to civil rights law.”

Beyond Code: The Economic Impact

For Black Baltimore, this matters beyond media. When platforms suppress independent Black news outlets, they don't just silence journalists—they strangle the ecosystem of Black businesses that advertise with us, the entrepreneurs we cover through initiatives like our Joe Manns Black Wall Street Awards, and the community organizations that use our platform to reach constituents.

Algorithmic discrimination has economic consequences. When BMORENews loses reach, Black-owned businesses lose visibility. When our videos get throttled, community events go unattended. When our posts get shadowbanned, important local news never reaches the people who need it most.

This is why I’ve spent the past three years documenting these patterns academically while continuing to serve our community journalistically. Someone needs to connect the dots between the anecdotal experiences of individual publishers and the systemic nature of platform discrimination. Someone needs to translate lived experience into research that can inform policy.

Lee’s doing that work in Congress. I’m doing it in Baltimore and in academia. And we need more voices joining this fight.

The Carceral Internet

Here’s what keeps me up at night: The same logic that built mass incarceration—algorithmic risk assessment, predictive policing, automated decision-making that somehow always predicts Black people as higher risk—now governs online spaces. The digital world promised democratization. Instead, we got digital redlining.

Platforms don’t call it discrimination. They call it “engagement optimization” or “content ranking” or “community standards enforcement.” But when those neutral-sounding systems consistently reduce the reach of Black media, Black businesses, Black voices—what else can we call it?

Lee gets this. “If technology is shaping people’s lives,” she said, “then we have to mitigate the harms that it can cause—or in some cases are already causing.” Present tense. Already causing. Because it’s happening now, every day, to outlets like mine and thousands of others.

What Happens Next

Lee’s previous efforts, including the Eliminating Bias in Algorithmic Systems Act, stalled in Congress. This time might be different. Public awareness of AI bias is growing. More people understand that algorithms aren’t neutral—they’re trained on biased data by humans with biases, deployed by companies with profit motives that don’t always align with civil rights.

But legislation needs support. Maryland’s representatives need to hear that their constituents care about algorithmic discrimination. Black media outlets need to document and share their experiences. Researchers need to provide the empirical evidence that turns anecdotes into undeniable patterns.

I’m ready to do my part—as a journalist, as a researcher, as someone who’s lost too much to algorithmic suppression to stay silent. I’ve reached out to Congresswoman Lee’s office to offer my research and testimony. I’m calling on other affected publishers to do the same.

A Personal Note

Twenty-three years ago, I founded BMORENews.com to give Black Baltimore a voice in the digital age. I’ve interviewed thousands of community members, covered countless elections, recognized over 3,000 Black entrepreneurs through our awards program. I’ve built an archive of nearly 8,000 video clips documenting Maryland’s evolution—a historical record of communities often erased from mainstream narratives.

I didn’t build all that to watch algorithms destroy it.

Summer Lee’s AI Civil Rights Act won’t solve every problem facing independent Black media. Platform precarity runs deeper than any single piece of legislation can address. But it’s a start. It’s Congress saying that algorithmic discrimination is real, harmful, and illegal.

For the first time in years of fighting this fight, I feel like someone in power is actually listening.

Now we need to make sure they act.


Doni Glover is the founder and CEO of BMORENews.com and a DBA candidate at the University of Maryland’s Robert H. Smith School of Business, where his research focuses on platform precarity and algorithmic discrimination against media publishers serving underrepresented communities. He is available for interviews and testimony regarding algorithmic bias and its impact on independent media. Contact: doni@bmorenews.com
