Overview:

A 16-year-old Black student-athlete, Taki Allen, was handcuffed at gunpoint by police officers at Kenwood High School in Baltimore County after an artificial intelligence "gun detection" system mistook a crumpled Doritos bag for a firearm. The incident has sparked outrage over racial bias in AI systems and renewed demands for accountability for the human decisions that escalated it. The article calls for equity-centered approaches to safety, education, and accountability, and for community voices to be part of the solution.

I keep thinking about a teenager — Black, 16, a student-athlete — who did what kids do after practice: he ate snacks with friends and waited for a ride. Minutes later, eight police cars rolled up to Kenwood High School in Baltimore County. Officers drew their guns, ordered him to the ground, and handcuffed him. Not because he had a weapon. Because an artificial intelligence “gun detection” system mistook a crumpled Doritos bag for a firearm.

That child’s name is Taki Allen. His fear — and the fear of the students who watched — was real. Police body-camera video confirms officers discovered there was no gun, only a snack bag. Yet that didn’t stop them from treating him as a threat. What kind of world have we built where a computer’s bad guess can summon an armed response against a child?

Echoes of Trayvon Martin

If you hear echoes of 2012, you’re not alone. When America learned that Trayvon Martin was walking home with candy and a drink before he was profiled and killed, Black parents everywhere doubled down on “the talk” — that painful conversation every Black family knows: keep your hands visible, don’t run, say “yes, sir/no, ma’am,” and get home alive. Trayvon was 17 and unarmed. Those memories came rushing back the moment Taki Allen was forced to the pavement.

Let’s be clear about what happened: The AI vendor, Omnilert, issued an alert claiming to detect a possible gun. School administrators reviewed the footage and reportedly canceled the alert, but the principal — who is white — allegedly re-escalated the situation by contacting police anyway. Officers responded in force, guns drawn. Taki was detained, humiliated, and traumatized — all because an algorithm and an adult made the wrong call.


Adding insult to injury, the principal reportedly called the student three days later to “check in.” For a teenager still processing fear and embarrassment, that call must have felt hollow at best, manipulative at worst. The family has since retained civil rights attorney J. Wyndal Gordon, who told Sports Illustrated, “He was treated like a criminal for holding a bag of chips… That’s not safety—that’s a failure.”

Baltimore County Public Schools recently dismantled the department that once might have provided oversight for incidents like this. The district now insists it will “look into” what happened. But that’s not enough. Where is the outrage from those in charge? Where is the public apology? Where is the accountability for a principal who allowed fear — or bias — to override judgment?

Technology Isn’t Solely to Blame

Technology didn’t make this happen alone. Human decisions did. Someone saw that false alert, hesitated, then chose to act. Someone decided that calling the police was safer than calling a parent. Someone decided to point a weapon at a child. That’s not artificial intelligence — that’s human irresponsibility disguised as innovation.

And there’s another troubling layer. The so-called “intelligence” in AI systems like this one does not exist in a vacuum — it learns from data. Decades of research show that predictive algorithms, facial-recognition software, and surveillance technologies disproportionately flag or misidentify darker skin tones and Black bodies as threats. When racial bias gets coded into software, it reproduces injustice at machine speed. The harm multiplies because it looks “objective,” when in truth it’s just prejudice in digital disguise.

What Accountability Looks Like

Community voices must be part of the solution. Leaders like Chrissy M. Thornton, president and CEO of Associated Black Charities (ABC), have long pressed for equity-centered approaches to safety, education, and accountability — approaches that protect children without criminalizing them. ABC’s own investigative work uncovered additional truths about the Kenwood High School principal — facts that district officials appeared to hide behind the veil of a “personnel matter.” Once again, transparency took a back seat to protectionism. That’s unacceptable. If an adult’s actions put a child in danger, the public has a right to know.

This moment demands courage, not cover-ups. Baltimore County and school leaders should invite organizations like ABC, along with community and faith leaders, to conduct an independent, trauma-informed review that centers students and families — not bureaucracy. Real accountability must follow, including a public report on the failures in the chain of communication, a reassessment of the district’s AI policies, and mandatory training on racial bias in both technology and human response.

We’ve seen this story before. Trayvon Martin carried Skittles. Taki Allen carried Doritos. The packaging changes, the peril does not. “The talk” will continue in Black households because it has to. But this time, our response cannot stop with talk. It’s time for those in power — school officials, police, and policymakers — to face the consequences of their choices. No child in Baltimore County, or anywhere in America, should ever again be terrorized at gunpoint because an algorithm — and the adults who trusted it — couldn’t tell the difference between a snack and a threat.

This commentary first appeared at AFRO.com

Dr. Frances Murphy Draper is CEO and publisher of The AFRO-American Newspapers, www.afro.com.