When Police Facial Recognition Shows Up Uninvited — Everyone Pretends They Are Surprised

Posted on December 11, 2025 by Don MacLeod

I was reading that AP piece about Edmonton — the one where police facial recognition shows up like an uninvited cousin who somehow already has a key — and I had to laugh. Not a big laugh. More of a tired exhale you get when the pattern is so obvious the only mystery is who will pretend to be shocked first.

Edmonton police say they’ve been running comparisons through an Axon tool. Axon says they never approved facial recognition in the product. Everyone swears they only learned about this after someone else told them. And the whole thing has that faint smell of plausible deniability — like when airlines claim they “value your comfort” while handing you nine square inches of legroom and a packet of peanuts so small you need tweezers.

The absurd part? Nobody seems to know who flipped the switch. Or that’s the story — because that’s the move with tech creep: claim surprise, issue a statement, and act like the AI installed itself overnight.

Yeah… that tracks.

A Subheading That Actually Says the Quiet Part: How Police Facial Recognition Slips In

There’s a rhythm to this stuff. First, someone adopts a “harmless” tool — mostly a camera, maybe a database. Then a vendor adds a feature “for testing.” Someone somewhere quietly uses it because it’s there — and, well, cops are like the rest of us when presented with shiny buttons. Suddenly the city realizes it’s running AI matching on citizens who definitely didn’t sign up for this.

But here’s the part that got me — Axon distancing itself so fast you could smell the rubber on the pavement. They said facial recognition isn’t approved in their systems, Edmonton shouldn’t have enabled it, and they’re launching an internal investigation.

Which is adorable — in the way watching someone try to put toothpaste back in the tube is adorable.

Mid-sentence reset here because the thought keeps circling: the company says it didn’t ship facial recognition, yet the software allows an officer to upload footage and get a match. So either the system grew a brain stem over the weekend or somebody knew this was happening and waited for the PR blowback before remembering the ethics guidelines.

The Accountability Gap You Could Drive a Zamboni Through

Nobody wants to own AI mistakes — especially in policing, where “oops” isn’t a cute word. It’s arrests. It’s lawsuits. It’s people getting flagged by algorithms that don’t even know what accuracy means outside the marketing deck.

So you get this dance.
Police: “We thought it was allowed.”
Company: “We never said that.”
Public: “Great, so the AI is running on vibes now?”

And the language in stories like this — the hedged statements, the vague descriptions, the sudden “we are reviewing” — it all signals the same thing: the machine is ahead of the paperwork. Happens every time a tool meant for storage or convenience or officer safety suddenly becomes an identification engine.

The creep is the story.
The denial is the punchline.

Edmonton Is the Example — Not the Outlier

This one city got caught in the spotlight, but it’s everywhere. Cities “ban” police facial recognition while quietly using vendor tools that connect to third-party databases. Vendors say their products don’t include the tech — then someone uncovers a buried feature. Officers swear they didn’t know it was running — because honestly, half the time they’re learning right alongside the machine.

Here we are — pretending this is all surprising rather than the most predictable cycle in modern policing. Build a tool. Add a feature. Watch it expand beyond the intention. Then scramble to find the adult in the room.

Ah. We’re screwed.

The funniest part — if funny is the word — is that we act as if AI overreach begins with the machine. But the creep isn’t artificial. It’s human. Always has been. The AI is just the part that gets caught.

Categories: AI, Media, Weird News
Tags: AI accountability, algorithmic bias, Axon, Edmonton police, facial recognition creep, law enforcement AI, police facial recognition, policing technology, privacy concerns, public safety technology, surveillance tech, tech ethics
