Don MacLeod

22,000+ Wake-Ups Into This Lifetime

When Police Facial Recognition Shows Up Uninvited — Everyone Pretends They Are Surprised

Posted on December 11, 2025 by Don MacLeod

I was reading that AP piece about Edmonton — the one where police facial recognition shows up like an uninvited cousin who somehow already has a key — and I had to laugh. Not a big laugh. More of that tired exhale you get when the pattern is so obvious the only mystery is who will pretend to be shocked first.

Edmonton police say they’ve been running comparisons through an Axon tool. Axon says they never approved facial recognition in the product. Everyone swears they only learned about this after someone else told them. And the whole thing has that faint smell of plausible deniability — like when airlines claim they “value your comfort” while handing you nine square inches of legroom and a packet of peanuts so small you need tweezers.

The absurd part? Nobody seems to know who flipped the switch. Or that’s the story — because that’s the move with tech creep: claim surprise, issue a statement, and act like the AI installed itself overnight.

Yeah… that tracks.

A Subheading That Actually Says the Quiet Part: How Police Facial Recognition Slips In

There’s a rhythm to this stuff. First, someone adopts a “harmless” tool — usually a camera, maybe a database. Then a vendor adds a feature “for testing.” Someone somewhere quietly uses it because it’s there — and, well, cops are like the rest of us when presented with shiny buttons. Suddenly the city realizes it’s running AI matching on citizens who definitely didn’t sign up for this.

But here’s the part that got me — Axon distancing itself so fast you could smell the rubber on the pavement. They said facial recognition isn’t approved in their systems, Edmonton shouldn’t have enabled it, and they’re launching an internal investigation.

Which is adorable — in the way watching someone try to put toothpaste back in the tube is adorable.

Mid-sentence reset here because the thought keeps circling: the company says it didn’t ship facial recognition, yet the software allows an officer to upload footage and get a match. So either the system grew a brain stem over the weekend or somebody knew this was happening and waited for the PR blowback before remembering the ethics guidelines.

The Accountability Gap You Could Drive a Zamboni Through

Nobody wants to own AI mistakes — especially in policing, where “oops” isn’t a cute word. It’s arrests. It’s lawsuits. It’s people getting flagged by algorithms that don’t even know what accuracy means outside the marketing deck.

So you get this dance.
Police: “We thought it was allowed.”
Company: “We never said that.”
Public: “Great, so the AI is running on vibes now?”

And the language in stories like this — the hedged statements, the vague descriptions, the sudden “we are reviewing” — it all signals the same thing: the machine is ahead of the paperwork. Happens every time a tool meant for storage or convenience or officer safety suddenly becomes an identification engine.

The creep is the story.
The denial is the punchline.

Edmonton Is the Example — Not the Outlier

This one city got caught in the spotlight, but it’s everywhere. Cities “ban” police facial recognition while quietly using vendor tools that connect to third-party databases. Vendors say their products don’t include the tech — then someone uncovers a buried feature. Officers swear they didn’t know it was running — because honestly, half the time they’re learning right alongside the machine.

Here we are — pretending this is all surprising rather than the most predictable cycle in modern policing. Build a tool. Add a feature. Watch it expand beyond the intention. Then scramble to find the adult in the room.

Ah. We’re screwed.

The funniest part — if funny is the word — is that we act as if AI overreach begins with the machine. But the creep isn’t artificial. It’s human. Always has been. The AI is just the part that gets caught.

Categories: AI, Media, Weird News
Tags: AI accountability, algorithmic bias, Axon, Edmonton police, facial recognition creep, law enforcement AI, police facial recognition, policing technology, privacy concerns, public safety technology, surveillance tech, tech ethics
