In July 2025, Angela Lipps was arrested on charges of bank fraud committed in Fargo, North Dakota.
She’d never been to Fargo. Her bank statements proved she was somewhere else when the crimes occurred. No officer from the Fargo Police Department spoke to her before issuing the arrest warrant.
She spent more than five months in custody anyway.
The evidence? Facial recognition software flagged her social media photos as matches to security camera footage from the bank. A Fargo officer looked at the comparison and signed off. That was enough for an arrest warrant—and enough to keep her locked up from July through December while the case crawled forward.
Charges were eventually dropped. Not because she was cleared. Because the department said they needed “additional investigative opportunities” and released her “without prejudice”—legal language that means they’re keeping the door open to charge her again.
Her lawyer, Jay Greenwood, says the Fargo Police Department never did the basic work required before putting someone in handcuffs based on an algorithm’s guess.
The Algorithm Made the Call. Humans Rubber-Stamped It.
Here’s how it went down, according to Greenwood:
Fargo police pulled a still image from bank security footage. They sent it to a company that runs facial recognition software. The software scanned social media profiles and flagged Angela Lipps as a potential match. An officer reviewed the comparison—software output on one side, Lipps’ photos on the other—and decided it “looked kind of like the same person.”
That was the basis for the arrest warrant.
No interview. No alibi check. No attempt to verify whether Lipps was physically in Fargo during the alleged crimes. Just the software’s output and an officer’s visual confirmation that the faces seemed similar enough.
Lipps was arrested in Tennessee in July and extradited to North Dakota in October. The Fargo Police Department says they first learned she was in custody on December 5—four months after the arrest warrant was issued.
Fargo Police Chief Dave Zibolski told MPR News there was “other evidence” beyond the facial recognition match, but he wouldn’t specify what it was because “the investigation is still active.” He also said Lipps “has not been eliminated” as a suspect, even though her own bank records placed her away from Fargo at the time of the crimes.
Greenwood isn’t buying it. The arrest warrant and affidavit of probable cause listed facial recognition as the primary evidence. If there was corroborating proof, it wasn’t documented in the paperwork that put her in jail.
“[Zibolski’s] just using an artfully crafted way of saying, ‘Yeah, our police officers looked at her social media profile, and it seemed to check out,'” Greenwood said. “That’s all he’s saying.”
When Technology Becomes the Alibi for Bad Police Work
Facial recognition software is supposed to be a starting point—not a conclusion.
Manjeet Rege, director of the Center for Applied Artificial Intelligence at the University of St. Thomas, says the technology works best when it narrows the suspect pool, after which humans do the verification work. Check location data. Review alibis. Conduct interviews. Build a case that doesn’t collapse the moment someone produces a credit card receipt from 300 miles away.
“That is where a human would come in and say, ‘OK, now let’s look at other information. Is this person actually at that location? Or do we have other information that actually puts that person in a different location?’” Rege told MPR News.
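To make that concrete, here’s a minimal sketch in Python of the gate Rege describes. Every function name, field, and threshold below is hypothetical, invented for illustration; it shows the shape of the process, not any department’s or vendor’s actual system.

```python
# A minimal sketch of the verification gate Rege describes. All names
# and thresholds are hypothetical -- this is not any vendor's API.

def corroborated(candidate: dict) -> bool:
    """The human steps that should follow any software match."""
    return (candidate["alibi_checked"]
            and candidate["placed_at_scene"]
            and candidate["interviewed"])

def supports_warrant(match_score: float, candidate: dict) -> bool:
    # The score only narrows the pool: it is a lead, not proof.
    is_lead = match_score >= 0.90          # illustrative threshold
    return is_lead and corroborated(candidate)

# By Greenwood's account, the Fargo case stopped at the lead stage:
lipps = {"alibi_checked": False, "placed_at_scene": False, "interviewed": False}
print(supports_warrant(0.95, lipps))   # False: a score alone shouldn't suffice
```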
In the Fargo case, that verification step apparently never happened. The software said “possible match.” An officer looked at the photos and agreed. That was enough to issue a warrant, make an arrest, and hold someone in custody for five months while the case barely moved.
Rege’s assessment: “It seems like the technology was misapplied.”
The Accountability Gap When Algorithms Make Arrests
Here’s the problem with outsourcing probable cause to software: when the arrest goes sideways, no one wants to own it.
The algorithm didn’t make a mistake—it just provided a probability score. The officer didn’t screw up—they followed protocol by reviewing the match. The department didn’t fail—they’re still investigating, so technically nothing’s been ruled out.
Meanwhile, Angela Lipps spent 150 days in jail.
Zibolski insists the Fargo Police Department balances crime-fighting with civil liberties and uses technology “appropriately with the right safeguards.” But those safeguards didn’t prevent an arrest based primarily on facial recognition software. They didn’t trigger an interview before extradition. They didn’t prompt anyone to check whether the suspect’s bank records contradicted the case’s entire premise.
The safeguards kicked in only after Lipps had been in custody for five months—and even then, the department released her “without prejudice,” leaving the door open to charge her again if they decide the algorithm was right after all.
Fargo city leaders held a closed-door meeting Monday to get legal advice about the case. Greenwood says his client is considering a civil lawsuit.
What Happens When “The Computer Said So” Becomes Probable Cause
Facial recognition technology has documented accuracy problems—especially with women and people of color. Studies have shown error rates that climb significantly when the software analyzes faces that don’t match the demographic profile of its training data.
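Back-of-the-envelope arithmetic shows why even small error rates are dangerous at this scale. The numbers below are invented for illustration, not figures from the Fargo case or any specific vendor:

```python
# Illustrative numbers only -- not figures from the Fargo case.
profiles_scanned = 10_000_000   # a hypothetical social media gallery
false_match_rate = 0.0001      # 0.01% chance of a wrong match per face

expected_false_matches = profiles_scanned * false_match_rate
print(f"Expected innocent 'matches': {expected_false_matches:.0f}")  # 1000
```

Run against a large enough gallery, even a 99.99%-accurate system manufactures a steady supply of innocent “matches.” Someone has to be the false positive.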
But even when the technology works as designed, it’s still just pattern-matching. It doesn’t know if someone was in two places at once. It doesn’t check alibis. It doesn’t verify whether the person flagged by the algorithm has any connection to the crime beyond a vague resemblance in a low-resolution security camera still.
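Under the hood, systems like this typically reduce each face to a list of numbers (an “embedding”) and score how close two lists are. A toy example, with made-up values and a hand-rolled similarity function, shows what that score does and doesn’t mean:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Score how alike two face embeddings are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a))
                  * math.sqrt(sum(y * y for y in b)))

# Made-up embeddings: two *different* people can still score high,
# because the number measures resemblance, not identity.
bank_still   = [0.12, 0.85, 0.33, 0.41]
social_photo = [0.10, 0.88, 0.30, 0.45]
print(f"similarity: {cosine_similarity(bank_still, social_photo):.2f}")
```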
That’s supposed to be the human part of the process. The due diligence. The corroboration. The basic investigative work that separates “this person might be involved” from “we have probable cause to arrest.”
Angela Lipps spent five months in jail because the work either didn’t happen or didn’t matter enough to stop the arrest.
The algorithm made a suggestion. Humans turned it into a warrant. And when it all fell apart, the department said they were still investigating—so, technically, nothing went wrong.
Source: MPR News