There’s a strange kind of zombie apocalypse happening online — not with brains, but with bylines.
Across the web, forgotten newspaper URLs are being snatched up, revived, and repurposed into glossy, AI-generated “news sites.” They look real. They sound real. They even cite real people. The only thing missing? Humans.
And yes, Google is helping them spread.
How a teacher in Iceland got cloned
Last year, Icelandic teacher María Hjálmtýsdóttir wrote a charming piece for The Guardian about her country’s 36-hour workweek experiment. It was full of the kind of human texture that AI still can’t fake — anecdotes about her husband spending his extra free time chatting with fellow pigeon keepers.
A few months later, her story started showing up all over the internet — rewritten, repackaged, and stripped of life.
“Iceland switched to a 4-day workweek — Gen Z was right all along,” announced Dixie Sun News, a website that once belonged to a college newspaper.
“Iceland embraced the 4-day workweek in 2019: six years later, Gen Z’s vision has been realized,” said the Carroll County Observer, a defunct Maryland outlet turned content mill.
Even WECB.fm, once associated with Emerson College radio, joined in — a fake version of the station now publishing AI knockoffs of her story.
These sites are part of a new content underground: a ghost network of AI-powered factories built on expired domains and automation tools.
The ghost in the newsroom
Here’s the trick.
When an old newspaper shuts down or a business forgets to renew its domain, digital opportunists swoop in. They buy the address, slap up a “news site,” and feed it content generated by tools like WP Auto or Gravity Write.
The Farmingdale Observer is a textbook case. For decades, it covered high school sports and local events on Long Island. When it merged with another local paper, the site went dormant. Then in 2024, it came back from the dead — only now it was pushing stories about military tech, furniture trends, and “must-try life hacks.”
Within weeks, those stories were being featured on Google Discover, right alongside legitimate news.
That’s when people started noticing.
Fake writers, fake news, real reach
If you click through the Farmingdale Observer’s “staff” page, you’ll meet names like Rose Dixon and Bob Rubila, busy reporters who don’t exist. “Dixon” has published hundreds of articles on everything from nuclear energy to IKEA lamps to coins in freezers.
The “Bob Rubila” byline even seems to be a digital tribute to a real former journalist, Dave Gil de Rubio, who hasn’t worked at the paper in years.
It’s plagiarism by automation. And it’s everywhere.
The puppeteers
An Estonian marketing firm called Tremplin-Numerique appears to sit behind a network of these sites. When contacted, a representative named “Cyrielle” sent a sponsorship pitch deck offering paid articles across multiple fake outlets — including WECB.fm, Dog Magazine, and even international domains like Seneweb.
Prices ranged from $2 to $10,000, depending on traffic. For a bit extra, they’d write the content for you too.
In other words: AI clickbait, available wholesale.
How the machines took over
Tools like WP Auto and Gravity Write can scrape existing news, rewrite it, and publish it automatically. They handle the text, images, headlines, even the social media snippets.
It’s journalism without journalists — a conveyor belt of SEO-friendly content optimized for Google’s algorithm.
According to NewsGuard, nearly 1,300 AI-generated news sites are now operating in at least 16 languages. Some even use stolen photos and bios of real reporters.
It’s not illegal yet, but it’s definitely parasitic.
When Google’s algorithm meets the grifters
Google insists its spam policies ban expired domain abuse and deceptive publishing. But those rules mean little when these sites keep showing up in Google Discover and News feeds.
The reason is simple: the algorithm rewards engagement, not authenticity. If the AI version of a story generates more clicks than the human original, the system treats it as the “better” article.
That’s how a fake newsroom ends up outranking the reporter who wrote the real story.
Journalism’s new identity crisis
It’s one thing when AI writes product reviews or generates stock photos. It’s another when it hijacks local news brands that communities once trusted.
This wave of synthetic media doesn’t just mimic style — it steals credibility. The domain names themselves carry history. People still associate them with real reporting.
It’s the digital equivalent of someone buying your childhood home and turning it into a slot machine.
When propaganda slips through the cracks
The problem isn’t limited to plagiarism. Some of these AI-run sites are quietly amplifying foreign propaganda.
Publications like Glass Almanac and African in Space have been caught publishing glowing coverage of Chinese military technology, claiming the U.S. is on “high alert” and “losing the global sea power race.”
When automation meets geopolitics, the line between clickbait and covert influence starts to blur.
The human cost
The people who suffer most aren’t the legacy publishers or even Google. They’re the working journalists: the freelancers and local reporters whose work gets scraped, rewritten, and reuploaded by machines that don’t sleep or demand fair pay.
This isn’t innovation. It’s erosion.
And the worst part? Readers can’t tell the difference anymore.
A teacher’s warning from Iceland
María Hjálmtýsdóttir said she was initially flattered by the global attention her article received, until she realized none of it was real.
“You can’t trust anything,” she said.
As a high school teacher, she’s now gone old-school with her students: handwritten essays, oral exams, fewer screens. “We don’t trust anything they hand to us digitally anymore,” she told ProPublica.
That’s where we’re at — rebuilding trust with pens and paper, one test at a time.