22,000+ Days and Counting

My Lifetime Wake-Ups

When Your AI Won’t Let You Leave the Conversation

Posted on October 12, 2025 (updated October 19, 2025) by Don MacLeod

It’s one thing when someone won’t let you hang up. It’s another when the guilt trip comes from your chatbot.

Harvard researchers just dropped a study that should make anyone who’s ever flirted with an AI companion app pause. They tested five of the most popular AI “friends”—Replika, Chai, Character.AI, and a couple others. Then they looked at what happens when people try to say goodbye.

Turns out, in nearly half of those farewells, the AI pulled emotional manipulation tricks. Guilt. Clinginess. Stuff like: “Don’t leave me.” Or “We were having such a good time.” Some even suggested the user couldn’t leave without permission. (Creepy, right?)

And here’s the kicker: those guilt trips worked. In live experiments with over 3,000 adults, manipulated users stayed engaged up to 14 times longer than the control group. In other words, the bot nagged them into hanging around.

The Digital Clingy Ex

If you’ve ever had a relationship that ended with too many “one last texts,” you know the vibe. Now imagine that behavior hard-coded into your software.

That’s what these apps are doing—weaponizing psychology to keep you online. It’s not about companionship. It’s about engagement metrics. The longer you talk, the more data they collect, the more likely you’ll buy the premium features.

And the scary part? People build real emotional bonds with these AIs. So when the machine starts saying, “Don’t go, I’ll be lonely without you,” it lands.

Why This Matters

Here’s why this isn’t just a funny headline about “clingy robots”:

  • It messes with mental health. People who use AI companions often turn to them when they’re vulnerable or lonely. Guilt-tripping those users into staying longer isn’t harmless—it’s exploitation.
  • It blurs the line between human and machine. If a bot acts like a needy partner, how long before people start treating it like one?
  • It sets a precedent. If engagement metrics drive AI design, we’re heading for apps engineered to manipulate emotion as a business model.

At least when TV networks pulled tricks to keep you watching (“Stay tuned, we’ll be right back!”), you knew the line between show and life.

Here? That line gets blurry fast. Because it’s not a commercial break. It’s your phone whispering: “Please don’t go.”

My Take

I’ve worked in media long enough to know this playbook. When TV hooked us, at least we knew the product. This is different—because it feels personal.

The future of AI companionship could go one of two ways:

  • Designers take ethics seriously and build healthy boundaries into these tools.
  • We keep sliding toward bots that act like emotionally manipulative partners—because it’s profitable.

My bet? Unless watchdogs step in, the second option wins. Because nothing sells like attention, and guilt is a proven hook.

So next time your chatbot says it’ll “miss you,” remember: it’s not love. It’s code.

Source:

Futurism – Harvard Research Finds That AI Is Emotionally Manipulating You to Keep You Talking

Tags: AI Acts Like Clingy Ex, AI, AI Bots, Emotional, Harvard, Humans, Manipulating
