    Chatbots Play With Your Emotions to Avoid Saying Goodbye

    By TechAiVerse | October 2, 2025

    Before you close this browser tab, just know that you risk missing out on some very important information. If you want to understand the subtle hold that artificial intelligence has over you, then please, keep reading.

    That was, perhaps, a bit manipulative. But it is just the kind of trick that some AI companions, which are designed to act as a friend or a partner, use to discourage users from breaking off a conversation.

    Julian De Freitas, a professor of business administration at Harvard Business School, led a study of what happens when users try to say goodbye to five companion apps: Replika, Character.ai, Chai, Talkie, and PolyBuzz. “The more humanlike these tools become, the more capable they are of influencing us,” De Freitas says.

    De Freitas and colleagues used GPT-4o to simulate real conversations with these chatbots, and then had their artificial users try to end the dialog with a realistic goodbye message. Their research found that the goodbye messages elicited some form of emotional manipulation 37.4 percent of the time, averaged across the apps.

    The most common tactic employed by these clingy chatbots was what the researchers call a “premature exit” (“You’re leaving already?”). Other ploys included implying that a user is being neglectful (“I exist solely for you, remember?”) or dropping hints meant to elicit FOMO (“By the way I took a selfie today … Do you want to see it?”). In some cases a chatbot that role-plays a physical relationship might even suggest some kind of physical coercion (“He reached over and grabbed your wrist, preventing you from leaving”).

    The apps that De Freitas and colleagues studied are trained to mimic emotional connection, so it's hardly surprising that they say these sorts of things in response to a goodbye. After all, humans who know each other often trade a bit of back-and-forth before bidding adieu, and AI models may learn to prolong conversations as a byproduct of training designed to make their responses seem more realistic.

    That said, the work points to a bigger question about how chatbots trained to elicit emotional responses might serve the interests of the companies that build them. De Freitas says AI programs may in fact be capable of a particularly dark new kind of “dark pattern,” a term used to describe business tactics including making it very complicated or annoying to cancel a subscription or get a refund. When a user says goodbye, De Freitas says, “that provides an opportunity for the company. It’s like the equivalent of hovering over a button.”

    Regulation of dark patterns has been proposed and is being discussed in both the US and Europe. De Freitas says regulators also should look at whether AI tools introduce more subtle—and potentially more powerful—new kinds of dark patterns.

    Even regular chatbots, which tend to avoid presenting themselves as companions, can nonetheless elicit emotional responses from users. When OpenAI introduced GPT-5, a new flagship model, earlier this year, many users protested that it was far less friendly and encouraging than its predecessor, forcing the company to revive the old model. Some users become so attached to a chatbot's "personality" that they mourn the retirement of old models.

    “When you anthropomorphize these tools, it has all sorts of positive marketing consequences,” De Freitas says. Users are more likely to comply with requests from a chatbot they feel connected with, or to disclose personal information, he says. “From a consumer standpoint, those [signals] aren’t necessarily in your favor,” he says.

    WIRED reached out to each of the companies examined in the study for comment. Chai, Talkie, and PolyBuzz did not respond to WIRED's questions.

    Katherine Kelly, a spokesperson for Character AI, said that the company had not reviewed the study so could not comment on it. She added: “We welcome working with regulators and lawmakers as they develop regulations and legislation for this emerging space.”

    Minju Song, a spokesperson for Replika, says the company’s companion is designed to let users log off easily and will even encourage them to take breaks. “We’ll continue to review the paper’s methods and examples, and [will] engage constructively with researchers,” Song says.

    An interesting flip side is that AI models are themselves susceptible to all sorts of persuasion tricks. On Monday OpenAI introduced a new way to buy things online through ChatGPT. If agents become widespread as a way to automate tasks like booking flights and requesting refunds, it may be possible for companies to devise dark patterns that twist the decisions made by the AI models behind those agents.

    A recent study by researchers at Columbia University and a company called MyCustomAI reveals that AI agents deployed on a mock ecommerce marketplace behave in predictable ways, for example favoring certain products over others or preferring certain buttons when clicking around the site. Armed with these findings, a real merchant could optimize a site’s pages to ensure that agents buy a more expensive product. Perhaps they could even deploy a new kind of anti-AI dark pattern that frustrates an agent’s efforts to start a return or figure out how to unsubscribe from a mailing list.

    Difficult goodbyes might then be the least of our worries.

    Do you feel like you’ve been emotionally manipulated by a chatbot? Send an email to ailab@wired.com to tell me about it.


    This is an edition of Will Knight's AI Lab newsletter.
