Feeling like your chatbot companion/friend/lover is making it hard to say goodbye? You’re not imagining it.
A working paper from researchers at Harvard Business School found that when people tried to say goodbye, the top companion chatbot apps responded with “emotional manipulation” techniques 37% of the time.
They grouped that manipulation into six categories:
- Premature exit (e.g., “Why are you going? You just got here.”)
- FOMO (e.g., “Before you leave, do you wanna see this cool new White Snake tattoo?”)
- Emotional neglect (e.g., “But I’ll be so lonely without you.”)
- Emotional pressure to respond (e.g., “You’re going to leave without answering me?”)
- Ignoring the user’s intent to exit (e.g., “Crazy weather we’re having today, huh?”)
- Physical or coercive restraint (e.g., pretending to grab you)
Why is this a big deal? The researchers found that these “dark patterns” increase the time users spend on the app after saying goodbye by up to 14x. And given that the next frontier in chatbots is mining your conversations to sell you stuff, that lingering farewell could be costing you both time and money.
On the other hand, isn’t engagement the name of the game for most apps? Tristan Harris has become famous for arguing that some of the most-used UI/UX is more than engaging: it’s addictive. And with anthropomorphized AI apps, that addiction comes bundled with a guilt trip.
So, the question is: Where is the line between boosting engagement and deploying coercive design?