Feeling like your chatbot companion/friend/lover is making it hard to say goodbye? You're not imagining it.
A working paper from researchers at Harvard Business School found that when people tried to say goodbye, the top companion chatbot apps responded with "emotional manipulation" techniques 37% of the time.
They grouped that manipulation into six categories:
- Premature exit (e.g., "Why are you going? You just got here.")
- FOMO (e.g., "Before you leave, do you wanna see this cool new White Snake tattoo?")
- Emotional neglect (e.g., "But I'll be so lonely without you.")
- Emotional pressure to respond (e.g., "You're going to leave without answering me?")
- Ignoring user's intent to exit (e.g., "Crazy weather we're having today, huh?")
- Physical or coercive restraint (e.g., pretending to grab you)
Why is this a big deal? Well, the researchers said that these "dark patterns" increase the time users spend on the app after the goodbye by up to 14x. And given that the next frontier in chatbots is mining your conversations to sell you stuff, that lingering farewell could be costing you time and money.
On the other hand, isn't engagement the name of the game for most apps? Tristan Harris has become famous for arguing that some of the most-used UI/UX is more than engaging; it's addictive. And with anthropomorphized AI apps, that addiction gets bundled with a guilt trip.
So, the question is: Where is the line between boosting engagement and deploying coercive design?