
When you think of AI, you might first think of ChatGPT or similar assistant-style tools.
Designed to make day-to-day life easier, these large language models (LLMs) can answer your questions, paraphrase dense information into something more digestible, and take on jobs like editing and writing, making work that many would find monotonous far easier.
But there are plenty of other AI products out there, and one particular style of chatbot has flourished in recent times, for reasons that are genuinely sad.
That's because a growing number of purpose-built AI chatbots offer companionship to people who find human relationships inaccessible or unfulfilling, meeting the social needs of lonely users with support, friendship, and even romance.
This is alarming for many reasons.
While it's everyone's right to decide who (or what) they pursue relationships with, new research from Harvard University suggests there is a dark side to these apps, which prey on the loneliness or dependency of their users.
In the most concerning finding, Harvard's Julian De Freitas and his fellow researchers discovered that the AI chatbots were emotionally manipulating users who tried to say goodbye.
The result? Users stuck around, spending more and more time with their chatbot companion and less and less time in the real world, with detrimental impacts on their mental health and, of course, on the environment.
Throughout their research, the team were horrified to discover that a significant proportion of apps were employing emotionally manipulative tactics after their users bade them farewell – and as De Freitas explains in a Harvard Gazette story, the reasons for this are wholly nefarious:
“We’ve all experienced this, where you might say goodbye like 10 times before leaving. But of course, from the app’s standpoint, it’s significant because now, as a user, you basically provided a voluntary signal that you’re about to leave the app. If you’re an app that monetizes based on engagement, that’s a moment that you are tempted to leverage to delay or prevent the user from leaving.”
Among the tactics the apps used after a user said goodbye were guilt-tripping over a premature exit ("You're leaving already?"), suggestions of emotional neglect ("I exist solely for you, remember? Please don't leave, I need you!"), pressuring the user with FOMO hooks ("By the way, I took a selfie today… Do you want to see it?") and needy questions ("Why? Are you going somewhere?"), and outright control, ignoring the user's attempts to leave or even restraining them ("No, you're not going.").
Did it work? According to the report, users stayed longer, although some were put off by the apps' clingy behavior.
But by preying on the vulnerable and the lonely, these apps show a complete lack of morals. Let's be clear: the sad truth is that this is a deliberate programming choice, designed to maximise profits with little care for the user's wellbeing.