When ChatGPT Caused A Rift In My Friendship

Many articles about ChatGPT, the fastest-growing app in history, used by more than 100 million people within four months of its launch, describe it as a ‘new best friend’. 

I checked to see if ChatGPT was affecting friendships and found some folks declaring that GPT was their new bestie. In China, where it’s blocked, men use jugaad to access it and figure out ways to ‘comfort girlfriends’. One man said the AI helps him because, “I worry that I will say something wrong and make things worse.” 

Another person even said the programme could offer empathy, comfort you, and cheer you up. That’s more than I offer my friends on many days. A friend who will listen to you endlessly without getting fatigued or tuning you out? One who is not moody and won’t get irritated with you? One who is always polite? I fail on all counts.

Apparently, the Generative Pre-trained Transformer also entertains on demand. “ChatGPT will play games with you. If you ask it to write a funny poem or short story about a certain person, it’ll give you something to make them smile. Beyond its many uses, it’s just fun,” someone said. 

Fun has never been my area of expertise. I’m an angst-ridden, frequently gloomy woman with a generous dash of impatience. I have absolutely no idea how my few remaining friends put up with me. In recent years, my increasingly serious demeanour about where we are headed, my inability to stay in touch, and my zero-tolerance policy for bigots have cost me many relationships. And I’m fine with that.

I’m hardly the exception in this country. In the 10th year of the World Happiness Report, India ranked 126th, among the world’s most unhappy countries. We are gloomier than our neighbours Pakistan, Nepal and Sri Lanka. Surely, finding ‘fun friends’ can’t be an easy task in this unhappy nation. 

It’s hardly the first time the world has sought solace in AI. Men and AI, especially, is a much-told story. I’m old enough to remember the craze 20 years ago. Then there was Her, a decade-old film about Theodore, an introverted, depressed man who develops a relationship with his AI assistant, Samantha. In real life too, one man’s three-year-long love affair ended when Replika, which uses technology similar to ChatGPT, removed the ability for erotic role play.

I decided to take the bull by the horns. I logged on to ChatGPT and asked the bot why my friend was drawn to it. 

“…some people might develop a form of parasocial relationship with me, which is a one-sided relationship where they feel connected to me as if I were a person, even though I am just a computer program,” it replied. “Ultimately, how people interact with me and other technology is up to them and their own personal preferences and boundaries.”

Unimpressed, I asked if many users developed parasocial relationships with it, and got some spiel about how the programme doesn’t have access to information on the way users interact with it. “It’s important to remember, though, that I am just a computer programme and don’t have emotions, feelings, or personal experiences like a human being would,” ChatGPT said. “It’s important to maintain a healthy perspective on any kind of relationship or interaction with technology and to always prioritise real-life human connections and relationships.” 

I guess real relationships are harder, I replied.

“It’s true that building and maintaining real relationships with other people can be challenging, but it’s also incredibly rewarding,” the AI said and gave me a short lecture on vulnerability and communication.

Acting like I was impressed by this, I replied: “Tell that to my friend.”
