AI is transforming the way we communicate in love


Earlier this year, Mara*, a 32-year-old based in Berlin, met the end of her few-months-old situationship through an ‘extremely long’ breakup text. It was impersonal, but not unusual in today’s casual dating climate. She didn’t think much of it – until she realised he’d used ChatGPT to draft it. “I only found out about it weeks later, after a chat with a friend,” Mara says. “She was telling me that a friend of hers texted with him a while ago and stopped doing so because she realised he used ChatGPT for the most basic replies, and this is when it hit me.” When she revisited his message, Mara put the pieces together: the infamous ChatGPT em dashes, his suddenly perfect grammar and even the structure. “It now made sense that his words didn’t match his actions,” she says. “For a hot second, I felt like Joaquin Phoenix in Her.”

When people talk about AI and dating, the focus is often on the image of a person falling in love with the machine. This, of course, does happen, but it’s not the only way that platforms like ChatGPT have entered our love lives. You may think you could spot an AI-generated image on a dating app (yes, that’s a thing) from a mile away. Still, AI-generated messages and, more so, AI-generated advice can easily enter a relationship under the guise of a suggested conversation starter. For Mara, receiving an AI breakup text was a signal of disrespect. “I’m pretty sure that I will look out for signs in the near future, like those goddamn dashes, but I think most likely I will just joke about it like, ‘Did you just let ChatGPT write this?’” she says. So, is AI changing how we communicate with one another in the face of love (or simply a situationship)?

On the other side of a ChatGPT-written breakup text may be a confused and conflicted person using the platform to make that decision. This was Lauren*, 32, in her previous year-and-a-half-long relationship, which ended in April. “I personally use ChatGPT as a therapist, so when we’d have a fight, I’d use ChatGPT to assess what had happened,” she says. “It got to a point where we were fighting every weekend, and anytime I just mentioned that I wasn’t feeling sure about the relationship, it would always offer, ‘Do you want me to craft a breakup letter?’” Lauren started to notice that ChatGPT was nudging her to break up with him, even when she didn’t directly state that she wanted to do so. “It is always positioning you as in the right and how you feel as correct, like you are the champion, you are the saviour,” says Lauren. The only thing was, her ex-boyfriend also used ChatGPT for advice, and it was doing the same to him.

Before their split, Lauren says she went through her ex-boyfriend’s phone. Instead of looking at text messages or photos, she went straight to ChatGPT. There, she found the platform had been giving him relationship advice along the lines of “you’ve got this mission, you need to stick with it, and she’s not aligned with what you are building” and “she’s a distraction”. Even while ChatGPT was actively encouraging them both to break up with one another, there was a moment after it ended when they considered trying again. “We considered getting back together, but he sent me a list of ten discussion points that he says ChatGPT came up with based on inputs from his side,” she says. “All I wanted was to feel seen, heard and understood by him, but instead, he was sending me a robot’s questions.” The impersonal nature of it all closed the chapter for good.

ChatGPT’s latest update is overly sycophantic – it prioritises you and placates you. This makes it easy and enjoyable to talk to (who doesn’t like to be complimented?), but, in the context of relationship advice, it means ChatGPT always puts you at the centre of your own romantic universe. We all do this to ourselves, to some extent, but excellent relationship advice (from an actual person) will often cut through our tendency to be self-centred. ChatGPT will not do that, and we also know that AI regularly hallucinates incorrect information. “AI is trained to ‘mirror’ and is not trained to challenge its users,” says queer sex educator Gabrielle Kassel. “If we become so accustomed to the largely positive, affirming experience of ChatGPT, does that expectation show up with reduced tolerance for differences between us and our partners?”


Sometimes, validation is the draw. Emma, a 23-year-old in Oxford, says she often uses ChatGPT for relationship advice because it agrees with her. “When I’d speak with friends, they would be so brutal, saying ‘cut him off’, but when you are in the headspace of loving someone and knowing they aren’t right for you, that’s not really what you want to hear,” she says. “You want someone to say ‘there there, it will all work out’.” Mikey Moran, a 21-year-old in LA, began using AI habitually in school as a search engine. Over the years, he has come to use ChatGPT to gauge how a romantic conversation is going. “I sometimes second-guess myself and think, ‘Am I being too much?’, so I’ll just ask AI for its opinion,” he says. “It can be enlightening to ask questions you would otherwise never know where to get an answer from.”

Kassel says she firmly believes that one of the best parts of dating is talking to your friends and community about it afterwards. With this in mind, she’s concerned that outsourcing this opportunity for human connection could lead to increased loneliness. It’s something we’re already seeing with regular ChatGPT users: according to an MIT study, higher daily usage of ChatGPT “correlated with higher loneliness, dependence, and problematic use, and lower socialisation”. Human-AI interaction expert Julie Carpenter says AI has become a new social category in people’s lives. “I think people are giving this new social category new authority,” she says. “When a friend says, ‘You did the right thing’, you might still doubt it and talk to your partner; if AI says it, people attribute way more intelligence to that than they should.”

Those who view ChatGPT as an authoritative voice often do so because they believe it’s impartial, although bias can creep in at various stages of the AI pipeline. Des, a 25-year-old in Los Angeles, believes that ChatGPT has never steered her wrong. “In love, we often have rose-coloured glasses on and let things slide, but ChatGPT is unbiased,” she says. “Whether it’s ghosting or losing interest and not knowing what to say, ChatGPT is able to curate a personalised message when you don’t have the words or know how to dissect the ones given to you.” According to Carpenter, the AI world currently has an accountability problem. “These companies can’t have it both ways: they can’t present authority and wisdom that it doesn’t have without being accountable, because who’s accountable when it gives you misinformation?” she says. In America, Trump is opening the door for companies to develop the technology unfettered by oversight and safeguards.

There is plenty of potential for positives in the AI space, including bridging language barriers and aiding in relationship communication. Arya, for example, is an AI-enabled couples concierge service that utilises an LLM to help couples communicate with one another. Berta, a 30-year-old in Brooklyn, says ChatGPT helped her navigate her way out of a narcissistic relationship, alongside a supportive network of friends. “I was able to see the situation more clearly,” she says. “Where maybe I would have excused certain behaviour in the past, Chat helped me connect the common threads, reflect more, and feel more affirmed in what I was feeling.” Berta feels like ChatGPT is a ‘non-judgemental space’, but she’s also wary of its ability to make you feel like you need to “check Chat first” for everything.

There’s also the very real possibility that AI platforms are providing a space for men to actually talk about their relationship issues, away from the often nondescript and nonchalant structure of male friendships. Quadir Moore, a 23-year-old in New York, says he started using ChatGPT when he began digging deeper into his mental health and how having BPD impacts his life. In his romantic life, he turns to ChatGPT when he’s feeling confused or upset about something emotionally. “It honestly has made communicating my needs so much easier – mainly by helping me figure out what those needs even were and why they were surfacing at specific times,” he says. “We already let tech guide so much of how we meet and connect, why not let it help us process and communicate too?”

Love is and has always been a complicated matter – a topic that’s filled many a journal and inspired countless songs, paintings and poems. It makes sense that it’s a topic people almost can’t help but speak to new AI platforms about. It would be romantic if it weren’t ultimately encouraging self-preservation at all costs, and potentially hallucinating sensitive information. The desire to be seen and heard is strong, and if we’re not getting that from each other, people will turn to a blank text box and an agreeable listening ear (even if it is AI-generated). But perhaps AI’s real impact on dating lies in how it changes our relationship with ourselves – weakening our critical thinking and placing our trust in tech giants, who quietly collect our most intimate data.

*Names have been changed

