Green was going through a breakup. The reasons for the split itself had been largely unremarkable by breakup standards: Two people, unable to meet each other’s needs and struggling to communicate, had decided it was best to part ways. So when Green’s ex reached out, unprompted, they were shocked.
The email itself was not notable. Green, a 29-year-old New Yorker, describes it as a typical post-breakup letter, an airing of grievances pointing out the ways in which expectations weren’t met. Still, the two were supposed to refrain from contacting each other. Green didn’t appreciate the email or its tone, and said as much to their ex.
“The response I received was that they did everything they could have to make sure it wasn’t offensive or mean—that they ran it through ChatGPT multiple times to ensure it wouldn’t hurt anyone’s feelings,” Green says. “ChatGPT doesn’t have the history we have, me and this partner, so I don’t know how it could possibly apply the context of our relationship.”
Green, a ChatGPT user who occasionally works through issues with the platform as well, was bemused at first, then upset. “I was taken aback that such a personal, caring relationship was reduced down to the opinion of ChatGPT, versus my own boundaries, or opinions, or needs,” they say. “It felt impersonal.”
Generative AI has rapidly infiltrated every aspect of modern life. It has permeated jobs across many different industries; cannibalized original art and stolen from creators to make its own; students use it to cheat; teachers use it for lesson plans; it’s radicalizing vulnerable users; it may soon help dictate new drug approvals. It profoundly impacts our environment. AI reanimates victims after their death to provide statements in court. People fall in love with OpenAI’s ChatGPT. Others use it for therapy.
Inevitably, it seems, AI has become the tool people are now using in the most human thing of all: the attempt to understand each other. Whether it’s analyzing texts from prospective partners to discern underlying meanings, settling disputes with friends, or trying to make peace with family, people are turning to ChatGPT to work out their interpersonal problems, alone. Alongside their earnest desires to better communicate with the people in their lives, the ChatGPT users WIRED spoke to still feel shame about using a computer to navigate their very human problems. All of them requested that their last names not be used due to privacy concerns.
There’s a risk that comes with offloading emotional labor to AI rather than another person. “Our friendships depend on those bonds,” says Daniel Kimmel, a clinical psychiatrist at Columbia University. “We do have to be mindful about understanding what are the ancestral ingredients to a functioning human society, functioning human friendship.”
Humans have spent hundreds of thousands of years developing our own “emotional simulator,” instincts that help us understand each other, says Kimmel. AI does not have that emotional experience. What it does have, Kimmel says, is “an extremely powerful predictive engine” that can interpret patterns and language.
“Therein lies the limitation of the AI: All that it can operate on is words,” Kimmel says. “Our model is built on something that is not words. It is ineffable.”
Kate, a 35-year-old living in Denver, has been using ChatGPT to help her analyze her relationships for about two and a half months. Her introduction to ChatGPT came by way of her job, where her team both uses the service on its own projects and advises companies on how to integrate it into theirs.
It wasn’t hard for Kate to make the leap from using ChatGPT professionally to personally. In the pre-AI days, she was already used to reading Reddit threads or Googling relationship questions like “how can you tell if the guy you’re dating is over his ex.” Kate says that wounds from past relationships still affect her behavior and emotions. “I’m clearly an anxious attachment style, which has manifested when I’m in early-stage relationships.”
About a month and a half into dating someone new, Kate found herself in a particularly anxious cycle. The new guy she was seeing was recently divorced and had a young child; it brought up some painful associations with an ex, and Kate worried about getting hurt again. She and the new guy had a long history of texting, so Kate exported their entire chat and loaded it into ChatGPT. “I asked it to analyze our text exchanges and give me a scorecard,” Kate says. That included questions about their respective attachment styles, the healthy aspects of their relationship, and, perhaps scariest of all: Who likes who more?
“I took it pretty hard, to be honest,” she says of the answers she received. ChatGPT told her that her current flame leaned toward an avoidant attachment style, which she wanted to steer clear of. “But it was really enlightening. I actually shared it with my girlfriends.” There was something comforting to Kate about having the entire relationship laid bare. “It was reassuring for me at the time, because if my head or thoughts can run away with me—Does he really like me? How does he feel?—it was incredible to have kind of like a third party that could really quickly experience things from my perspective.”
Kate says she’d feel flattered if the situation were reversed. “If I’m being really honest—wow, I’m honored I’m even top of mind enough for you to prompt ChatGPT about me,” she says. “I’d want to know what ChatGPT said and then assess how I feel.”
Still, she says, “it would really break my heart, too.”
ChatGPT now has so much information about Kate’s current relationship stored in its memory that she uses it “kind of like my therapist,” she says. It can also cause anxiety; ChatGPT might offer up a hypothesis about her partner’s behavior that suggests he’s pulling away, when in reality he’s just busy at work. She can sometimes spend hours entering prompts about her relationships. “I’m not saying it’s always the healthiest,” she says.
Kate’s real-life therapist is not a fan of her ChatGPT use. “She’s like, ‘Kate, promise me you’ll never do that again. The last thing that you need is like more tools to analyze at your fingertips. What you need is to sit with your discomfort, feel it, recognize why you feel it.’”
A spokesperson for OpenAI, Taya Christianson, told WIRED that ChatGPT is designed to be a factual, neutral, and safety-minded general-purpose tool. It is not, Christianson said, a substitute for working with a mental health professional. Christianson directed WIRED to a blog post citing a collaboration between the company and MIT Media Lab to study “how AI use that involves emotional engagement—what we call affective use—can impact users’ well-being.”
For Kate, ChatGPT is a sounding board without any needs, schedule, obligations, or problems of its own. She has good friends, and a sister she’s close with, but it’s not the same. “If I were texting them the amount of times I was prompting ChatGPT, I’d blow up their phone,” she says. “It wouldn’t really be fair … I don’t need to feel shame around blowing up ChatGPT with my asks, my emotional needs.”
Andrew, a 36-year-old man living in Seattle, has increasingly turned to ChatGPT for personal needs after a tough chapter with his family. While he doesn’t treat his ChatGPT use “like a dirty secret,” he’s also not especially forthcoming about it. “I haven’t had a lot of success finding a therapist that I mesh with,” he says. “And not that ChatGPT by any stretch is a true replacement for a therapist, but to be perfectly honest, sometimes you just need someone to talk to about something sitting right on the front of your brain.”
Andrew had previously used ChatGPT for mundane tasks like meal planning or book summaries. The day before Valentine’s Day, his then-girlfriend broke up with him via text message. At first, he wasn’t completely sure he’d been dumped. “I think between us there was just always kind of a disconnect in the way we communicated,” he says. “[The text] didn’t actually say, ‘hey, I’m breaking up with you’ in any clear way.”
Puzzled, he plugged the message into ChatGPT. “I was just like, hey, did she break up with me? Can you help me understand what’s going on,” he says. ChatGPT didn’t offer much clarity. “I guess it was maybe validating because it was just as confused as I was.”
Andrew has group chats with close friends whom he would typically turn to when talking through his problems, but he didn’t want to burden them. “Maybe they don’t need to hear Andrew’s whining about his crappy dating life,” he says. “I’m kind of using this as a way to kick the tires on the conversation before I really kind of get ready to go out and ask my friends about a certain situation.”
In addition to the emotional and social complexities of working out problems via AI, the level of intimate information some users are feeding to ChatGPT raises serious privacy concerns. Should chats ever leak, or should people’s data be used unethically, it’s more than just passwords or emails on the line.
“I have honestly thought about it,” Kate says, when asked why she trusts the service with private details of her life. “Oh my God, if someone just saw my prompt history—you could draw crazy assumptions around who you are, what you worry about, or whatever else.”
Christianson told WIRED that ChatGPT’s model is intended to offer help, while also pointing users to professional help and real-world connections.
The company has previously said it is “committed to protecting people’s privacy.” The service does include privacy settings users can tinker with to decide how much of their data is used to train ChatGPT models, but it will inevitably ingest huge amounts of personal data. The people WIRED spoke to have placed a high level of faith in OpenAI with what has essentially become, for them, a high-functioning diary.
Andrew describes his level of trust as having a “hopeful handshake agreement” with ChatGPT.
“It’s probably going to sound super sad boy, but I sometimes find it easier to trust a technology platform,” he says. He’s experienced situations where people in his life turned his private confessions into conversational currency. In comparison, ChatGPT is more of a locked box, a neutral party that has no interest in gossip.
Reservations about using ChatGPT as either a diary or an intermediary seem to be waning. On TikTok, content creators are coaching users on how to use AI to analyze text arguments, get relationship advice, and even automatically reply to dates or break up with them. Kate finds these videos useful for coming up with new prompt ideas.
Recently, she saw a prompt on TikTok in which a user describes their goals and what they’d like to accomplish, then asks ChatGPT to write a detailed story of what that life could be like. “It helps you embody or experience it, live through what that could look like,” she says.
“The amount that I have used it for has continued to grow exponentially,” Kate says. Still, these days she’s trying to use it a little less in her romantic life, just as her human therapist recommended.