At a loss for words? Gen Z is outsourcing the hard conversations to AI
2026-03-08 - 13:13
Around 2 a.m. on a Monday, Emily received a text from a fellow student, Patrick, with whom she had gone on a blind date two days earlier. The pair are juniors at Yale University who were set up by mutual friends. They requested anonymity, so CNN agreed to change their names to protect their privacy.

“Hey Emily! I hope your half-marathon went well — I’m sure you crushed it,” Patrick wrote with a winky-face emoji. “Okay, bear with me here — I’m not the best at this kind of thing, but here goes.”

In a six-paragraph text, Patrick said he would like to “hang out more — whether it’s just as friends or whatever it was we were this weekend.” He added that he wasn’t “looking for anything too serious right now.”

At first, Emily didn’t think his reply was anything out of the ordinary. “It just seemed really proper, and I guess I knew that he was a really nice guy. So, I was just like, maybe this is just how he texts.” But after sharing his message with two friends, who ran it through an artificial intelligence detector, she had her answer: “It was like, 99 percent AI.”

She was right. Patrick admitted using ChatGPT to craft his text. He said he didn’t have much experience writing a rejection message: “What do I do here? It’s the first time I had seen anyone since my high school girlfriend, which is why I was so nervous and wanted a second opinion.”

“I tried to write my thoughts down, but I wasn’t sure how to format this in a way that’s not, like, really bad, so then I went to Chat,” he said. He gave ChatGPT the situation, his thoughts and emotions, and “Chat spit out a response.”

Patrick is far from alone. Researchers say a growing number of young people are turning to AI to navigate social situations — drafting rejection texts, decoding mixed signals and scripting difficult conversations.
Experts warn that this habit may be stunting emotional growth, leaving an already isolated generation that came of age during the pandemic even less prepared for the messiness of human connection.

Patrick went back and forth with the chatbot and “tweaked certain lines here and there, but it was mostly copy and paste” from ChatGPT. “I added an emoji and tried to make it sound more human,” he said.

“I felt better putting this out there because I wanted to be very clear and forthcoming. I didn’t want to be wishy-washy with it in case she took it the wrong way. I knew if I did it on my own, I would have been wishy-washy,” said Patrick, who compared the move to consulting an expert.

Emily said she did not find the text clear; if anything, it made his intentions more confusing. She couldn’t tell from the AI wording “if he wanted to be friends or what.”

“My main intention was to be clear in how I was feeling and thinking about the situation,” Patrick said. “Looking back on it, that was pretty poor behavior on my part. I think sitting on it for so long was the reason I went to Chat.”

“I think he was overthinking it,” Emily said. “You definitely don’t need to use AI; you’re an emotionally sane guy.”

She described the interaction as weird but said many of her friends have also turned to artificial intelligence to draft texts to friends or partners, or to analyze social situations — sometimes pasting entire text chains into a chatbot to decipher what someone might be thinking.

“The thought of my little brother using AI to break up with his girlfriend is concerning. Because right now he comes to me, but when’s the day he’s going to turn to AI instead?” She said she worries that Gen Zers have trouble “confronting their own feelings.”

Emily said she’s also concerned about her generation’s ability to socialize, and some experts agree.

It’s called ‘social offloading’

Emily’s experience is part of a broader pattern that concerns researchers. Dr. Michael Robb, head of research at Common Sense Media, calls it “social offloading”: using AI to navigate interpersonal situations. And he said it isn’t limited to Generation Z; he has observed it among Gen Alpha (born between 2010 and 2024) and some millennials (born between 1981 and 1996) as well.

One-third of teens already prefer AI companions over humans for serious conversations, according to a 2025 survey conducted by Common Sense Media, a nonprofit organization that helps families navigate age-appropriate media choices.

“If you’re using AI to draft your messages to friends or romantic partners, you’re outsourcing the communicative act itself,” Robb said.

The problem is twofold, he noted. First, it creates an “expectation mismatch,” since the recipient is “responding to an AI-polished version of their friend and not the actual person.” Second, repeated use can erode users’ confidence in their own voices, preventing young adults from developing essential skills, such as reading social intent, inferring others’ emotions and tolerating ambiguity in social interactions.

“It has implications for your sense of self, advocacy and identity formation,” which are central to social development, Robb said. “If every tricky or difficult text is mediated by the AI, it may instill the belief in users that their own words and instincts are never good enough.”

Dr. Michelle DiBlasi, a psychiatrist and assistant professor at Tufts University School of Medicine, has observed the same trend. “I have seen young people, late teens, early 20s, using AI to socialize, and oftentimes they’re using it as a way to overcompensate for the fact that they don’t really know how to truly interact with others,” she said. “We’re social beings, and a lot of our feelings of self-worth and connection are really related to our interactions with others.”

DiBlasi said that using AI in social interactions stunts emotional growth and can perpetuate feelings of loneliness and isolation.
It can also limit people’s ability to pick up social cues, repair relationships and connect with others.

The pandemic’s impact on connection

Why is Gen Z struggling with socialization? Researchers point to a combination of digital culture and the pandemic. Russell Fulmer, an associate professor at Kansas State University who studies AI and behavioral sciences, said the two forces created the “perfect storm” for AI to be integrated into social interaction.

Adolescence — roughly ages 10 to 19, according to the World Health Organization — is the critical window for developing confidence, a stable sense of identity and emotional regulation. If adolescents don’t fully develop their social skills during this window, Fulmer said, they may be “more prone to lack confidence, more apt to escapism or avoidance and maybe there’s a lack of resiliency.”

DiBlasi said the pandemic hit Gen Z at a particularly vulnerable moment. “When it happened, they were in the stages where the frontal lobe of their brain was starting to form,” she said. Typically, that’s when adolescents learn to build relationships, pick up social cues and develop mentalization — “the ability to understand somebody else’s mental state or what they’re thinking and how they’re feeling.”

DiBlasi said that this lack of interaction leads to “a deep sense of isolation, feeling like others don’t understand them, or that they don’t understand others,” which drives many toward AI for companionship. But Fulmer warns that chatbots can create a “loneliness loop,” offering an “appearance of connection” that ultimately feels unfulfilling and can deepen isolation.

In the most serious cases, DiBlasi has seen patients experiencing suicidal thoughts turn to AI to help articulate what they’re feeling when they can’t find the words to tell others.
“I think this can be really, really detrimental, because it’s important for people to express some of these emotions in a very honest way with family or friends, so that they can actually work through this in an authentic way,” she said.

It’s not too late to change course

Although some Gen Zers may have missed a prime window for developing social skills, DiBlasi emphasized that it is not too late for them to learn. She encourages people to reach out to friends and family rather than AI when they struggle to express difficult emotions.

“These things are skills that, just like anything with practice, can actually improve,” DiBlasi said. “I understand that people are fearful or they may not want to say the wrong thing. But I really think it takes away any sort of understanding of what you’re actually truly feeling and takes away the connection and the repair that you need to make in these relationships.”

Artificial intelligence is a poor substitute for the messiness of real human interaction, experts say, and that messiness is the point. “Relationships and conversations can be messy and probably should be messy, and that’s part of what makes you more socially competent in the long run,” Robb said. AI companions are “designed to be very validating and agreeable,” he noted, so their feedback doesn’t reflect the friction of how people actually respond in real relationships.

AI users shouldn’t expect an objective read on social situations either, Fulmer added. “Social contexts are often not entirely objective,” he said. “They’re contextual, they’re relational, and therefore nuanced.” As confident as a chatbot may sound, he said, it’s searching for a through line in something that may not have one.

For parents, Robb recommended watching for warning signs, including social withdrawal, declining grades or a growing preference for AI over human interaction.
They can respond with low-pressure check-ins, such as asking what their children use AI for, how it makes them feel and what they think they get out of it. The goal is to get kids thinking critically about what AI does well and where it falls short, said Robb, who suggested that families consider limits on AI use similar to screen time rules.