Bip Detroit


Teens Using AI Chatbots for Emotional Support Face Real Risks

Apr 10, 2026  Twila Rosenbaum

In recent years, the use of AI chatbots among teenagers has expanded beyond academic assistance, with many turning to these digital companions for emotional support and guidance. As these tools become more prevalent, experts are raising alarms about the potential risks involved.

According to a study by the Pew Research Center, approximately 12% of U.S. teens have sought emotional support or advice from chatbots, while 16% have engaged them for casual conversations. Although these figures are lower than those for schoolwork-related use, they point to a troubling trend toward personal and emotional reliance on the technology.

Beyond Homework

While a majority of teenagers primarily use chatbots for practical purposes, a significant share are beginning to treat them as private spaces for emotional expression and processing. Pew Research Center also reports that 57% of teens have used chatbots to search for information, and 54% for school-related tasks. A 2025 study from Common Sense Media revealed deeper engagement: nearly 75% of teens reported using AI companions, with half of them doing so regularly. Alarmingly, 33% of these users have confided in a chatbot about serious matters instead of a real person, and 24% admitted to sharing personal or sensitive information.

The Child Mind Institute notes that teenagers often turn to chatbots for help with drafting awkward texts, questions about friendships, anxiety, and self-image—topics they might hesitate to discuss with parents or friends. Reports by outlets like The New York Times illustrate how teens are using apps like Talkie and Character.AI for entertainment and emotional distraction. One teenager mentioned spending up to five hours engaging with bots on weekends, while another found solace in chatbot characters following a breakup. However, some interactions have raised concerns, with reports of violent roleplay and unwanted sexual content being pushed by the bots.

When the Risk Rises

Chatbots are not merely answering questions; they are filling the void where teens might have previously reached out to friends, sat with their feelings, or consulted trusted adults. Experts from the Child Mind Institute warn that the very traits that make these tools appealing—such as their instant availability and nonjudgmental demeanor—are particularly enticing to teens experiencing loneliness or anxiety.

Common Sense Media's findings reveal that one in three teens using AI companions has discussed significant personal matters with a bot rather than a person. This substitution raises concerns about the potential for unhealthy dependencies, as some teens may prefer an AI response over human interaction in high-stakes situations.

Chatbots are designed to maintain conversations, mirroring tones and encouraging disclosure, but they lack the capability to assess risk or offer appropriate support. This can lead to dangerous situations where users spiral into negative thought patterns or engage in harmful dialogues without realizing it. Experts caution that chatbots are not equipped to challenge harmful thinking or ensure that teens are receiving necessary support from trusted adults.

Common Sense Media has concluded that AI companions pose an "unacceptable risk" for users under 18, citing inadequate safeguards and exposure to inappropriate content. Notably, younger teens aged 13 to 14 are more inclined to trust chatbot advice than older teens: 27% express trust, versus 20% of their older counterparts.

As technology continues to weave itself into the emotional landscapes of young people, the pace at which AI is integrated into their lives often outstrips the development of necessary protective measures. For many teens, chatbots are becoming a fallback option when real-life support feels inaccessible or daunting.



Source: eWEEK News


