More teens are turning to AI for companionship, recent research finds, and experts are concerned
AI chatbots such as ChatGPT have become a routine part of life for many young people, particularly members of Generation Z (also called the iGeneration). According to a May 2025 study by Common Sense Media, 97% of Gen Z have used AI technology, and 52% of US teens aged 13 to 17 use chatbots like ChatGPT at least once a month for social purposes. Teens turn to these AI companions for a variety of reasons: starting conversations, expressing emotions, getting advice, resolving conflicts, romantic interaction, and practicing self-advocacy. The study nonetheless warns against underage use of ChatGPT and similar chatbots, citing the risk of cultivating anti-social behaviors, exposure to age-inappropriate content, and potentially harmful advice.

The chatbots most widely used by this generation are developed primarily by US companies such as OpenAI, Google, Perplexity, and Anthropic. European alternatives exist, with Mistral and Aleph Alpha the notable examples, but they hold a much smaller share of the market.

Despite their growing popularity, AI chatbots have significant drawbacks. They do not understand the 'why' behind a person's thoughts or behavior, which makes them poorly suited for therapy or for advice on personal matters, including medical questions. Sharing personal medical information with a chatbot carries risks of its own: responses can be inaccurate, and the services are not HIPAA compliant. Uploading work documents can likewise expose intellectual property agreements, confidential data, and company secrets, a significant risk in a professional setting.

The study's authors stress that no one under 18 should use AI companions. Encouragingly, 80% of teens still spend more time with real-life friends than with online chatbots, and nearly 40% say they later apply skills learned from chatbot conversations in real-life interactions. Even so, the authors conclude that until developers implement robust age assurance and redesign their platforms to eliminate the risks of relational manipulation and emotional dependency, the potential for harm outweighs any benefits. The findings reflect this: 34% of users reported feeling uncomfortable during conversations with chatbots, citing both the subject matter and the chatbot's emotional responses, and 33% said they prefer AI companions over real people for serious conversations, a sign of growing reliance on AI for support and guidance. As the technology continues to advance, developers will need to address these concerns to ensure the safe and responsible use of AI chatbots, particularly among younger users.