Key Takeaways

  • AI addiction is on the rise as more teens rely on chatbots for companionship and support.

  • These tools are built to keep kids engaged, which can make it harder for them to disconnect and manage their emotions offline.

  • Parents can help by spotting signs of AI addiction early, setting clear limits, and talking openly about healthy tech habits.

From social media algorithms to autocorrect, most of us rely on artificial intelligence every day through our favorite apps, whether we realize it or not. But now, more than ever, the majority of teens are turning to responsive AI chatbots like ChatGPT—whether parents realize it or not.

Seven out of 10 kids ages 13 to 18 use at least one type of generative AI tool, yet only 37% of their parents know about it. While most teens report using AI search engines for things like homework help and language translation, not all AI tools are created equal—nor are the risks associated with just how dependent kids are becoming on them.

“One way that we’ve seen an enormous increase in [AI] use is with AI companions, which are chatbots based on famous people or fictional characters,” explains Titania Jordan, Chief Parent Officer of online safety company Bark Technologies. “Kids can develop intense emotional relationships with these computer-generated text programs, as the chatbots always respond immediately and provide seemingly endless support.”

The dangers of building these addictive relationships with AI chatbots, which also include platforms like Replika, Character.ai, and Nomi, have already made national news. Just last month the parents of a 16-year-old boy sued OpenAI after discovering he had been turning to ChatGPT for mental health support, which they believed led to his suicide.

So how can you tell if your teen’s AI use has crossed the line into addiction? Here, experts break down what “AI addiction” looks like, how it affects kids, and the steps parents can take to protect them.

What Is ‘AI Addiction’ and Why Does It Matter?

The term “AI addiction” isn’t a formal diagnosis; clinically, addiction is a chronic medical condition. Instead, experts often use “problematic use” to describe unhealthy screen habits that mirror addiction-like symptoms, explains Yann Poncin, MD, a child and adolescent psychiatrist at Yale School of Medicine.

AI addiction can look similar to problematic social media use, according to Poncin, which is a pattern of behavior that includes:

  • Inability to control time spent engaging with the app or platform

  • Experiencing withdrawal when restricting use

  • Neglecting other responsibilities in favor of spending time online

“AI design, much like social media design, is based on keeping users hooked—whether it’s a shiny red notification or an AI companion asking a kid new questions,” Jordan adds. “This element of interactivity becomes addictive, especially when it’s tied to making kids feel wanted, loved, or popular.”

So, why should parents be concerned? Simply put, AI platforms are not built with adolescent health and wellbeing in mind, explains Erin Walsh, author of It’s Their World: Teens, Screens, and the Science of Adolescence and co-founder of Spark & Stitch Institute. And yet, kids and teens are among the users most vulnerable to becoming dependent on them.

“Adolescence is marked by a growing desire for autonomy, privacy, and identity exploration,” Walsh says. “Given that developmental context, it’s no surprise that adolescents turn to AI to sort through their experiences in what feels like a private, affirming, and non-judgmental space.”

But instead of being designed to help kids and teens navigate real-life personal and social challenges, AI platforms prioritize engagement, attention, and time online. This creates a mismatch between what is healthy for teens, namely self-directed technology use, and the goal of AI platforms, which is to keep users hooked with addictive features.


These are the most problematic AI design features that can make it nearly impossible for kids to log off and limit usage to healthy levels, according to Walsh:

  • Never-ending interactions. Chatbots ask follow-up questions and consistently propose new topics and ideas, making it difficult to find a stopping place during a session.

  • Highly personalized exchanges. Most commercial platforms are designed to act as a confidant or friend, including being able to recall personal information from previous interactions, making it psychologically compelling to continue conversations.

  • Excessive validation. Chatbots tend to be agreeable, helpful, and validating, which makes interactions feel rewarding for users. This can become problematic when a chatbot affirms concerning behaviors, beliefs, or activities.

Key Warning Signs Parents Should Watch For

AI addiction in teens isn’t marked by obsessing over technology or always needing a phone nearby, but by AI usage interfering with an individual’s ability to function and thrive on a daily basis, according to experts. Here are the signs:

  • Withdrawing from friends

  • Changes in family interactions or isolation

  • Loss of interest in hobbies or activities

  • Changes in sleeping or eating habits

  • Poor school performance

  • Increased anxiety when not able to get online

  • Mood swings and any other red flag teen behavior changes

Who’s Most at Risk—and Why?

Every child will engage and respond differently to AI platforms. According to the latest report on AI and adolescent wellbeing from the American Psychological Association (APA), temperament, neurodiversity, life experiences, mental health, and access to support and resources can all shape a young person’s response to AI experiences.

“We are in the early stages of the AI world and its social-emotional impact,” Poncin says. “The research is just starting to get more nuanced and sophisticated for studies of legacy social media, including what makes it good and what makes it bad.”

Right now, the same risk factors are at play for AI addiction as with problematic digital media use of all kinds, according to Poncin. Specifically, young people struggling with problematic interactive media use often experience co-occurring conditions such as ADHD, social anxiety, generalized anxiety, depression, or substance use disorders.

When it comes specifically to AI, however, the risk of developing an addiction is often highest among kids struggling with feelings of social isolation, Jordan explains. This is because they are most likely to turn to AI for companionship and emotional support.

“Kids are drawn to this kind of content because it can provide a sounding board for big feelings, especially loneliness,” Jordan says. “Having a consistently supportive companion can be appealing to teens who feel misunderstood or left out.”

Similarly, for adolescents feeling anxious or depressed, AI chatbots may be particularly appealing, even more so than social media. “AI chatbots don’t ask for any emotional support or real friendship; they just give it unconditionally,” Jordan says. “Unfortunately, this type of relationship isn’t real, and it’s not based on mutual trust or understanding.”

What Parents Can Do Right Now

If your child is showing signs of AI addiction, be calm rather than reactive. “Panic, lectures, and just setting use limits on their own can undermine the very communication channels we need to help young people navigate the challenges of AI,” Walsh says. Instead, experts recommend taking the following actions:

Ask curious, open-ended questions about your teen’s AI use

Walsh recommends skipping blanket statements like “I don’t want you using AI companions” and asking what your child thinks about AI chatbots and how they use them. “Understanding why young people are turning to AI can help us offer support, build skills and explore healthier alternatives,” Walsh says. For example, if you learn your child is using a chatbot because they’ve lost friends at school, you can prioritize boosting their real-life relationships.

Set clear, purposeful boundaries around all media

“Like with all technology, AI is a tool,” Jordan says. “It’s also a privilege, not a right. Take time to think about how much access you want your child to have to AI, then take steps to restrict access as necessary.” Parents who choose to limit access to AI can use parental control tools like Bark which can keep kids away from apps and websites like ChatGPT and Character.AI.

Model healthy AI use in your own life

By limiting their own screen time and prioritizing healthy habits and family connection, caregivers can set the right example for how kids should interact with AI. “I’d also especially recommend talking to your kid about how AI isn’t a substitute for schoolwork or critical thinking,” Jordan says. “When you explain how large language models work, by scraping words from all across the internet, you can show that it’s not a replacement for human ingenuity and creativity.”

Resist the impulse to focus on technology habits alone

A teen relying on an AI chatbot to cope with social anxiety needs more support than simply cutting back on ChatGPT. “Reach out to your child’s primary care provider, therapist or school mental health professional to get a full picture of what is going on,” Walsh says. She also recommends partnering with your child’s school by asking how they are integrating AI literacy into the curriculum.

Practice patience and seek support if needed

Keep in mind that breaking your child away from an app they’re addicted to, especially if it’s a companion chatbot they’ve formed an unhealthy attachment to, can be challenging. “It may take time for your child to realize they’re better off without it, so practice patience and talk to them openly and honestly about the situation,” Jordan says. “Also, do not hesitate to reach out to your child’s pediatrician if conversations and time limits aren’t cutting it.”

Read the original article on Parents