AI in Schools: Tracking Students' Mental Health and Ethical Concerns (2026)

The rise of AI counselors in schools: A controversial solution to a growing crisis. But is it safe?

One evening, Brittani Phillips, a middle school counselor in Putnam County, Florida, received an alert on her phone. The AI-powered therapy platform, used by students outside school hours, had flagged a potential crisis. An eighth-grader was at risk of harming themselves or others.

Phillips sprang into action, spending the evening on the phone with the student's mother, gathering information and assessing the situation. She also contacted the police, knowing that confidentiality is crucial but not absolute in these cases. Thankfully, the student was safe, and Phillips believes the incident strengthened the trust between her and the family.

Interlachen Jr-Sr High School, where Phillips works, is one of many schools facing budget constraints and a shortage of mental health staff. To fill the gap, it has turned to AI platforms like Alongside, an automated student monitoring system. Alongside claims to offer better services than traditional telehealth options, including a chat tool built around a friendly llama character, Kiwi, who helps students build social and emotional skills.

But here's where it gets controversial. While AI is a key part of the national education agenda, parents, educators, and lawmakers have concerns. Some worry about increased screen time for teens, while others question the effectiveness of AI in mental health support. States have even begun restricting AI use in telehealth.

The debate intensifies when considering the emotional attachments students may develop with AI. A recent survey found that 20% of high schoolers have either used AI for a romantic relationship or know someone who has. This has sparked discussions about the need for regulations, such as a proposed federal law requiring AI companies to remind students that chatbots are not real people.

Despite these concerns, Phillips finds the AI tool invaluable for managing minor issues, allowing her to focus on students in crisis. Students often feel more comfortable confiding in AI due to its non-judgmental nature and accessibility.

Mental health professionals like Sarah Caliboso-Soto understand this preference. She explains that speaking with a human therapist can be intimidating for adolescents. AI interfaces, on the other hand, feel familiar to tech-savvy students and provide a sense of anonymity.

However, Caliboso-Soto cautions against relying solely on AI. While it can be a useful first line of defense, especially in resource-limited schools, it lacks the discernment and human connection that clinicians provide. AI may miss subtle cues and behaviors that a human therapist would notice.

Another expert, Charmaraman, agrees that schools should take a holistic approach that involves families and caregivers. She also warns that over-reliance on AI intervention might reduce students' contact with clinically trained professionals.

Alongside representatives defend their platform, stating it's not meant to replace human therapy but to encourage students to seek adult help. However, some students view it as a temporary solution at best.

Sam Hiner, of the Young People's Alliance, raises a critical point: has growing loneliness and the erosion of community life pushed students to seek connection through technology, even AI chatbots? His organization has proposed regulations that would permit therapeutic uses of AI while advocating for rebuilding human community and companionship.

Hiner's main concern is the development of parasocial relationships, where students form one-sided emotional attachments to AI. He suggests that AI should provide feedback and analysis without implying it has emotions, as this encourages unhealthy attachment.

The debate continues as privacy experts point out that AI chatbots lack the privacy protections of licensed therapists. The use of AI in schools raises complex privacy concerns, especially when it involves student data and interactions with authorities.

Phillips and the company emphasize the importance of human oversight in these systems. Phillips believes this AI tool is an improvement over previous monitoring systems, which often led to disciplinary actions instead of mental health support.

Phillips' experience highlights both the challenges and the benefits of AI counseling. While the tool helps manage minor issues and provides a sense of security, it still requires human oversight to interpret teenage humor and confirm that alerts are genuine. The system's effectiveness depends on the balance between AI automation and human supervision.

As schools navigate this complex landscape, the question remains: Can AI counselors truly provide the support students need, or is it a controversial quick fix to a deeper societal issue?

Author: Edmund Hettinger DC