AI Counselors in Schools: Tracking Mental Health or Invading Privacy? (2026)

The future of mental health support in schools is here, and it's raising some big questions.

In a world where AI is becoming increasingly integrated into our lives, schools are now turning to artificial intelligence to monitor and support students' mental well-being. But is this a step too far, or a necessary innovation to address the growing mental health crisis among young people?

A New Approach to an Old Problem

Meet Brittani Phillips, a middle school counselor in Putnam County, Florida. She's on the front lines of this new AI-assisted approach to student support. Phillips receives alerts from an AI-powered therapy platform, which students use outside of school hours. The platform flags potential risks of self-harm or harm to others based on students' chat messages.

One evening, Phillips received a severe alert about an eighth-grader. She spent the night on the phone with the student's mom, trying to understand the situation and assess the risk. She also called the police, a step she takes when confidentiality can no longer be maintained.

This incident, which occurred last school year, ended positively. The student is now in ninth grade and, according to Phillips, the interaction built trust between her and the family. The student even greets Phillips when they pass in the halls.

Phillips' school, Interlachen Jr-Sr High, turned to an AI platform to address budget shortfalls and a shortage of mental health staff. The school has used Alongside, an automated student-monitoring system, for three years. It's part of a growing trend: at least nine companies have secured funding deals since 2022 to provide similar tools to K-12 schools.

Alongside claims its platform is used by over 200 schools across the US and offers better services than typical telehealth options. It provides a social and emotional skill-building chat tool where students can talk about their problems with Kiwi, an AI llama, and build resilience along the way. The company says clinicians monitor the content the AI generates, and that the platform brings critical mental health resources to schools that lack them, especially in rural areas.

Controversy and Concern

But here's where it gets controversial. While AI in education is a key component of the Trump administration's national education agenda, it's not without its critics. Parents, educators, and lawmakers are wary of increasing screen time for teens. States have even started restricting the use of AI in telehealth.

Many experts and families worry about students becoming too attached to AI. A recent national survey found that 20% of high schoolers have turned to AI for romance or know someone who has. This has led to proposals for federal laws that would require AI companies to remind students that chatbots are not real people.

Despite these concerns, Phillips believes the tool her school uses is exceptional at handling smaller issues. With around 360 middle schoolers to support, the tool allows her to focus her time on students facing more severe crises. Students often find it easier to turn to AI for emotional support, she says.

The Comfort of AI

School counselors attribute students' comfort with confiding in AI to two things: nervousness about approaching adults and the familiarity of chat interfaces. Speaking with a mental health professional can be intimidating, especially for adolescents, says Sarah Caliboso-Soto, a licensed clinical social worker.

There's also a generational aspect. Students who have grown up with chat interfaces on social media and websites find AI interfaces more familiar. Texting is often easier than calling for today's youth, says Linda Charmaraman, director of the Youth, Media & Wellbeing Research Lab.

Using AI also lets students avoid reading facial expressions they may worry will carry judgment. Chatbots are available at all hours without the need for an appointment, Charmaraman adds.

A First Line of Defense?

Caliboso-Soto believes AI can be used as a first line of defense in resource-strapped schools. It can regularly check in with students and direct them to the right help when needed. However, she warns that AI should not replace human counselors. It lacks the discernment and judgment that human clinicians provide.

While large language models can be trained to notice symptoms in text, they cannot see or hear the nuances that a human can. AI cannot reliably catch subtle observations or behaviors, and it cannot replace human connection, Caliboso-Soto emphasizes.

Charmaraman agrees, saying that while AI can speed up the diagnostic process and free up time for school counselors, it's crucial not to over-rely on it for mental health support. The technology can miss important nuances and give students unrealistic positive reinforcement.

Schools need to adopt a holistic approach that includes families and caregivers, she argues. It's important to pay attention to whether students are having less frequent contact with clinically trained humans as AI intervention becomes more common.

A Stepping Stone or a Crutch?

Alongside representatives argue that their platform is not meant to replace human therapy. Ava Shropshire, a youth adviser for Alongside, says the app is a stepping stone to seeking help from adults. It makes mental health and social-emotional learning feel more normal for students and can lead them to seek human help.

However, some students and organizations, like the Young People's Alliance, view AI as a Band-Aid solution. Sam Hiner, the executive director of Young People's Alliance, argues that technology and social media have manipulated and isolated students, leading to a deep yearning for community and belonging.

Students will seek connection wherever they can, even through ChatGPT, Hiner says. The organization has released a framework for regulating AI that allows some therapeutic uses, but its broader aim is to rebuild human community, and it opposes AI replacing human companionship.

Hiner's main concern is the development of parasocial relationships, where students develop one-sided emotional attachments to AI. While AI can provide feedback and analysis, even in mental health contexts, it should not hint at having its own emotional state, as this encourages attachment.

Hiner believes that platforms often claim to decrease loneliness but fail to measure long-term fulfillment and connection. What advocates want to prevent, he says, is these bots eroding social skills and pulling people away from relationships with others, where they have social accountability.

Privacy and Oversight

Privacy experts note that these chatbots do not carry the same privacy protections as conversations with licensed therapists. With worries about student privacy and police involvement already high, the use of these tools raises messy questions, even when clinically trained staff supervise them.

Both the company and Phillips emphasize that human oversight is essential for these systems to work effectively. Phillips feels the tool is an improvement over other monitoring tools, which often led to in-school discipline rather than mental health support.

As of February this school year, Phillips had logged 19 severe alerts from the AI health tool among 393 active users; some students have triggered more than one, she notes.

Phillips has also learned that it takes a human to perceive teenage humor. Middle school students, usually boys, will sometimes test the boundaries of the technology, typing comments like "my uncle touches me" or "my mom beat me with a pole" to see if anyone will follow up.

These boys are testing to see if anyone is listening and if anyone cares, Phillips says. When she discusses these incidents with the students, she can observe their body language and determine if the comment was real or a joke. If it was a joke, they often become apologetic.

Phillips believes that by following up on these interactions, she shows students that someone is actively monitoring the system, and they learn to trust that she is. The number of boys testing the system this way decreases each year.

Final Thoughts

The use of AI counselors in schools is a complex issue, raising questions about privacy, attachment, and the role of human connection in mental health support. While AI may offer some benefits, it's clear that human oversight and a holistic approach are crucial. As we navigate this new frontier, the balance between innovation and human connection will be key.

What are your thoughts on this new approach to student mental health support? Is it a necessary step forward or a dangerous path to tread?
