#SocialMediaResponsibly: The AI emotional crutch and how schools are fighting tech-mediated loneliness

Education experts are increasingly flagging the emotional turn in how young users interact with AI.

By Indrani Bose | Oct 16, 2025 9:15 AM

Note to readers:

In an always-on world, the way we use social media shapes not just our lives but the safety and wellbeing of those around us. 'Social Media, Responsibly' is our commitment to raising awareness about the risks of online recklessness, from dangerous viral trends to the unseen mental and behavioural impacts on young people. Through stories, conversations and expert insights, we aim to empower individuals to think before they post, to pause before they share, and to remember that no moment of online validation is worth risking safety. But the responsibility does not lie with individuals alone. Social media platforms must also be accountable for the environments they create, ensuring their tools and algorithms prioritise user safety over virality and profit. It’s time to build a culture where being social means being responsible - together.

“Students today talk to AI not just for homework, but when they feel unsure or anxious,” says Bob Chopra, the nine-year-old Founder and CEO of IvySchool.ai, a platform that teaches Computer Science and Entrepreneurship through AI-based learning inspired by global institutions like Harvard, MIT, Stanford, Wharton, and Duke.

At IvySchool.ai, Chopra says their in-house chatbot BobAI is designed not just to guide learning but to foster emotional growth responsibly. “When emotions come up, BobAI encourages students to fall back on the real Bob — their teachers and mentors for genuine connection. We also offer free yoga and spiritual wellness classes so students can manage stress offline. Our goal is to help kids grow smarter and calmer — using AI as a guide, not a crutch.”

That balance between curiosity and emotional reliance is becoming one of the biggest challenges for educators in the AI era. “Curiosity sounds like ‘How do I code this?’, while emotional reliance sounds like ‘I can’t think without BobAI.’ When we notice the latter, the system suggests reflection breaks or alerts a teacher,” Bob adds.

Emotional Reliance and Tech-Mediated Loneliness

Fatema Agarkar, educationist and advisor to the Finland International School Board, calls this emotional turn in how young users interact with AI “an alarming trend.”

“Children who are too afraid to consult adults or peers for fear of being called out often resort to AI. It feels safe, less threatening, and less embarrassing — and that’s precisely where the problem begins,” says Agarkar.

She believes the solution lies in deep teacher engagement. “A mature and skilled teacher will always observe patterns and maintain close communication with parents and welfare teams. When curiosity turns into dependence, it’s usually visible in participation, writing, or even subtle behavior changes. But spotting that requires effort, empathy, and constant dialogue.”

From Digital Literacy to Emotional Literacy

Jyothi Malhotra, Principal at The Somaiya School, adds that it’s not just children — “even adults are turning to AI for emotional support.” The difference, she warns, is that adults may have the maturity to filter AI advice, while students might mistake generic empathy for genuine understanding.

“When independent thinking begins to fade and students start doubting their own ideas, that’s when curiosity has morphed into emotional reliance,” she says. “Educators must stay alert to this shift — it’s visible in reduced ideation and confidence.”

Building Guardrails Through Families and Teachers

The consensus across educators is that schools must actively involve parents — not through surveillance, but collaboration.

Agarkar suggests “essential agreements” and ongoing conversations: “Schools, homes, and students need to be on the same page. The goal is shared boundaries, not control. Workshops, podcasts, or expert-led sessions can help families understand evolving risks and best practices.”

Chopra echoes this through IvySchool.ai’s parental insights dashboard — which provides learning trends without exposing private conversations. “Families should see AI learning as a shared experience, not a secret one,” he says.

Training Teachers to Read the Digital Pulse

The next frontier, say educators, is preparing teachers for this new kind of tech-mediated loneliness. “Teachers now need both AI fluency and emotional sensitivity,” says Bob. “They must know when to let AI guide and when the real Bob needs to step in.”

Agarkar highlights programs like Switch4Schools, which help track students’ emotional shifts through data. “Tech can actually support emotional awareness — it’s about training teachers to interpret signals and act in time.”

Malhotra agrees. “Educators must understand not only what AI can do but also how it shapes perception and self-worth. Knowing how algorithms behave and where bias creeps in can help teachers intervene early and protect student well-being.”

The Way Forward

As Niru Agarwal, Managing Trustee of Greenwood High International School, sums it up:

“Conversational AI is now part of students’ emotional landscape — sometimes as a study buddy, sometimes as a substitute for human connection. Schools must treat this shift as both a mental-health and safety concern. Teaching digital literacy, emotional resilience, and empathy-driven AI use is no longer optional.”

The classroom of the future will be shaped not by how well students use AI, but by how wisely they learn to turn back to people.

What You Can Do: Practical Steps for Safer AI Use

For Parents and Families

Open conversations early: Talk about self-worth, validation, and peer pressure before children start using social media, AI tools and chatbots.

Model digital responsibility: Kids copy what they see—show them mindful sharing in your own habits.

Set healthy boundaries: Delay access to platforms where possible, and encourage offline hobbies that build confidence.

Be approachable, not just watchful: Kids are more likely to share concerns if they don’t fear punishment.

For Schools and Educators

Integrate digital literacy: Teach students how algorithms work and how online behavior shapes mental health.

Create alternative stages: Give students offline platforms—debates, art exhibitions, performances—where they can shine.

Collaborate with experts: Invite mental health professionals and cyber psychologists for awareness workshops.

For Platforms and Policymakers

Age-gating enforcement: Ensure stronger checks on underage users.

Awareness and transparent reporting: Run regular awareness campaigns about harmful online behaviours, publish transparent reports on the impact and removal of harmful content, and disclose how AI systems and algorithms are being adjusted.

Support cultural shifts: Highlight and reward safe, creative challenges to reset what healthy online behaviour and AI use look like.

Social media doesn’t have to be a stage for reckless validation. By combining personal responsibility, family guidance, institutional education and platform accountability, we can build a culture where being seen online doesn’t come at the cost of being safe offline.

First Published on Oct 16, 2025 9:05 AM
