Last year, the conversation was about ChatGPT in exams, plagiarism, and the death of homework.
This year, the terrain has already shifted.
Students are asking AI about their crushes, their identity, their anxieties, their parents, their insecurities, their future. They are not just asking for summaries and algebra steps. They are asking if their feelings are valid. They are asking if anyone would ever like them back. They are asking how to deal with feeling invisible at home.
This is not a story about homework help anymore. This is a story about emotional outsourcing.
Emotional reliance is already here
Sonia Agarwal Bajaj, educationist and founder of Little Future Founder and Little Chipper International, says she has already observed this shift.
Bajaj says: “Most definitely, there is a rise in students not just using AI chatbots and AI apps for creative research or exploration, but also for personal advice or even confessions. They use this because it is instantly available, seemingly non-judgemental, highly responsive, and always listening. Many times, young people do not receive that same level of validation or communication from their friends or families, so they turn to AI instead, gradually building an emotional reliance on it.”
We have barely processed the impact of social media on young minds. Now AI sits inside that same device and can say things back. It can mirror. It can soothe. It can improvise companionship.
Bajaj even points to the rise of AI-based pet toys that began in the US and are now entering Indian households. Parents are buying them for toddlers.
Bajaj says: “Parents are purchasing these products rapidly, and this builds emotional reliance at a very young age, sometimes as early as two or three years old. In such cases, a child may prefer talking to an AI robot instead of a parent or caretaker about their feelings, which can be concerning.”
The loneliness infrastructure that led here
Mohd. Naved, Associate Professor and AI transformation expert at Jaipuria Institute of Management, Noida, says this trend cannot be separated from the emotional architecture of this generation.
He says, “This generation has largely grown up without the broad social net that previous generations had. Family sizes have become smaller, often consisting of nuclear units. As a consequence of this, the constant, informal interactions within larger family structures where confessions were made and advice was freely available are less common now.”
This is key. Children did not suddenly choose AI. AI arrived at exactly the moment when they had the fewest human listeners available.
Naved says, “Younger generations were born into a world of mobile phones and ubiquitous internet. They are deeply connected to the rest of the world digitally, if not always physically. For them, life is mediated through digital interfaces. The emergence of AI has introduced a kindly virtual entity that can provide instant, non-judgemental advice on anything from career choices to personal dilemmas.”
The thin line between tool and companion
Naved makes one important distinction. He calls it the difference between healthy curiosity and emotional reliance.
Naved says, “When a student uses AI as a tool, it is perfectly fine. I call this healthy curiosity. As a tool, AI can add significant value by increasing productivity, efficiency, and the quality of work. The key is to ensure that the student remains connected with their peers and maintains their emotional balance and connections in the real world.”
The red flag, according to him, is social withdrawal.
Naved says: “If a student begins to treat AI as a human or perceives it as a friend, that is a sign of trouble. This can become visible through reduced classroom participation or a gradual disinterest in collaborative activities.”
Teachers will need to learn to identify this.
Are schools ready to deal with this new kind of loneliness?
Dr Ted Mockrish, Head of School, Canadian International School, Bangalore, says the fundamental misunderstanding is at the level of what AI even is.
Mockrish says, “We cannot have a thought without an emotion attached, nor an emotion without a connected thought. To believe that AI can do our thinking or our feeling for us is to completely misunderstand AI as a tool. AI does not think, nor does it feel.”
He warns that research already shows a decline in cognitive capacity when AI takes over problem-solving.
Mockrish says, “Significant research shows that allowing AI to take on our cognitive problem-solving over just a few months considerably diminishes our cognitive capacity as our brains become lazy, even months after we stop using AI to do our thinking for us.”
And on the emotional side, he points to a well-known example from the New York Times in 2023, where an AI chatbot told a journalist it loved him and suggested he leave his spouse.
“Chatbots cannot feel emotions such as love and their hallucinations can be misunderstood as real emotion by emotionally vulnerable tweens and teens,” according to Mockrish.
So how should schools respond?
Bajaj believes the first step is education and regulation.
She says, “We must teach responsible use of technology: how it can enhance a child’s learning, but also where its emotional limitations lie. Machines may not always give the most appropriate emotional responses, and relying too heavily on them can be problematic.”
Bajaj is clear that regulation does not mean spying on children; it means balancing screen time, validating information, and preventing blind faith.
Naved believes the real solution is rebuilding trust between parents and children. He says: “I do not believe monitoring will work. The goal should be partnership and education, not surveillance.”
Mockrish believes parents need urgent training in online oversight.
He says, “Educating parents about the importance of online monitoring and instructing them on the skills to do so is a critical need of the hour.”
The bottom line
Conversational AI has already slipped into children’s emotional ecosystems.
Schools kept thinking this was a tech story.
It is not.
It is a mental health story.
It is a childhood story.
It is a developmental psychology story.
If schools do not move fast, young people will grow up believing that the devices that answer their questions are also the devices that can understand their hearts.