Note to readers:
In an always-on world, the way we use social media shapes not just our lives but the safety and wellbeing of those around us. 'Social Media, Responsibly' is our commitment to raising awareness about the risks of online recklessness, from dangerous viral trends to the unseen mental and behavioural impacts on young people. Through stories, conversations and expert insights, we aim to empower individuals to think before they post, to pause before they share, and to remember that no moment of online validation is worth risking safety. But the responsibility does not lie with individuals alone. Social media platforms must also be accountable for the environments they create, ensuring their tools and algorithms prioritise user safety over virality and profit. It’s time to build a culture where being social means being responsible - together.
When a teenager dangles from a moving train for the perfect selfie or balances on a high-rise ledge to film a reel, the act is rarely about thrill alone. At its core lies something far deeper: a search for validation, an insatiable need to be seen, admired, and remembered in an ever-scrolling digital universe.
Dr Harish Shetty, a mental health advocate, describes today’s world as a “huge stage colored with anonymity.” Every individual, especially teens, wants to be under the arclights, either naturally or through the artificial glow of selfies. “The selfie is a proclamation that I exist. Look at me, I am good, great, adorable and warm,” he says. In this quest, external approval becomes synonymous with self-worth.
What makes it particularly dangerous is the transformation of a simple selfie into a test of courage and superiority. For many young people, a risky picture or video is not mere documentation; it is a way to outshine peers and fill emotional voids. Dangerous backdrops, filters, and lighting infuse these images with what Dr Shetty calls “artificial self-worth.” The adrenaline rush that comes with likes and shares, compounded by dopamine kicks, makes them feel invincible. “The predominant feeling is: I can do it that others can’t. I outshine everyone. I am the best.”
From Selfies to Stunts: The Dopamine Trap
Nirali Bhatia, a cyber psychologist, notes that the phenomenon has shifted from thrill-seeking to identity-seeking. Social media is no longer just a pastime; it has become an extension of personality. Visibility is equated with validation and, increasingly, with status. Going viral is perceived as a measure of success, even if it comes at the cost of safety.
At a neurological level, this is reinforced by the brain’s reward system. Validation-seeking drives the impulse to be noticed. Peer pressure makes risky acts appear acceptable. And dopamine, released each time a like or view is received, creates a reinforcement loop. Bhatia points out that reckless behaviors, whether dangerous stunts or edgy, sexualized content, tend to attract more attention. Algorithms reward that attention by amplifying it, turning it into a cycle that reshapes young people’s sense of normalcy. “If your feed is full of such videos, reckless acts are seen as aspirational rather than dangerous,” she says.
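To make that loop concrete, here is a deliberately simplified Python sketch of an engagement-only ranker. Everything in it, the post names, the numbers, the scoring rule, is invented for illustration; no platform publishes its ranking code, and this is not any company's actual algorithm.

```python
# Toy model of an engagement-driven feed: whatever draws reactions gets
# ranked higher, which earns it more reactions in turn.
# All post names and numbers are invented for illustration only.
import random

feed = [
    {"id": "rooftop_stunt", "risky": True,  "engagement": 10},
    {"id": "art_tutorial",  "risky": False, "engagement": 10},
    {"id": "comedy_skit",   "risky": False, "engagement": 10},
]

def rank(posts):
    # Pure engagement ranking, with no safety signal at all.
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

random.seed(1)
for step in range(5):
    top = rank(feed)[0]
    # Risky content tends to provoke stronger reactions, so once it
    # surfaces it accumulates engagement faster: the reinforcement loop.
    top["engagement"] += random.randint(6, 10) if top["risky"] else random.randint(1, 5)
    print(step, [(p["id"], p["engagement"]) for p in rank(feed)])
```

Run it and the stunt post, once surfaced, stays on top and pulls further ahead with every cycle, which is the "normalcy-reshaping" dynamic Bhatia describes.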
The Legal Framework
The law is catching up, though unevenly. Under the Bharatiya Nyaya Sanhita (BNS), rash and negligent acts that endanger human life, including dangerous stunts performed for content, fall under multiple provisions: S281 penalises rash driving or riding on a public way, while S125 covers rash or negligent acts that endanger life or personal safety, with stiffer punishment where hurt or grievous hurt results. Stunts that cause danger or obstruction in a public way attract S285. Moreover, if influencers encourage followers to imitate them, the BNS’s abetment provisions (S45 onwards) can extend liability to them.
Prashant Mali, an advocate, explains that the cycle of likes and highs also carries legal risk. “The likes add to the noradrenaline high, and a failure to get that forces them to walk on the feather-thin edge again and again till they get the huge highs. The feedback loops reinforce it,” he says.
On the platform side, the IT Act, 2000 provides intermediaries with safe harbour, shielding them from liability unless they knowingly host unlawful content or ignore takedown notices. The IT Rules, 2021 go further: intermediaries are obligated not to promote content endangering life, while Significant Social Media Intermediaries like YouTube and Instagram are required to ensure algorithmic transparency and risk mitigation. Courts too have stepped in, as in the Madras High Court’s interim order banning TikTok downloads after a spate of dangerous content cases.
Platforms Under Scrutiny
The role of social media companies is increasingly under the spotlight. Algorithms thrive on engagement, and risky content often drives it. Bhatia points to a landmark UK inquest into a teenager’s suicide, in which Pinterest was found to have repeatedly surfaced self-harm content to the girl; the coroner concluded that such material contributed to her death. “If risky content is getting engagement, platforms should not amplify it further. That kind of accountability is essential,” she stresses.
Platforms can intervene by reshaping what validation looks like. For instance, younger users could be nudged toward creative challenges, humor, or talent-based content rather than dangerous acts. Algorithmic guardrails could be designed to deprioritize harmful trends while still allowing visibility for safer forms of expression.
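As a hedged sketch of what such a guardrail could look like, the toy ranker above can be extended with a penalty for content flagged as dangerous. Again, the field names, the flagging signal, and the penalty weight are hypothetical assumptions, not a description of any real platform's system.

```python
# Hypothetical "algorithmic guardrail": posts flagged as dangerous keep
# only a fraction of their engagement score, so they are deprioritized
# rather than removed outright. All names and weights are invented.
RISK_PENALTY = 0.2  # flagged posts retain 20% of their raw score

def guarded_score(post):
    score = post["engagement"]
    if post["flagged_dangerous"]:
        score *= RISK_PENALTY
    return score

def rank_with_guardrails(posts):
    return sorted(posts, key=guarded_score, reverse=True)

feed = [
    {"id": "rooftop_stunt", "flagged_dangerous": True,  "engagement": 900},
    {"id": "talent_show",   "flagged_dangerous": False, "engagement": 400},
]
print([p["id"] for p in rank_with_guardrails(feed)])
# -> ['talent_show', 'rooftop_stunt']: the safer post now outranks the stunt
#    even though the stunt drew more than twice the raw engagement.
```

The design choice here is deprioritisation rather than deletion: safer forms of expression stay visible, while the reward loop feeding dangerous trends is dampened at its source.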
Schools, Families, and Collective Action
Yet, technology is only part of the solution. Experts highlight the absence of internal grounding. Dr Shetty laments that schools and families rarely introduce children to “the internal journey of peace and harmony.” Instead, self-worth is externalized, hinging on likes, shares, and peer comparisons. Without inner resilience, the dopamine-fuelled cycle of validation becomes even harder to resist.
Bhatia emphasizes that unless society responds collectively, the trend will continue. She advocates for stronger interventions, including delaying access to social media until the age of 18, or ideally 21, when the brain’s prefrontal cortex is more fully developed. “Social media at an earlier age creates addictive patterns, social isolation, and other challenges,” she warns.
Preventive education is another missing piece. Awareness campaigns in schools can help children question online trends before peer pressure peaks. Parents, too, need to shift from monitoring alone to actively talking about self-worth, risk, and the illusion of online popularity.
A Cultural Shift Needed
At the heart of this crisis lies the cultural obsession with visibility. For many young people, the value of a moment is measured by how it plays online, not by how it feels offline. Dangerous reels and selfies thrive because society collectively rewards them with attention.
Curbing reckless validation-seeking will require a cultural shift. Platforms must be held accountable for their algorithms. Schools and parents must foster inner resilience. Legal frameworks must treat dangerous acts not as harmless mischief but as threats to life. Only then can the cycle of risky validation be broken.
Until that happens, the arclights of social media will continue to draw young people into perilous performances, chasing likes at the cost of their safety.
What You Can Do: Practical Steps for Safer Social Media
For Young Users
Pause before posting: Ask yourself—would I still do this if nobody was watching?
Choose safer ways to stand out: Showcase creativity, humor, or talent instead of risky stunts.
Curate your feed: Follow accounts that inspire, educate, or entertain positively. Unfollow or mute pages that glorify dangerous trends.
Check the source: If a challenge looks risky, it probably is. Don’t trust trends just because they’re viral.
For Parents and Families
Open conversations early: Talk about self-worth, validation, and peer pressure before children start using social media.
Model digital responsibility: Kids copy what they see—show them mindful sharing in your own habits.
Set healthy boundaries: Delay access to platforms where possible, and encourage offline hobbies that build confidence.
Be approachable, not just watchful: Teens are more likely to share concerns if they don’t fear punishment.
For Schools and Educators
Integrate digital literacy: Teach students how algorithms work and how online behavior shapes mental health.
Create alternative stages: Give students offline platforms—debates, art exhibitions, performances—where they can shine.
Collaborate with experts: Invite mental health professionals and cyber psychologists for awareness workshops.
For Platforms and Policymakers
Algorithmic guardrails: Stop amplifying content that glorifies danger.
Age-gating enforcement: Ensure stronger checks on underage users.
Transparent reporting: Regularly disclose how much harmful content is removed and how algorithms are being adjusted.
Support cultural shifts: Highlight and reward safe, creative challenges to reset what “going viral” looks like.
Social media doesn’t have to be a stage for reckless validation. By combining personal responsibility, family guidance, institutional education, and platform accountability, we can build a culture where being seen online doesn’t come at the cost of being safe offline.