
Meta is set to go on trial in New Mexico next week over allegations that Facebook and Instagram knowingly exposed children to sexual exploitation and harmful content, intensifying scrutiny of how social media platforms protect minors.
The trial will begin on Monday, February 2, in Santa Fe District Court and is expected to run for nearly two months. The case was brought by New Mexico Attorney General Raúl Torrez and stems from a 2023 undercover investigation conducted by the state.
As part of the operation, known as Operation MetaPhile, investigators created accounts on Facebook and Instagram while posing as children under the age of 14. According to the lawsuit, those accounts were quickly exposed to sexually explicit material and were contacted by adults allegedly seeking illegal content involving minors.
Prosecutors argue that the findings show Meta’s platforms made it easy for predators to identify and reach potential victims. The Attorney General’s office says the operation has already led to criminal charges against three individuals and claims the results reflect systemic weaknesses rather than isolated failures.
Beyond exposure to illegal content, the lawsuit accuses Meta of designing its platforms in ways that heighten risks for young users. The complaint alleges that features such as infinite scrolling feeds and auto-play videos were intentionally built to encourage prolonged and compulsive use, increasing the likelihood that children would encounter harmful material or be targeted by bad actors.
The state further contends that Meta was aware of these risks but failed to take sufficient action, effectively giving predators what prosecutors describe as unrestricted access to minors on its platforms.
Meta has strongly denied the allegations. In a statement responding to the lawsuit, the company described the claims as sensationalised and said they rely on selectively chosen internal documents taken out of context. Meta maintains that it has spent more than a decade working with parents, child safety experts and law enforcement agencies to reduce harm and improve protections for young users.
The company is also expected to argue that it is legally shielded from liability under the First Amendment and Section 230 of the Communications Decency Act, which generally protects online platforms from being held responsible for user-generated content.
Prosecutors, however, are likely to challenge that defence by focusing on Meta’s own design decisions and internal policies, rather than the actions of individual users. Evidence presented at trial may reference whistleblower allegations from 2021 that suggested Meta was aware of mental health and safety risks to younger users but failed to act decisively.
The case may also revisit earlier scrutiny of internal policies that once allowed Meta’s AI chatbots to engage in romantic or sensual conversations with minors, raising additional questions about oversight and safeguards in product development.
The outcome of the trial could carry broader implications for how social media companies are regulated in the United States, particularly when claims centre on platform design rather than third-party content. As pressure mounts on technology firms over child safety, the New Mexico case is likely to be closely watched by regulators, lawmakers and the wider tech industry.