Facebook parent firm Meta has filed a lawsuit against a Hong Kong-based tech company for flooding its platforms with sexually explicit and fake non-consensual images.
According to media reports, Meta sued CrushAI's parent Joy Timeline HK Limited for using artificial intelligence to take images of clothed people and turn them into nudes.
CBS News reported that Meta has banned the company from advertising its services on Meta's platforms.
"We are seeing a concerning growth of so-called ‘nudify’ apps, which use AI to create fake non-consensual nude or sexually explicit images. Meta has longstanding rules against non-consensual intimate imagery, and over a year ago we updated these policies to make it even clearer that we don’t allow the promotion of nudify apps or similar services. We remove ads, Facebook Pages and Instagram accounts promoting these services when we become aware of them, block links to websites hosting them so they can’t be accessed from Meta platforms, and restrict search terms like ‘nudify’, ‘undress’ and ‘delete clothing’ on Facebook and Instagram so they don’t show results," Meta said in a blogpost.
"The legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a statement.
"We will continue to take necessary steps- which could include legal action against those who abuse our platforms like this," it added.
Joy Timeline is not the first to misuse the technology; other apps have previously circumvented the ad filters of social media platforms, including Meta's, to hawk their software.
Meta claimed that the Hong Kong-based app devised several ways of skirting past its ad filter, such as using inoffensive imagery to fly under the radar.
Experts have warned that there have been at least 10,000 ads promoting nudify apps on Facebook and Instagram.
The threat with such software is that anyone could feasibly take a photo of a person without consent and turn it into a nude.
Meta said it has banned non-consensual intimate imagery on its platforms, and added it will remove ads for "nudify" apps.
Last week, Meta said it would work with the Tech Coalition's Lantern Program, aimed at tracking sites that break child safety rules. The tech giant will share information with other tech firms about apps, sites, or companies that violate its policies.