A new law in the UK will compel technology companies to remove intimate images shared without consent within 48 hours, or face significant fines and the blocking of their services.
Victims will reportedly need to report such images only once. The Department for Science, Innovation and Technology (DSIT) said images should be removed across multiple platforms simultaneously, with any new uploads automatically deleted. This could be achieved through digital markings that allow an image to be flagged each time it is reposted.
The government’s announcement follows a period in which xAI’s Grok chatbot generated millions of “nudified” images in response to user prompts. After widespread criticism, and two days after Ofcom initiated a probe, xAI owner Elon Musk announced on January 15 that the chatbot would be blocked from creating such images. Musk also suggested that geoblocking could be used so the content would remain available only in jurisdictions where its creation is not illegal.
In an opinion piece published in the Guardian, Prime Minister Keir Starmer described the proliferation of nonconsensual intimate images as a “national emergency.” He emphasized the importance of ensuring that such images are removed from all platforms after a single report.
Starmer highlighted the burden on victims: “Victims have been left to fight alone – chasing takedown of harmful content site to site, reporting the same material again and again, only to see it reappear elsewhere hours later. That is not justice.” He added, “The burden of tackling abuse must no longer fall on victims. It must fall on perpetrators – and on the companies that enable harm.”
The 48-hour deadline will be introduced through an amendment to the country’s Crime and Policing Bill, according to DSIT. Companies that miss it could face fines of up to 10% of their worldwide revenue and have their services blocked in the UK. DSIT also plans to issue guidance for internet providers on blocking hosting sites, targeting “rogue websites” that fall outside the Online Safety Act’s scope.
On Wednesday, Ofcom announced it would fast-track its decision on a proposed measure requiring firms to use “proactive” technology, such as “hash matching,” to block illegal intimate images, including sexual deepfakes, at the source. Hash matching derives a unique digital fingerprint from an image’s data, so that new uploads can be compared against a database of known content. Ofcom expects to announce its decision in May, with the changes potentially taking effect this summer, pending parliamentary approval.
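Neither DSIT nor Ofcom has published a technical specification, but the basic concept is straightforward to sketch. The Python snippet below is a minimal, hypothetical illustration of hash matching using an exact cryptographic hash; the function names and in-memory blocklist are invented for illustration. Production systems, such as Microsoft’s PhotoDNA or Meta’s open-source PDQ, instead use perceptual hashes that survive resizing and re-encoding, since an exact hash is defeated by changing a single pixel.

```python
import hashlib

# Hypothetical store of fingerprints for reported images. In a real
# deployment this would be a database shared across participating
# platforms, so one report propagates everywhere.
BLOCKED_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Derive a fixed-length fingerprint from the raw image data."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_reported_image(image_bytes: bytes) -> None:
    """Record a victim's single report so later uploads can be matched."""
    BLOCKED_HASHES.add(fingerprint(image_bytes))

def should_block_upload(image_bytes: bytes) -> bool:
    """Check an incoming upload against the database of reported images."""
    return fingerprint(image_bytes) in BLOCKED_HASHES
```

The design goal this sketch captures is the one Starmer describes: the fingerprint is computed once from a single report, after which every subsequent upload of the same image can be blocked automatically, shifting the burden of repeated takedown requests away from victims.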
Furthermore, the government intends to make creating or sharing intimate images without consent a “priority offense” under the UK’s Online Safety Act, placing it in the same category of severity as child abuse images and terrorism content.

