

Ofcom has fined 8579 LLC, a company operating pornographic websites, £1.35 million for failing to implement adequate age verification measures.
The media regulator determined that the company’s sites lacked “highly effective” systems to confirm that UK visitors were over 18, thus failing to prevent children from accessing adult content.
This penalty represents Ofcom’s largest fine imposed under the Online Safety Act (OSA) to date. The investigation into 8579 LLC commenced shortly after age verification rules came into effect in July 2025.
In addition to the primary fine, 8579 LLC received a further £50,000 penalty for not responding to Ofcom’s requests for information.
Ofcom stated that the company did not implement effective age checks on most of its adult sites from July 25 until at least November 19, 2025.
The regulator has mandated that 8579 LLC must implement robust age verification on one remaining site by Monday at 17:00 GMT, or face an additional daily fine of £1,000.
George Lusty, Ofcom’s director of enforcement, emphasized the regulator’s clear stance that adult sites must deploy strong age checks to protect children in the UK from viewing pornography. He added that companies failing to comply or ignoring legally binding requests should anticipate fines.
Ofcom also requires 8579 LLC to provide a comprehensive list of all websites it operates, following its failure to respond to previous information requests. Non-compliance with this demand could result in an additional daily fine of £250.
The regulator has previously opened investigations into numerous porn sites lacking age checks, issuing decisions and fines against some operators.
For instance, in December, AVS Group Ltd was fined £1 million for ongoing non-compliance with the OSA. Ofcom later confirmed that AVS had subsequently introduced age checks on some of its pornographic sites.
Separately, on February 2, Pornhub began restricting access to its website for UK users. Aylo, Pornhub’s parent company, commented that the OSA had “not achieved its goal of protecting minors” and had instead “diverted traffic to darker, unregulated corners of the internet.”
The Online Safety Act establishes a framework of laws and duties that online platforms must adhere to, with Ofcom responsible for its implementation and enforcement.
Under the Act’s Children’s Codes, platforms are also required to prevent young people from encountering harmful content, including material related to suicide, self-harm, eating disorders, and pornography.



