Two major deepfake pornography websites have recently begun blocking users in the United Kingdom. The move follows the UK government's announcement of a new law that will criminalize the creation of nonconsensual deepfakes.
The rise of websites and apps that create nonconsensual deepfake pornography or manipulate photos to "remove" clothing has been a growing concern, particularly because of the harm inflicted on the women they target.
Professor Clare McGlynn from Durham University views this development as a significant step in combating deepfake abuse. She believes that this move will disrupt the easy access and normalization of deepfake sexual abuse material.
Since deepfake technology emerged in 2017, it has been used to create nonconsensual sexual images, primarily targeting women by inserting their faces into pornographic videos or generating fake nude images. The technology's accessibility has fueled a proliferation of such websites and apps, with instances of schoolchildren creating explicit images of their classmates.
Reports indicate that the two websites are now inaccessible in the UK, with notices on their landing pages informing visitors of the block. It remains unclear whether the restrictions stem from legal orders or are temporary measures.
The UK’s communications regulator, Ofcom, has the authority to take action against harmful websites under existing online safety laws, although these powers are still in the consultation phase.
These restrictions are expected to reduce the number of people in the UK accessing or creating deepfake sexual abuse content. Data from Similarweb shows that the larger of the two websites drew 12 million visitors worldwide last month, 500,000 of them from the UK.