‘Nudification’: Government confirms plan to ban certain software in war on violence against women and girls
It is already a criminal offense in the United Kingdom to create explicit deepfake images of people without their consent.
Deepfakes of this kind are fabricated images that place a real person’s face on a nude body.
But now, in an effort to tackle violence against women and girls, the government has confirmed a further step: a ban on software programs that provide “nudification” services.
“Women and girls deserve to be safe online as well as offline,” explained government Technology Secretary Liz Kendall. “We will not stand by while technology is weaponized to abuse, humiliate and exploit them through the creation of non-consensual sexually explicit deepfakes.”
Those found in violation, she said, “will feel the full force of the law.”
The new plan will take the law further than it already goes, making possession of such software itself a criminal offense.
“The act of making such an image is rightly illegal – the technology enabling it should also be,” Dame Rachel de Souza, the nation’s children’s commissioner, said earlier.
Kerry Smith, of the Internet Watch Foundation, which targets child sexual abuse online, welcomed the measures, commenting: “We are also glad to see concrete steps to ban these so-called nudification apps which have no reason to exist as a product,” according to a report at the Christian Institute.
The BBC said the nation’s Online Safety Act already starts to address the problem of software that moves one person’s face to another body, or appears to remove a person’s clothing in an image.
Those processes, the report said, “use generative AI to realistically make it look like a person has been stripped of their clothing in an image or video.”
Experts already have been warning of the serious harm that such actions could inflict.
Tech company SafeToNet said it already has software that can identify and block sexual content, and that can disable device cameras when sexual content is detected being captured.