Jasper Ward
Feb 4 (Reuters) – The United Nations Children’s Fund called on countries on Wednesday to criminalize artificial intelligence-generated child sexual abuse content, saying it was alarmed by reports of an increase in the number of artificial intelligence images that sexualize children.
The agency also urged developers to implement safe design methods and guardrails to prevent misuse of AI models. Digital companies should strengthen content moderation by investing in detection technology to prevent the spread of these images, the report said.
“The harm caused by the misuse of deepfakes is real and urgent. Children can’t wait for the law to catch up,” UNICEF said in a statement. Deepfakes are artificial intelligence-generated images, videos and audio that convincingly imitate real people.
UNICEF has also raised concerns about the so-called "nudification" of children, in which artificial intelligence is used to remove or alter clothing in photos to create fabricated nude or sexualized images.
According to UNICEF, in the past year at least 1.2 million children across 11 countries reported that their images had been manipulated into sexually explicit deepfakes.
Britain said on Saturday it planned to make it illegal to use artificial intelligence tools to create images of child sex abuse, becoming the first country to do so.
Concerns about the use of artificial intelligence to generate child abuse content have grown in recent years, particularly around chatbots such as Grok, from Elon Musk's xAI, which has come under scrutiny for producing pornographic images of women and minors.
A Reuters investigation found that chatbots continued to generate these images even when users explicitly stated that the subjects did not consent.
xAI said on January 14 that it was limiting image editing by Grok AI users and blocking users from generating images of people in revealing clothing in "jurisdictions where this is illegal" based on their location. It did not identify those countries. Earlier, it had restricted Grok's image generation and editing features to paying subscribers.
(Reporting by Jasper Ward in Washington; Editing by Michelle Nichols and Rod Nicholl)