Child abuse images

An inquiry in the UK has concluded that internet companies must do more to tackle an "explosion" of child sexual abuse images on their platforms.

The panel also said that technology companies had "failed to demonstrate" that they were fully aware of the number of children under the age of 13 using their services, and lacked plans to address the issue.

It called for all images to be screened before publication.

It said more stringent age checks were needed.

"Reputational damage"

Facebook, Instagram and Snapchat were named as the apps most commonly cited in grooming cases.

The industry as a whole was accused of being "reactive rather than proactive" in addressing these issues.

The inquiry said that action often appeared to be taken to avoid reputational damage, rather than to put protection first.

The report follows a series of public hearings, held between January 2018 and May 2019, during which police said they believed the UK to be the world's third largest consumer of child sexual abuse material.

'Evil crime'

Facebook was one of the first companies to respond.

"We have made huge investments in sophisticated solutions," said David Miles, the company's head of safety in Europe.

"As this is a global, industry-wide issue, we will continue to develop new technologies and work with law enforcement and child protection experts to keep children safe."

Microsoft also promised to "consider these findings carefully", while Google said it would continue to work with others to "tackle this evil crime".

Illegal images

The report said several measures should be introduced by the end of September.

Top of the list is the screening of images before they appear online.

The report states that technologies such as Microsoft's PhotoDNA allow images to be quickly checked against a database of known illegal material, without staff having to view them manually.

At present, however, this screening is typically carried out only after the material has already been made available for others to view.
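The screening the report describes boils down to a database lookup: hash each uploaded image and block it if the hash is already on a list of known illegal material. The sketch below illustrates the idea in Python. Note that PhotoDNA itself is a proprietary *perceptual* hash that survives resizing and re-encoding; the plain SHA-256 digest used here is a stand-in that only matches exact byte-for-byte copies, and the function and hash values are illustrative, not part of any real system.

```python
import hashlib

# Hypothetical block-list: in practice this would be populated from an
# industry-shared database of hashes of known illegal images.
BLOCKED_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def screen_before_publish(image_bytes: bytes) -> bool:
    """Return True if the image may be published, False if it is blocked.

    Uses SHA-256 as a simple stand-in for a perceptual hash such as
    PhotoDNA, so only exact copies of listed images are caught.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest not in BLOCKED_HASHES

print(screen_before_publish(b"known-bad-image-bytes"))   # False (blocked)
print(screen_before_publish(b"harmless-holiday-photo"))  # True (allowed)
```

The design point the report makes is simply one of ordering: this check is cheap enough to run *before* an image goes live, rather than after.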

Official ban

The panel acknowledged that users might be frustrated by a delay before their content appears, but said there was no technical reason why the screening could not be completed before publication.

The inquiry also said the UK government should introduce legislation compelling companies to adopt more effective age checks to prevent use by minors.

It said the risk of being groomed was "particularly acute" for pre-pubescent children.

The panel acknowledged that many services officially bar use by children under the age of 13.

Child nudity

But it said that in many cases the only check is asking users to enter a date of birth, which can easily be falsified.

It said: "There must be better ways to ensure compliance."

The report acknowledges that it is difficult to detect and stop the spread of such material in real time, but highlights a French app as an example to learn from.

It said Yubo uses an algorithm to detect possible instances of child nudity, after which a human moderator checks whether action is needed.

Some suggest the large social networks could learn from their smaller rival Yubo

The expert group also pointed out that existing detection technologies do not work when communications are protected by end-to-end encryption, which digitally scrambles messages so that not even the platform provider holds a key to read them.

The inquiry noted that WhatsApp, Apple's iMessage and FaceTime already use the technology by default, and that Facebook intends to deploy it more widely.

However, it did not say how this problem should be resolved.
