Image source: Getty Images
Image caption: A whistleblower claimed that Facebook Groups were where many of the “abhorrent behaviors” were occurring.

A former Facebook employee has told US authorities that the company’s efforts to remove child abuse material from the platform were inadequate and under-resourced.

The allegations were made in documents seen by BBC News and submitted to the US Securities and Exchange Commission two weeks ago.

The anonymous whistleblower claims that moderators are “not sufficiently trained” and are ill-prepared.

In a statement, Facebook said it had zero tolerance for child abuse and used cutting-edge technology to combat it.

“We have helped fund and build tools used across the industry to investigate these terrible crimes, rescue children and bring justice to victims.”

The company also said it had shared its anti-abuse technology with other companies.

The revelations follow earlier testimony to the US Congress by former Facebook insider Frances Haugen, who said the company’s platforms “harm children, stoke discord and damage our democracy”.

She also gave evidence this week to the UK parliamentary committee examining the Online Safety Bill.

Senior executives from TikTok, Google, YouTube and Facebook are also due to give evidence.

The latest revelations come from an unnamed whistleblower with knowledge of the Facebook teams set up to intercept harmful material.

In a sworn statement to the SEC, which regulates securities markets and protects investors, the person said the problem of illegal material was not being solved because “adequate resources” were not made available.

It is claimed that the small team working on software to detect child abuse imagery was disbanded.

Facebook says it uses technologies known as PhotoDNA and VideoDNA, which automatically scan for images of child abuse and refer them to the US National Center for Missing and Exploited Children.

Other accusations from the whistleblower include:

  • Facebook cannot track all child abuse material because it does not know the full extent of the problem.
  • Senior managers reportedly faced the constant question: “What’s your return on investment?”

According to the whistleblower, that question is legitimate for business purposes but not appropriate for safety concerns such as child sexual abuse.

The five-page legal document also warns about Facebook Groups, which it describes as “facilitating harm”.

Many of these groups are visible only to members, and it is there that “a lot more terrifying and abhorrent behaviors occur”.

Code words, which change constantly, are used to describe the kind of paedophile content involved and are shared via Facebook’s encrypted Messenger or WhatsApp.

Facebook’s system, the document says, depends on a self-policing model that is neither reasonable nor possible to enforce.

Facebook told the BBC that it does scan private groups for content that violates its policies and that it has 40,000 people working on safety and security, with an investment of more than $13bn (£9.4bn) since 2016.

The company said it had taken action on 25.7 million pieces of content relating to child sexual abuse in the second quarter of 2021.

Sir Peter Wanless, chief executive of the NSPCC, said: “These revelations raise deep and troubling questions about Facebook’s commitment to fighting illegal child abuse on its services.

“The evidence suggests Facebook is abdicating its responsibility for addressing child sexual abuse material.”

The ex-employee concluded the statement: “Unless there’s… the credible threat of legislative and/or criminal action, Facebook will not change.”

Source: BBC.com
