Google And YouTube Ignore Paedophile Flaggers

Argos, TalkTalk and Deutsche Bank do not have much in common, but they do have this: all have unwittingly had advertisements placed next to YouTube videos of prepubescent girls viewed millions of times. Some of those viewers will be other young girls, but a number will be paedophiles. Every time these videos are viewed, YouTube makes money. Every time it is alerted to the fact that it is profiting from the inappropriate and potentially criminal misuse of its content, it claims to take that content down. Even if this were true, it would be an unacceptably passive approach to a problem that demands far more proactive monitoring and enforcement. As it is, the company is glossing over what is really happening.

Investigations by The Times have found hundreds of videos identified as inappropriate by YouTube’s own “flaggers” yet still left online. As one of these flaggers said, “only when journalists start asking does YouTube do something.” The pattern is now familiar. In June we reported that multiple household-name advertisers and all three main British political parties had paid for advertisements positioned by YouTube’s algorithms next to jihadist videos, including some showing extreme violence. When scores of advertisers temporarily withdrew their business, YouTube and its parent company, Google, promised to do better. Five months later we showed that paedophiles were gravitating to YouTube channels, including one called Toy Freaks, in which children supposedly having fun were in fact shown in pain or distress. The company’s claim to have acted promptly in taking the channel down did not explain why it had been allowed to remain active for six years, gaining seven billion views, despite having been flagged many times.

YouTube has made much of the role of its flaggers in recent months, but they are volunteers with no formal training, and the company appears to have only a few dozen of them worldwide. According to one estimate there may be as few as three who focus on material viewed by paedophiles. The volunteers are not being taken seriously enough. The paedophiles are online predators. Many congregate under assumed names in YouTube’s comments sections to swap videos, request more explicit material from children they have viewed and inveigle them into responding to private messages. They operate with impunity, and so does YouTube. Its business has grown so fast for so long that it appears to think it need pay only token attention to the protection and privacy of the young and sometimes vulnerable people who provide it with content.

In the US, YouTube and other social media companies argue that so-called safe harbour rules absolve them of much of their responsibility as publishers. They cannot hide behind such small print for ever. They have a clear moral duty to do more. Specifically, YouTube should hire more paid staff to monitor its content, and it needs to adjust its software to recognise known paedophiles’ code words. Appeals such as this to Big Tech’s sense of morality have often fallen on deaf ears, but legal and commercial imperatives are less easy to tune out. To protect their brands, Twitter and Facebook have moved fast against Holocaust deniers and state-backed Russian advertisers. YouTube should take note. It is not immune.

Last month Alphabet, the holding company for both Google and YouTube, reported near-record revenues of $27.7 billion for the third quarter of this year. Results from Amazon and Microsoft came out at the same time and were nearly as impressive. In the next hour’s trading these three behemoths added $80 billion in combined market value, equivalent to half the GDP of Kazakhstan. Money is power. Alphabet has yet to show it understands that with power comes responsibility.

Credit: The Editor, The Times, 24 November 2017.