In its first report on the enforcement of its community guidelines, YouTube says it removed 8,284,039 videos for violating content policies between October and December 2017. Of the more than 8 million videos removed, 6.7 million were first flagged by machines rather than humans – and, according to YouTube, 76% of those 6.7 million videos were removed before receiving a single view.
"Machines allow us to flag content for review at scale, helping us remove millions of violating videos before they are ever viewed," the YouTube team writes on the site's official blog. "And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam)."
YouTube also reported that more than 9 million videos were flagged by human reviewers for various content policy violations, ranging from spam or misleading content to videos containing hateful or abusive content or promoting terrorism.
According to the report, flagged content remains on the site as long as it does not violate the community guidelines. In addition to offering statistics on flagged videos, YouTube has created a "Report History Dashboard," where users can see the status of every video they have reported for inappropriate content.
In 2017, Google committed to hiring 10,000 employees by the end of the year to help identify violent content uploaded to YouTube. At this point, the company says the majority of the additional roles needed to meet its contribution to that goal have been filled, but it has not given a specific number of hires. It said it has recruited "full-time specialists in violent extremism, counter-terrorism and human rights."
But brand safety on YouTube remains a major concern. On April 20, CNN released an investigative report showing ads from a number of well-known brands on YouTube – including Nissan, Under Armour, Amazon, Hershey, Netflix and Hilton – running alongside extremist content.
Nissan and Under Armour said they were pausing their campaigns on YouTube after being informed that their ads had appeared alongside extremist video content, and Hilton said it was removing its ads from the site.
A YouTube spokesperson sent the following comment to Marketing Land in response to CNN's report:
We have partnered with our advertisers to make significant changes to the way we approach monetization on YouTube, with stricter policies, better controls and greater transparency. When we find that ads have mistakenly run against content that does not comply with our policies, we remove those ads immediately. We know that even when videos meet our guidelines, not all videos will be appropriate for all brands. But we are committed to working with our advertisers and getting this right.
The quarterly report released by YouTube today, the first of what it says will be a regular update, aims to provide more transparency about the amount of content the site reviews and removes. And even though more than 8 million removed videos is a significant number, it does not necessarily offer much insight, since we do not know how many videos in total were uploaded to YouTube during the same period.
Last year, YouTube faced a number of brand safety issues, leading several brands to boycott the site. Since then, the company has devoted much of its effort to regaining advertisers' confidence, but CNN's latest report suggests that YouTube's investments in policing violent and extremist content are still falling short.