YouTube has disclosed that it removed more than 8 million videos in the last three months of 2017.
Publishing its first quarterly report since the release of its Community Guidelines, the streaming site said the majority of the removed videos were either spam or attempts to upload adult content.
The Google-owned site said the removed videos “represent a fraction of a percent of YouTube’s total views during this time period”.
The published figures do not include content removed for breach of copyright.
Of those videos, 6.7 million were first flagged for review by machines rather than humans, and 76% were taken down before they received a single view.
“At the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views. We introduced machine learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than 10 views,” YouTube said in a blog post.
A Reporting History dashboard has also been introduced, letting individual users check the status of videos they have flagged for review against the Community Guidelines.
By the end of 2018, Google plans to have 10,000 employees focused on removing unacceptable content.
To help the process, YouTube is asking creators themselves to identify videos that aren’t brand-safe, Digiday reports. Creators are given a questionnaire to fill out when uploading a video, declaring what inappropriate content it may include. YouTube has tested the self-certification questionnaire with 15 creators and says it plans to expand the test to a few hundred more soon.