In December we shared how we’re expanding our work to remove content that violates our policies. Today, we’re providing an update and giving you additional insight into our work, including the release of the first YouTube Community Guidelines Enforcement Report.
Providing More Information
We are taking an important first step by releasing a quarterly report on how we’re enforcing our Community Guidelines. This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add more data, including data on comments, speed of removal, and reasons for removal.
We’re also introducing a Reporting History dashboard that each YouTube user can individually access to see the status of videos they’ve flagged to us for review against our Community Guidelines.
Machines Helping to Address Violative Content
Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed. And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).
Highlights from the report — reflecting data from October – December 2017 — show:
- We removed over 8 million videos from YouTube during these months. The majority of these videos were spam or people attempting to upload adult content, and they represent a fraction of a percent of YouTube’s total views during this time period.
- 6.7 million of these videos were first flagged for review by machines rather than humans.
- Of those 6.7 million videos, 76 percent were removed before they received a single view.
For example, at the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views. We introduced machine learning flagging in June 2017, and now more than half of the videos we remove for violent extremism have fewer than 10 views.