YouTube update on progress fighting terrorist and violent extremist content online

Machine learning is helping create systems that can remove extremist content quickly and (importantly) accurately.

* "Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag."

* " ... in many cases our systems have proven more accurate than humans at flagging videos that need to be removed."

* "... over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we’ve taken this kind of content down."

* YouTube is "working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue."

Videos that are flagged as hate speech or violent extremism but don't actually meet the criteria for removal will be placed in a "limited state".

The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter.

Learn more on the YouTube blog.