- Better detection and faster removal powered by machine learning;
- More expert partners to help identify violative content;
- Tougher standards for videos that are controversial but do not violate our policies; and
- Amplified voices speaking out against hate and extremism.
Better detection and faster removal
We’ve always used a mix of human flagging and human review together with technology to address controversial content on YouTube. In June, we introduced machine learning to flag violent extremism content and escalate it for human review. We continue to get faster here:
- Over 83 percent of the videos we removed for violent extremism in the last month were taken down before receiving a single human flag, up 8 percentage points since August.
- Our teams have manually reviewed over a million videos to improve this flagging technology by providing large volumes of training examples.
More expert partners to help identify violative content
Outside experts are essential in advising us on our policies and in flagging content, providing additional inputs that help train our systems. Our partner NGOs bring expert knowledge of complex issues like hate speech, radicalization, and terrorism.
We have added 35 NGOs to our Trusted Flagger program, which puts us 70 percent of the way toward our goal. These new partners represent 20 different countries and include the International Center for the Study of Radicalization at King’s College London and The Wahid Institute in Indonesia, which is dedicated to promoting religious freedom and tolerance.
Tougher standards for controversial videos
We have started applying tougher treatment to videos that aren’t illegal and don’t violate our Guidelines but contain controversial religious or supremacist content. These videos remain on YouTube, but they sit behind a warning interstitial, aren’t recommended or monetized, and don’t have key features such as comments, suggested videos, and likes. This approach is working as intended: it helps us strike a balance between upholding free expression, by preserving a historical record of content in the public interest, and keeping these videos from being widely spread or recommended to others.
Amplified voices speaking out against hate and extremism
We continue to support programs that counter extremist messages. We are exploring how to expand Jigsaw's Redirect Method to new languages and search terms. We’re also investing heavily in our YouTube Creators for Change program, which supports Creators who use YouTube to tackle social issues and promote awareness, tolerance, and empathy. Every month these Creators release exciting and engaging new videos and campaigns to counter hate and social divisiveness:
- In September, three of our fellows from Australia, the U.K., and the U.S. debuted their videos on the big screen at the Tribeca TV Festival, tackling topics like racism, xenophobia, and the experiences of first-generation immigrants.
- Local YouTube Creators in Indonesia partnered with the MAARIF Institute and YouTube Creators for Change Ambassador, Cameo Project, to visit ten different cities and train thousands of high school students on promoting tolerance and speaking out against hate speech and extremism.
- We’re adding two new local Creators for Change chapters, in Israel and Spain, to the network of chapters around the world.
Terrorist and violent extremist material should not be spread online. We will continue to invest heavily in fighting the spread of this content, to provide updates to governments, and to collaborate with other companies through the Global Internet Forum to Counter Terrorism. There remains more to do, and we look forward to continuing to share our progress with you.
The YouTube Team