Tag Archives: Public Policy

Working together to combat terrorists online

Editor’s note: This is a revised and abbreviated version of a speech Kent delivered today at the United Nations in New York City, NY, on behalf of the members of the Global Internet Forum to Counter Terrorism.

The Global Internet Forum to Counter Terrorism is a group of four technology companies—Facebook, Microsoft, Twitter, and YouTube—that are committed to working together and with governments and civil society to address the problem of online terrorist content.

For our companies, terrorism isn’t just a business concern or a technical challenge; it’s a deeply personal threat. We are citizens of London, Paris, Jakarta, and New York. And in the wake of each terrorist attack, we too frantically check in on our families and co-workers to make sure they are safe. We’ve all had to do this far too often.

The products that our companies build lower barriers to innovation and empower billions of people around the world. But we recognize that the internet and other tools have also been abused by terrorists in their efforts to recruit, fundraise, and organize. And we are committed to doing everything in our power to ensure that our platforms aren't used to distribute terrorist material.

The Forum’s efforts are focused on three areas: leveraging technology, conducting research on patterns of radicalization and misuse of online platforms, and sharing best practices to accelerate our joint efforts against dangerous radicalization. Let me say more about each pillar.

First, when it comes to technology, you should know that our companies are putting our best talent and technology against the task of getting terrorist content off our services. There is no silver bullet when it comes to finding and removing this content, but we’re getting much better.

One early success in collaboration has been our “hash-sharing” database, which allows a company that discovers terrorist content on one of its sites to create a digital fingerprint and share it with the other companies in the coalition, which can then more easily detect and review similar content for removal.
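
To make that workflow concrete, here is a minimal sketch of how the matching step might look. It is purely illustrative: the names are hypothetical, and the production database relies on purpose-built content-fingerprinting techniques rather than the simple file digest used here.

```python
import hashlib
from pathlib import Path

# Hypothetical shared pool of fingerprints contributed by coalition members.
# The real database uses purpose-built content hashes; a plain SHA-256 digest
# of the file bytes stands in here only to illustrate the matching workflow.
shared_hashes: set[str] = set()

def fingerprint(path: Path) -> str:
    """Compute a digest of a media file to use as its fingerprint."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def share_fingerprint(path: Path) -> None:
    """A company that confirms terrorist content contributes its fingerprint."""
    shared_hashes.add(fingerprint(path))

def flag_for_review(path: Path) -> bool:
    """Other companies check new uploads against the shared pool.

    A match does not mean automatic removal; it simply queues the item
    for human review under each company's own policies.
    """
    return fingerprint(path) in shared_hashes
```

The important design point is that what gets shared is the fingerprint, not the content itself, and a match only surfaces material for each company to review under its own terms of service.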

We have to deal with these problems at tremendous scale. The haystacks are unimaginably large and the needles are both very small and constantly changing. People upload over 400 hours of content to YouTube every minute. Our software engineers have spent years developing technology that can spot certain telltale cues and markers. In recent months we have more than doubled the number of videos we've removed for violent extremism and have located these videos twice as fast. And what’s more, 75 percent of the violent extremism videos we’ve removed in recent months were found using technology before they received a single human flag.

These efforts are working. Between August 2015 and June 2017, Twitter suspended more than 935,000 accounts for the promotion of terrorism. During the first half of 2017, over 95 percent of the accounts it removed were detected using its in-house technology. Facebook is using new advances in artificial intelligence to root out "terrorist clusters" by mapping out the pages, posts, and profiles with terrorist material and then shutting them down.

Despite this recent progress, machines are simply not at the stage where they can replace human judgment. For example, portions of a terrorist video in a news broadcast might be entirely legitimate, but a computer program will have difficulty distinguishing documentary coverage from incitement.  

The Forum’s second pillar is focused on conducting and sharing research about how terrorists use the internet to influence their audiences so that we can stay one step ahead.

Today, the members of the Forum are pleased to announce that we are making a multi-million dollar commitment to support research on terrorist abuse of the internet and how governments, tech companies, and civil society can fight back against online radicalization.

The Forum has also set a goal of working with 50 smaller tech companies to help them better tackle terrorist content on their platforms. On Monday, we hosted dozens of companies for a workshop with our partners under the UN Counter-Terrorism Executive Directorate. There will be a workshop in Brussels in December and another in Indonesia in the coming months. And we are also working to expand the hash-sharing database to smaller companies.

The Forum’s final pillar is working together to find powerful messages and avenues to reach out to those at greatest risk of radicalization.

Members of the forum are doing a better job of sharing breakthroughs with each other. One success we’ve seen is with the Redirect Method developed at Alphabet’s Jigsaw group. Redirect uses targeted advertising to reach people searching for terrorist content and presents videos that undermine extremist recruiting efforts. During a recent eight-week study more than 300,000 users clicked on our targeted ads and watched more than 500,000 minutes of video. This past April, Microsoft started a similar program on Bing. And Jigsaw and Bing are now exploring a partnership to share best practices and expertise.

At the same time, we’re elevating the voices that are most credible in speaking out against terrorism, hate, and violence. YouTube’s Creators for Change program highlights online stars taking a stand against xenophobia and extremism. And Facebook’s P2P program has brought together more than 5,000 students from 68 countries to create campaigns to combat hate speech. Together, the companies have participated in hundreds of meetings and trainings to counter violent extremism, including events in Beirut, Bosnia, and Brussels, and summits at the White House, here at the United Nations, and in London and Sydney, all aimed at empowering credible non-governmental voices against violent extremism.

There is no magic computer program that will eliminate online terrorist content, but we are committed to working with everyone in this room as we continue to ramp up our own efforts to stop terrorists’ abuse of our services. This forum is an important step in the right direction. We look forward to working with national and local governments, and civil society, to prevent extremist ideology from spreading in communities and online.

Google’s fight against human trafficking

Google has made it a priority to tackle the heinous crime of sex trafficking. I know, because I’ve worked on this from the day I joined in 2012. We have hired and funded advocates in this area. We have developed and built extensive technology to connect victims with the resources they need. And we have helped pioneer the use of technologies that identify trafficking networks to make it easier and quicker for law enforcement to arrest these abusers. You can read about these efforts here. We’ve supported over 40 bills on human trafficking. And we are determined to do more to stop this evil, including support for tougher legislation. 

There is currently a debate over a proposed bill to combat sex trafficking by amending Section 230 of the Communications Decency Act. While we agree with the intentions of the bill, we are concerned that it erodes the “Good Samaritan” protection and would actually hinder the fight against sex trafficking. Large companies are more likely to continue their proactive enforcement efforts and can afford to fight lawsuits; but if smaller platforms are made liable for “knowledge” of human trafficking occurring on their platforms, there is a risk that some will seek to avoid that “knowledge” by simply not looking for it. This would be a disaster. We think it’s much better to foster an environment in which all technology companies can continue to clean their platforms and to support effective tools for law enforcement and advocacy organizations to find and disrupt these networks. We’ve met with the bill’s sponsors and provided alternatives that would encourage this environment, and we’ll continue to seek a constructive approach to advance a shared goal.

We’re not alone in this view. Organizations as broad and diverse as Engine Advocacy, PEN America, Charles Koch Institute, Heritage Action, ACLU, U.S. Chamber Technology Engagement Center, Business Software Alliance, Internet Commerce Coalition, Internet Association (whose members include Microsoft, Twitter, Facebook, Amazon, Snap, Match.com, Pinterest, etc.), TechFreedom, Medium, GitHub, Reddit, Wikimedia, National Venture Capital Association and many others have raised concerns about the bill. We—and many others—stand ready to work with Congress on changes to the bill, and on other legislation and measures to fight human trafficking and protect and support victims and survivors.

A lot of the discussion around this issue focuses on the role of a website called Backpage.com. I want to make our position clear: Google believes that Backpage.com can and should be held accountable for its crimes. We strongly applaud the work of the Senate Permanent Subcommittee on Investigations in exposing Backpage’s intentional promotion of child sex trafficking through ads. Based on those findings, Google believes that Backpage.com should be criminally prosecuted by the U.S. Department of Justice for facilitating child sex trafficking, something it can do today without the need to amend any laws. And years before the Senate’s investigation and report, we prohibited Backpage from advertising on Google, and we have criticized Backpage publicly.

I understand that when important legislation is being discussed, public debate is robust. That’s how it should be. But on the crucial issue of sex trafficking, we’ve been a deeply committed partner in the fight. Let’s not let a genuine disagreement over the likely unintended impact of a particular piece of legislation obscure that fact.

A significant step toward modernizing our surveillance laws

Last month, our General Counsel Kent Walker delivered a speech calling for a fundamental realignment of government access statutes in light of the growing role that technology plays in our daily lives, the expectation that communications should remain private, and the very real security threats that governments need to investigate.

In conjunction with the speech, we proposed a new framework oriented toward policy solutions that recognize legitimate law enforcement interests, respect the sovereignty of other countries, and reflect the reasonable expectation of privacy that users have in the content of their electronic communications.

The introduction of the International Communications Privacy Act (ICPA) by Senators Hatch, Coons, and Heller advances these objectives, and we commend these Senators for their leadership in this area.

ICPA would update the Electronic Communications Privacy Act (ECPA) in two important ways.

First, it would require U.S. government entities to obtain a warrant to compel the production of communications content from providers. For many years, we have called upon the U.S. Congress to update ECPA in this manner, and the House of Representatives has twice passed legislation (the Email Privacy Act) that would achieve this goal.

Second, it provides clear mechanisms for the U.S. government to obtain user data from service providers with a warrant, wherever the data may be stored, but with protections built in for certain cases when the users are nationals of other countries and are located outside the U.S.

We are eager to work with Members of Congress to enact ICPA into law, and we look forward to the opportunity to help advance this important bill.

Applications now open for the Google Policy Fellowship in Europe and Africa

Are you an undergraduate, graduate or law student interested in internet and technology policy? Do you want to get involved in the public dialogue on these issues? If so, the new Google Policy Fellowship pilot programs in Italy, Belgium (Brussels), and three African countries may be for you.  

Successful applicants to the program will have the opportunity to work at public interest organizations at the forefront of debates on internet policy issues. They will be assigned a mentor at their host organizations and will have the opportunity to work with senior staff members.

Fellows will be expected to make substantive contributions to the work of their organization, including conducting policy research and analysis, drafting reports and white papers, attending government and industry meetings and conferences, and participating in other advocacy activities.

The work of the fellows is decided between the individuals and the organizations. Google provides a small stipend during the period of the fellowship, but has no involvement in defining or conducting the research. Typically, the fellows are postgraduates and they work with the organization on an area of research or study.

For example, in previous years, a fellow with the Strathmore Law School in Nairobi, Kenya, carried out a review of cyber-security conventions around the world, and a fellow at the Ghana-India Kofi Annan Centre of Excellence in ICT in Ghana helped to establish the Creative Commons chapter for Ghana before returning to university to finish her Ph.D. All work is carried out independently of Google.

Who should apply?

The organizations in the program are looking for students who are passionate about technology and want to gain experience working on public policy. Students from all majors and degree programs who possess the following qualities are encouraged to apply:

  • Demonstrated or stated interest in Internet and technology policy
  • Excellent academic record, professional/extracurricular/volunteer activities, subject matter expertise
  • First-rate analytical, communications, research, and writing skills
  • Ability to manage multiple projects simultaneously and efficiently, and to work smartly and resourcefully in a fast-paced environment

Brussels pilot

We are pleased to offer three fellowships, starting in September 2017, at the organizations listed below. These placements will run for six months and the stipend will vary slightly from organization to organization. To apply, please use the link below and send a short email, together with a CV. Deadline for applications is July 31, 2017.

Italy pilot

We’re pleased to offer six fellowships, starting in October 2017, and lasting up to six months, at the organizations listed below. To apply, please send a short email to the address below, together with a CV. Deadline for applications is August 27, 2017.

Africa program

We’re pleased to offer eight fellowships, starting from late August 2017, across Sub-Saharan Africa. The program will run for six to twelve months, with the exact duration varying by organization. Detailed job descriptions can be viewed here. To apply, please complete the form at 2017 Africa Google Policy Fellowship Application. Deadline for applications is August 5, 2017. Below is a list of organizations and locations for the fellowships.

A new look for our Transparency Report

In 2010, we launched the government requests tool, a new way to publicly document government requests for user data and content removals. It was the first report of its kind and a natural extension of our mission to make information accessible and useful. In the years since, our simple tool evolved into the Transparency Report, a multifaceted snapshot of the ways governments and corporations affect online security, privacy, and the free flow of information.

The web has evolved too, and has become central to people’s lives: 400 hours of video are uploaded to YouTube every minute, and more than one billion people rely on Gmail and Chrome every day. And this type of reporting, once an anomaly, has become the norm across the tech industry and beyond. More than 40 companies now have transparency reports; that’s great news for people everywhere.

But while the report itself expanded in scope and coverage, its design remained largely unchanged. Not only was it due for an update; we also heard from users that it could be easier to navigate.

So today we’re introducing the completely revamped Transparency Report. It features clearer data visualizations, more context for the data, a Recent Updates section so you can see what’s new, and a better way to download data from our most popular reports. And while the previous version was a patchwork of different reports, designed at different times in different styles, our new report is all one consistent design, making it easier to find exactly what you’re looking for.

We’re continuing to invest in this report because we’ve seen firsthand how it can help inform and shape the public debate about information online. The data also acts as a lens into significant moments in the history of the web, fundamental changes to security, and our efforts to be transparent about data and how it is used. Here are a few examples:

Our Traffic and Disruptions report documents real-time disruptions to usage of our products. Here’s what the report looked like for search in Egypt in January of 2011 when internet access was restricted during the Arab Spring.

[Chart: Traffic to Google Search in Egypt, January 2011]

As we say in the report, “when you send or receive emails from a provider that doesn’t encrypt messages in transit, they are as open to snoopers as a postcard in the mail.” In 2014, we started reporting on the state of email encryption across the industry and which providers offer this protection. It’s been encouraging to see how these trends have changed: since the launch of the report, outbound email encryption has gone from 73 percent to 88 percent, and inbound email encryption from 61 percent to 88 percent. We hope these numbers continue to increase in the years ahead.

[Chart: Inbound and outbound email encrypted in transit]
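
For readers curious what “encryption in transit” means in practice, here is a minimal sketch, not drawn from the report itself, of how one might check whether a receiving mail server advertises STARTTLS, the mechanism most providers use to encrypt email as it travels between servers. The hostname shown is a placeholder.

```python
import smtplib

def supports_starttls(host: str, port: int = 25, timeout: float = 10.0) -> bool:
    """Return True if the mail server at `host` advertises the STARTTLS
    extension, i.e. it can upgrade the connection to encrypted transport."""
    with smtplib.SMTP(host, port, timeout=timeout) as server:
        server.ehlo()  # ask the server to list the extensions it supports
        return server.has_extn("starttls")

# Example usage (placeholder hostname):
# print(supports_starttls("smtp.example.com"))
```

When both sides support STARTTLS, a message can be encrypted between servers; when either side lacks it, the message travels in the clear, which is the “postcard” case the report describes.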

And going back to the report’s original mission—government requests—we’re constantly pushing for more complete and accurate data. The results of this effort are visible directly in the report. In December 2016, for example, after a years-long effort, we were able to share National Security Letters with the public for the first time.

Over the years, the Transparency Report has sparked new conversations about transparency, accountability and the role of governments and companies in the flow of information online. Our hope is that with these changes, we can start a few more.