Category Archives: Public Policy Blog

Google’s views on government, policy and politics

Defending access to lawful information at Europe’s highest court

Under the right to be forgotten, Europeans can ask for information about themselves to be removed from search results for their name if it is outdated or irrelevant. From the outset, we have publicly stated our concerns about the ruling, but we have still worked hard to comply—and to do so conscientiously and in consultation with Data Protection Authorities. To date, we’ve handled requests to delist nearly 2 million search results in Europe, removing more than 800,000 of them. We have also taken great care not to erase results that are clearly in the public interest, as the European Court of Justice directed. Most Data Protection Authorities have concluded that this approach strikes the right balance.


But two right to be forgotten cases now in front of the European Court of Justice threaten that balance.


In the first case, four individuals—who we can’t name—present an apparently simple argument: European law protects sensitive personal data; sensitive personal data includes information about your political beliefs or your criminal record; so all mentions of criminality or political affiliation should automatically be purged from search results, without any consideration of public interest.


If the Court accepted this argument, it would give carte blanche to people who might wish to use privacy laws to hide information of public interest—like a politician’s political views, or a public figure’s criminal record. This would effectively erase the public’s right to know important information about people who represent them in society or provide them services.


In the second case, the Court must decide whether Google should enforce the right to be forgotten not just in Europe, but in every country around the world. We—and a wide range of human rights and media organizations, and others, like Wikimedia—believe that this runs contrary to the basic principles of international law: no one country should be able to impose its rules on the citizens of another country, especially when it comes to linking to lawful content. Adopting such a rule would encourage other countries, including less democratic regimes, to try to impose their values on citizens in the rest of the world.


We’re speaking out because restricting access to lawful and valuable information is contrary to our mission as a company and keeps us from delivering the comprehensive search service that people expect of us.


But the threat is much greater than this. These cases represent a serious assault on the public’s right to access lawful information.


We will argue in court for a reasonable interpretation of the right to be forgotten and for the ability of countries around the world to set their own laws, not have those of others imposed on them. Until November 20, European countries and institutions have the chance to make their views known to the Court. And we encourage everyone who cares about public access to information to stand up and fight to preserve it.

Security and disinformation in the U.S. 2016 election

We’ve seen many types of efforts to abuse Google’s services over the years. And, like other internet platforms, we have found some evidence of efforts to misuse our platforms during the 2016 U.S. election by actors linked to the Internet Research Agency in Russia. 

Preventing the misuse of our platforms is something that we take very seriously; it’s a major focus for our teams. We’re committed to finding a way to stop this type of abuse, and to working closely with governments, law enforcement, other companies, and leading NGOs to promote electoral integrity and user security, and combat misinformation. 

We have been conducting a thorough investigation related to the U.S. election across our products, drawing on the work of our information security team, research into misinformation campaigns from our teams, and leads provided by other companies. Today, we are sharing results from that investigation. While we have found only limited activity on our services, we will continue to work to prevent all of it, because no amount of interference is acceptable.

We will be launching several new initiatives to provide more transparency and enhance security, which we also detail in these information sheets: what we found, steps against phishing and hacking, and our work going forward.

Our work doesn’t stop here, and we’ll continue to investigate as new information comes to light. Improving transparency is a good start, but we must also address new and evolving threat vectors for misinformation and attacks on future elections. We will continue to do our best to help people find valuable and useful information, an essential foundation for an informed citizenry and a robust democratic process.

Towards a future of work that works for everyone

The future of work concerns us all. Our grandchildren will have jobs that don’t yet exist, and will live lives we cannot imagine. In Europe, getting the future of work right for individuals, societies and industries means having an open debate about the possibilities right now. We want to be a part of that discussion, and help contribute to a future of work that works for everyone. So last week in Stockholm and The Hague we brought together a range of leading international experts from academia, trade unions, public sector and businesses to discuss the impact of technology on jobs. We also asked McKinsey for a report on the impact of automation on work, jobs and skills.


As advances in machine learning and robotics make headlines, there’s a heated debate about whether innovation is a magic fix for an aging workforce, or a fast track to mass unemployment. Data can illuminate that debate, and McKinsey focused their research on the Nordics, Benelux, Ireland and Estonia—a diverse group that has at least one thing in common: They’re Europe’s digital frontrunners. The report from McKinsey shows us that while automation will impact existing jobs, innovation and adopting new technology can increase the total number of jobs available.


The report makes it very clear that divergent paths are possible. To make a success of the digital transition, countries should promote adoption of new technologies and double down on skills training and education. We want to play our part here. One example of how we contribute is our program Digitalakademin in Sweden: So far, we’ve trained more than 20,000 people in small- and medium-sized businesses in digital skills. And together with the Swedish National Employment Agency we’ve developed training to help unemployed people get the skills necessary for the jobs of the future.

As Erik Sandström from the National Employment Agency stressed at our event in Stockholm, it “all starts with digital competence—if you’re lacking in digital competence you will overestimate the risks and underestimate the opportunities.” That sentiment was echoed in a keynote by Ylva Johansson, the Swedish Minister for Employment and Integration: “Why do we have an attitude where unions, employees are positively accepting ongoing changes? Because we’ve been able to protect people and to present new opportunities through reskilling.”

For our event in The Hague we partnered with Dutch company Randstad to discuss the same topic: the future of work. Their CEO, Jacques van den Broek, struck an optimistic tone: “The digital transformation is an opportunity, not a threat,” he said. “The lesson we’ve learned is that whilst some jobs disappear, tech creates jobs. The longer you wait to embrace that change, the longer it takes to be able to compete.”


The coming changes will likely affect a wide range of tasks and jobs. “In Denmark, we discussed the destruction of jobs,” Thomas Søby from the Danish Steelworkers Union said. “New ones are created,” he added. “But some people will lose their jobs and feel left behind, and as a society we need to take care of those people.”


Those new jobs aren’t simply replacements—they’re roles we don’t have yet. “In a few years something else will be hot,” said Aart-Jan de Geus of Bertelsmann Stiftung, a German private foundation which looks at managing future challenges. He stressed that fears about job losses shouldn’t be overstated, especially as consumer demand and spending won’t go away. “The big mistake would be to try to protect jobs; we need to protect workers.”


In The Hague, Eric Schmidt, Alphabet’s executive chairman, ended on a positive note, saying that anxiety about change was understandable but that society can make sure the digital transition includes everyone. “Incumbents resist change. This is not new and in fact we have seen it throughout every stage of history,” he said. “But if history has taught us anything, it is that when disruptors and pioneers are right, society always recalibrates.”

Updating our Transparency Report and electronic privacy laws

Today, we are releasing the latest version of our Transparency Report concerning government requests for user data. This includes government requests for user data in criminal cases, as well as national security matters under U.S. law. Google fought for the right to publish this information in court and before Congress, and we continue to believe that this type of transparency can inform the broader debate about the nature and scope of government surveillance laws and programs.


In the first half of 2017, worldwide, we received 48,941 government requests that relate to 83,345 accounts. You can see more detailed figures, including a country-by-country breakdown of requests, here. We’ve also posted updated figures for the number of users/accounts impacted by Foreign Intelligence Surveillance Act (FISA) requests for content in previous reporting periods. While the total number of FISA content requests was reported accurately, we inadvertently under-reported the user/account figures in some reporting periods and over-reported the user/account figures in the second half of 2010. The corrected figures are in the latest report and reflected on our visible changes page.


Updating Electronic Privacy Laws


We are publishing the latest update to our Transparency Report as the U.S. Congress embarks upon an important debate concerning the nature and scope of key FISA provisions. Section 702 of the FISA Amendments Act of 2008 expires at the end of 2017. This is the section of FISA that authorizes the U.S. government to compel service providers like Google to disclose user data (including communications content) about non-U.S. persons in order to acquire “foreign intelligence information.”


Earlier this year, we expressed support for specific reforms to Section 702. We continue to believe that Congress can enact reforms to Section 702 in a way that enhances privacy protection for internet users while protecting national security. Independent bodies have concluded that Section 702 is valuable and effective in protecting national security and producing useful foreign intelligence. These assessments, however, do not preclude reforms that improve privacy protections for U.S. and non-U.S. persons and that do not disturb the core purposes of Section 702.


Government access laws are due for a fundamental realignment and update in light of the proliferation of technology, the very real security threats to people, and the expectations of privacy that Internet users have in their communications. Our General Counsel, Kent Walker, delivered a speech earlier this year calling for a new framework to address cross-border law enforcement requests. Updates to the Electronic Communications Privacy Act (ECPA) will be necessary to create a legal framework that addresses both law enforcement and civil liberties concerns.


The recent introduction of the International Communications Privacy Act (ICPA) in the Senate and the House is a significant step in the right direction, and we applaud Senators Hatch, Coons, and Heller and Representatives Collins, Jeffries, Issa, and DelBene for their leadership on this important bill. ECPA should also be updated to enable countries that commit to baseline privacy, due process, and human rights principles to make direct requests to U.S. providers. Providing a pathway for such countries to obtain electronic evidence directly from service providers in other jurisdictions will remove incentives for the unilateral, extraterritorial assertion of a country’s laws, data localization proposals, aggressive expansion of government access authorities, and dangerous investigative techniques. These measures ultimately weaken privacy, due process, and human rights standards.


We look forward to continuing the constructive discussion about these issues.


Working together to combat terrorists online

Editor’s note: This is a revised and abbreviated version of a speech Kent delivered today at the United Nations in New York City, on behalf of the members of the Global Internet Forum to Counter Terrorism.

The Global Internet Forum to Counter Terrorism is a group of four technology companies—Facebook, Microsoft, Twitter, and YouTube—that are committed to working together and with governments and civil society to address the problem of online terrorist content.

For our companies, terrorism isn’t just a business concern or a technical challenge. These are deeply personal threats. We are citizens of London, Paris, Jakarta, and New York. And in the wake of each terrorist attack we too frantically check in on our families and co-workers to make sure they are safe. We’ve all had to do this far too often.

The products that our companies build lower barriers to innovation and empower billions of people around the world. But we recognize that the internet and other tools have also been abused by terrorists in their efforts to recruit, fundraise, and organize. And we are committed to doing everything in our power to ensure that our platforms aren't used to distribute terrorist material.

The Forum’s efforts are focused on three areas: leveraging technology, conducting research on patterns of radicalization and misuse of online platforms, and sharing best practices to accelerate our joint efforts against dangerous radicalization. Let me say more about each pillar.

First, when it comes to technology, you should know that our companies are putting our best talent and technology against the task of getting terrorist content off our services. There is no silver bullet when it comes to finding and removing this content, but we’re getting much better.

One early success in collaboration has been our “hash sharing” database, which allows a company that discovers terrorist content on one of their sites to create a digital fingerprint and share it with the other companies in the coalition, who can then more easily detect and review similar content for removal.  
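The hash-sharing workflow can be sketched in a few lines of code. This is a simplified illustration only: the coalition’s real systems use perceptual hashes that survive re-encoding, cropping, and other edits, whereas the cryptographic hash below matches only byte-identical files, and all function names here are hypothetical.

```python
import hashlib

# Shared database of fingerprints of known terrorist content.
# In practice this would be a service shared across companies;
# a plain set stands in for it here.
shared_hashes = set()

def fingerprint(content: bytes) -> str:
    """Create a digital fingerprint of a piece of content.

    Real systems use perceptual hashing so that re-encoded or lightly
    edited copies still match; SHA-256 is used here only to illustrate
    the workflow.
    """
    return hashlib.sha256(content).hexdigest()

def report_content(content: bytes) -> None:
    """One company flags content and shares its fingerprint with the coalition."""
    shared_hashes.add(fingerprint(content))

def needs_review(content: bytes) -> bool:
    """Another company checks an upload against the shared database.

    A match does not trigger automatic removal; it queues the content
    for human review, since context (e.g., news coverage) matters.
    """
    return fingerprint(content) in shared_hashes

report_content(b"flagged video bytes")
print(needs_review(b"flagged video bytes"))  # True -> queue for human review
print(needs_review(b"unrelated upload"))     # False
```

The key design point is that companies share only fingerprints, not the content itself, so each participant can detect known material without redistributing it.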

We have to deal with these problems at tremendous scale. The haystacks are unimaginably large and the needles are both very small and constantly changing. People upload over 400 hours of content to YouTube every minute. Our software engineers have spent years developing technology that can spot certain telltale cues and markers. In recent months we have more than doubled the number of videos we've removed for violent extremism and have located these videos twice as fast. And what’s more, 75 percent of the violent extremism videos we’ve removed in recent months were found using technology before they received a single human flag.

These efforts are working. Between August 2015 and June 2017, Twitter suspended more than 935,000 accounts for the promotion of terrorism. During the first half of 2017, over 95 percent of the accounts it removed were detected using its in-house technology. Facebook is using new advances in artificial intelligence to root out "terrorist clusters" by mapping out the pages, posts, and profiles with terrorist material and then shutting them down.

Despite this recent progress, machines are simply not at the stage where they can replace human judgment. For example, portions of a terrorist video in a news broadcast might be entirely legitimate, but a computer program will have difficulty distinguishing documentary coverage from incitement.  

The Forum’s second pillar is focused on conducting and sharing research about how terrorists use the internet to influence their audiences so that we can stay one step ahead.

Today, the members of the Forum are pleased to announce that we are making a multi-million dollar commitment to support research on terrorist abuse of the internet and how governments, tech companies, and civil society can fight back against online radicalization.

The Forum has also set a goal of working with 50 smaller tech companies to help them better tackle terrorist content on their platforms. On Monday, we hosted dozens of companies for a workshop with our partners under the UN Counter Terrorism Executive Directorate. There will be a workshop in Brussels in December and another in Indonesia in the coming months. And we are also working to expand the hash-sharing database to smaller companies.

The Forum’s final pillar is working together to find powerful messages and avenues to reach out to those at greatest risk of radicalization.

Members of the forum are doing a better job of sharing breakthroughs with each other. One success we’ve seen is with the Redirect Method developed at Alphabet’s Jigsaw group. Redirect uses targeted advertising to reach people searching for terrorist content and presents videos that undermine extremist recruiting efforts. During a recent eight-week study, more than 300,000 users clicked on our targeted ads and watched more than 500,000 minutes of video. This past April, Microsoft started a similar program on Bing. And Jigsaw and Bing are now exploring a partnership to share best practices and expertise.

At the same time, we’re elevating the voices that are most credible in speaking out against terrorism, hate, and violence. YouTube’s Creators for Change program highlights online stars taking a stand against xenophobia and extremism. And Facebook's P2P program has brought together more than 5,000 students from 68 countries to create campaigns to combat hate speech. Together, the companies have participated in hundreds of meetings and trainings to counter violent extremism, including events in Beirut, Bosnia, and Brussels and summits at the White House, here at the United Nations, and in London and Sydney, to empower credible non-governmental voices against violent extremism.

There is no magic computer program that will eliminate online terrorist content, but we are committed to working with everyone in this room as we continue to ramp up our own efforts to stop terrorists’ abuse of our services. This forum is an important step in the right direction. We look forward to working with national and local governments, and civil society, to prevent extremist ideology from spreading in communities and online.

Google’s fight against human trafficking

Google has made it a priority to tackle the heinous crime of sex trafficking. I know, because I’ve worked on this from the day I joined in 2012. We have hired and funded advocates in this area. We have developed and built extensive technology to connect victims with the resources they need. And we have helped pioneer the use of technologies that identify trafficking networks to make it easier and quicker for law enforcement to arrest these abusers. You can read about these efforts here. We’ve supported over 40 bills on human trafficking. And we are determined to do more to stop this evil, including support for tougher legislation. 

There is currently a debate over a proposed bill to combat sex trafficking by amending section 230 of the Communications Decency Act. While we agree with the intentions of the bill, we are concerned that it erodes the “Good Samaritan” protection and would actually hinder the fight against sex trafficking. While large companies are more likely to continue their proactive enforcement efforts and can afford to fight lawsuits, if smaller platforms are made liable for “knowledge” of human trafficking occurring on their platforms, there is a risk that some will seek to avoid that “knowledge”; they will simply stop looking for it. This would be a disaster. We think it’s much better to foster an environment in which all technology companies can continue to clean their platforms and support effective tools for law enforcement and advocacy organizations to find and disrupt these networks. We’ve met with the sponsors of the particular bill and provided alternatives that will encourage this environment, and we’ll continue to seek a constructive approach to advance a shared goal.

We’re not alone in this view. Organizations as broad and diverse as Engine Advocacy, PEN America, Charles Koch Institute, Heritage Action, ACLU, U.S. Chamber Technology Engagement Center, Business Software Alliance, Internet Commerce Coalition, Internet Association (whose members include Microsoft, Twitter, Facebook, Amazon, Snap, Match.com, Pinterest, etc.), TechFreedom, Medium, GitHub, Reddit, Wikimedia, National Venture Capital Association and many others have raised concerns about the bill. We—and many others—stand ready to work with Congress on changes to the bill, and on other legislation and measures to fight human trafficking and protect and support victims and survivors.

A lot of the discussion around this issue focuses on the role of a website called Backpage.com. I want to make our position on this clear. Google believes that Backpage.com can and should be held accountable for its crimes. We strongly applaud the work of the Senate Permanent Subcommittee on Investigations in exposing Backpage's intentional promotion of child sex trafficking through ads. Based on those findings, Google believes that Backpage.com should be criminally prosecuted by the US Department of Justice for facilitating child sex trafficking, something it can do today without the need to amend any laws. And years before the Senate's investigation and report, we prohibited Backpage from advertising on Google, and we have criticized Backpage publicly.

I understand that when important legislation is being discussed, public debate is robust. That’s how it should be. But on the crucial issue of sex trafficking, we’ve been a deeply committed partner in the fight. Let’s not let a genuine disagreement over the likely unintended impact of a particular piece of legislation obscure that fact.


A Significant Step Toward Modernizing Our Surveillance Laws

Last month, our General Counsel Kent Walker delivered a speech calling for a fundamental realignment of government access statutes in light of the growing role that technology plays in our daily lives, the expectation that communications should remain private, and the very real security threats that governments need to investigate.

In conjunction with the speech, we proposed a new framework oriented toward policy solutions that recognize legitimate law enforcement interests, respect the sovereignty of other countries, and reflect the reasonable expectation of privacy that users have in the content of their electronic communications.

The introduction of the International Communications Privacy Act (ICPA) by Senators Hatch, Coons, and Heller advances these objectives, and we commend these Senators for their leadership in this area.

ICPA would update the Electronic Communications Privacy Act (ECPA) in two important ways.

First, it would require U.S. government entities to obtain a warrant to compel the production of communications content from providers. For many years, we have called upon the U.S. Congress to update ECPA in this manner, and the House of Representatives has twice passed legislation (the Email Privacy Act) that would achieve this goal.

Second, it provides clear mechanisms for the U.S. government to obtain user data from service providers with a warrant, wherever the data may be stored, but with protections built in for certain cases when the users are nationals of other countries and are located outside the U.S.

We are eager to work with Members of Congress to enact ICPA into law, and look forward to the opportunity to help advance this important bill.


Applications now open for the Google Policy Fellowship in Europe and Africa

Are you an undergraduate, graduate or law student interested in internet and technology policy? Do you want to get involved in the public dialogue on these issues? If so, the new Google Policy Fellowship pilot programs in Italy, Belgium (Brussels), and three African countries may be for you.  

Successful applicants to the program will have the opportunity to work at public interest organizations at the forefront of debates on internet policy issues. They will be assigned a mentor at their host organizations and will have the opportunity to work with senior staff members.

Fellows will be expected to make substantive contributions to the work of their organization, including conducting policy research and analysis, drafting reports and white papers, attending government and industry meetings and conferences, and participating in other advocacy activities.

The work of the fellows is decided between the individuals and the organizations. Google provides a small stipend during the period of the fellowship, but has no involvement in defining or conducting the research. Typically, the fellows are postgraduates and they work with the organization on an area of research or study.

For example, in previous years, a fellow with the Strathmore Law School in Nairobi, Kenya, carried out a review of cyber-security conventions around the world, and a fellow at the Ghana-India Kofi Annan Centre of Excellence in ICT in Ghana helped to establish the Creative Commons chapter for Ghana before returning to university to finish her Ph.D. All work is carried out independently of Google.

Who should apply?

The organizations in the program are looking for students who are passionate about technology and want to gain experience working on public policy. Students from all majors and degree programs who possess the following qualities are encouraged to apply:

  • Demonstrated or stated interest in Internet and technology policy
  • Excellent academic record, professional/extracurricular/volunteer activities, subject matter expertise
  • First-rate analytical, communications, research, and writing skills
  • Ability to manage multiple projects simultaneously and efficiently, and to work smartly and resourcefully in a fast-paced environment

Brussels pilot

We are pleased to offer three fellowships, starting in September 2017, at the organizations listed below. These placements will run for six months and the stipend will vary slightly from organization to organization. To apply, please use the link below and send a short email, together with a CV. Deadline for applications is July 31, 2017.

Italy pilot

We’re pleased to offer six fellowships, starting in October 2017, and lasting up to six months, at the organizations listed below. To apply, please send a short email to the address below, together with a CV. Deadline for applications is August 27, 2017.

Africa program

We’re pleased to offer eight fellowships, starting from late August 2017, across Sub-Saharan Africa. The program will run for six to twelve months, with exact duration varying by organization. Detailed job descriptions can be viewed here. To apply, please complete the form at 2017 Africa Google Policy Fellowship Application. Deadline for applications is August 5, 2017. Below is a list of organizations and locations for the fellowships.