Tag Archives: Public Policy

Defending access to lawful information at Europe’s highest court

Under the right to be forgotten, Europeans can ask for information about themselves to be removed from search results for their name if it is outdated or irrelevant. From the outset, we have publicly stated our concerns about the ruling, but we have still worked hard to comply—and to do so conscientiously and in consultation with Data Protection Authorities. To date, we’ve handled requests to delist nearly 2 million search results in Europe, removing more than 800,000 of them. We have also taken great care not to erase results that are clearly in the public interest, as the European Court of Justice directed. Most Data Protection Authorities have concluded that this approach strikes the right balance.


But two right to be forgotten cases now in front of the European Court of Justice threaten that balance.


In the first case, four individuals—who we can’t name—present an apparently simple argument: European law protects sensitive personal data; sensitive personal data includes information about your political beliefs or your criminal record; so all mentions of criminality or political affiliation should automatically be purged from search results, without any consideration of public interest.


If the Court accepted this argument, it would give carte blanche to people who might wish to use privacy laws to hide information of public interest—like a politician’s political views, or a public figure’s criminal record. This would effectively erase the public’s right to know important information about people who represent them in society or provide them services.


In the second case, the Court must decide whether Google should enforce the right to be forgotten not just in Europe, but in every country around the world. We—and a wide range of human rights and media organizations, and others, like Wikimedia—believe that this runs contrary to the basic principles of international law: no one country should be able to impose its rules on the citizens of another country, especially when it comes to linking to lawful content. Adopting such a rule would encourage other countries, including less democratic regimes, to try to impose their values on citizens in the rest of the world.


We’re speaking out because restricting access to lawful and valuable information is contrary to our mission as a company and keeps us from delivering the comprehensive search service that people expect of us.


But the threat is much greater than this. These cases represent a serious assault on the public’s right to access lawful information.


We will argue in court for a reasonable interpretation of the right to be forgotten and for the ability of countries around the world to set their own laws, not have those of others imposed on them. Until November 20, European countries and institutions have the chance to make their views known to the Court. And we encourage everyone who cares about public access to information to stand up and fight to preserve it.

Security and disinformation in the U.S. 2016 election

We’ve seen many types of efforts to abuse Google’s services over the years. And, like other internet platforms, we have found some evidence of efforts to misuse our platforms during the 2016 U.S. election by actors linked to the Internet Research Agency in Russia. 

Preventing the misuse of our platforms is something that we take very seriously; it’s a major focus for our teams. We’re committed to finding a way to stop this type of abuse, and to working closely with governments, law enforcement, other companies, and leading NGOs to promote electoral integrity and user security, and combat misinformation. 

We have been conducting a thorough investigation related to the U.S. election across our products, drawing on the work of our information security team, research into misinformation campaigns from our teams, and leads provided by other companies. Today, we are sharing results from that investigation. While we have found only limited activity on our services, we will continue to work to prevent all of it, because no amount of interference is acceptable.

We will be launching several new initiatives to provide more transparency and enhance security, which we also detail in these information sheets: what we found, steps against phishing and hacking, and our work going forward.

Our work doesn’t stop here, and we’ll continue to investigate as new information comes to light. Improving transparency is a good start, but we must also address new and evolving threat vectors for misinformation and attacks on future elections. We will continue to do our best to help people find valuable and useful information, an essential foundation for an informed citizenry and a robust democratic process.

Towards a future of work that works for everyone

The future of work concerns us all. Our grandchildren will have jobs that don’t yet exist, and will live lives we cannot imagine. In Europe, getting the future of work right for individuals, societies and industries means having an open debate about the possibilities right now. We want to be a part of that discussion, and help contribute to a future of work that works for everyone. So last week in Stockholm and The Hague we brought together a range of leading international experts from academia, trade unions, the public sector and businesses to discuss the impact of technology on jobs. We also asked McKinsey for a report on the impact of automation on work, jobs and skills.


As advances in machine learning and robotics make headlines, there’s a heated debate about whether innovation is a magic fix for an aging workforce or a fast track to mass unemployment. Data can illuminate that debate, and McKinsey focused their research on the Nordics, Benelux, Ireland and Estonia—a diverse group that has at least one thing in common: they’re Europe’s digital frontrunners. The report from McKinsey shows us that while automation will impact existing jobs, innovation and adopting new technology can increase the total number of jobs available.


The report makes it very clear that divergent paths are possible. To make a success of the digital transition, countries should promote adoption of new technologies and double down on skills training and education. We want to play our part here. One example of how we contribute is our program Digitalakademin in Sweden: So far, we’ve trained more than 20,000 people in small- and medium-sized businesses in digital skills. And together with the Swedish National Employment Agency we’ve developed training to help unemployed people get the skills necessary for the jobs of the future.

As Erik Sandström from the National Employment Agency stressed at our event in Stockholm, it “all starts with digital competence—if you’re lacking in digital competence you will overestimate the risks and underestimate the opportunities.” That sentiment was echoed in a keynote by Ylva Johansson, the Swedish Minister for Employment and Integration: “Why do we have an attitude where unions, employees are positively accepting ongoing changes? Because we’ve been able to protect people and to present new opportunities through reskilling.”

For our event in The Hague we partnered with Dutch company Randstad to discuss the same topic: the future of work. Their CEO, Jacques van den Broek, struck an optimistic tone: “The digital transformation is an opportunity, not a threat,” he said. “The lesson we’ve learned is that whilst some jobs disappear, tech creates jobs. The longer you wait to embrace that change, the longer it takes to be able to compete.”


The coming changes will likely affect a wide range of tasks and jobs. “In Denmark, we discussed the destruction of jobs,” Thomas Søby from the Danish Steelworkers Union said. “New ones are created,” he added. “But some people will lose their jobs and feel left behind, and as a society we need to take care of those people.”


Those new jobs aren’t simply replacements—they’re roles we don’t have yet. “In a few years something else will be hot,” said Aart-Jan de Geus of Bertelsmann Stiftung, a German private foundation which looks at managing future challenges. He stressed that fears about job losses shouldn’t be overstated, especially as consumer demand and spending won’t go away. “The big mistake would be to try to protect jobs; we need to protect workers.”


In The Hague, Eric Schmidt, Alphabet’s executive chairman, ended on a positive note, saying that anxiety about change was understandable but that society can make sure the digital transition includes everyone. “Incumbents resist change. This is not new and in fact we have seen it throughout every stage of history,” he said. “But if history has taught us anything, it is that when disruptors and pioneers are right, society always recalibrates.”

Updating our Transparency Report and electronic privacy laws

Today, we are releasing the latest version of our Transparency Report concerning government requests for user data. This includes government requests for user data in criminal cases, as well as national security matters under U.S. law. Google fought for the right to publish this information in court and before Congress, and we continue to believe that this type of transparency can inform the broader debate about the nature and scope of government surveillance laws and programs.


In the first half of 2017, worldwide, we received 48,941 government requests that relate to 83,345 accounts. You can see more detailed figures, including a country-by-country breakdown of requests, here. We’ve also posted updated figures for the number of users/accounts impacted by Foreign Intelligence Surveillance Act (FISA) requests for content in previous reporting periods. While the total number of FISA content requests was reported accurately, we inadvertently under-reported the user/account figures in some reporting periods and over-reported the user/account figures in the second half of 2010. The corrected figures are in the latest report and reflected on our visible changes page.


Updating Electronic Privacy Laws


We are publishing the latest update to our Transparency Report as the U.S. Congress embarks upon an important debate concerning the nature and scope of key FISA provisions. Section 702 of the FISA Amendments Act of 2008 expires at the end of 2017. This is the section of FISA that authorizes the U.S. government to compel service providers like Google to disclose user data (including communications content) about non-U.S. persons in order to acquire “foreign intelligence information.”


Earlier this year, we expressed support for specific reforms to Section 702. We continue to believe that Congress can enact reforms to Section 702 in a way that enhances privacy protection for internet users while protecting national security. Independent bodies have concluded that Section 702 is valuable and effective in protecting national security and producing useful foreign intelligence. These assessments, however, do not preclude reforms that improve privacy protections for U.S. and non-U.S. persons and that do not disturb the core purposes of Section 702.


Government access laws are due for a fundamental realignment and update in light of the proliferation of technology, the very real security threats to people, and the expectations of privacy that Internet users have in their communications. Our General Counsel, Kent Walker, delivered a speech earlier this year calling for a new framework to address cross-border law enforcement requests. Updates to the Electronic Communications Privacy Act (ECPA) will be necessary to create a legal framework that addresses both law enforcement and civil liberties concerns.


The recent introduction of the International Communications Privacy Act (ICPA) in the Senate and the House is a significant step in the right direction, and we applaud Senators Hatch, Coons, and Heller and Representatives Collins, Jeffries, Issa, and DelBene for their leadership on this important bill. ECPA should also be updated to enable countries that commit to baseline privacy, due process, and human rights principles to make direct requests to U.S. providers. Providing a pathway for such countries to obtain electronic evidence directly from service providers in other jurisdictions will remove incentives for the unilateral, extraterritorial assertion of a country’s laws, data localization proposals, aggressive expansion of government access authorities, and dangerous investigative techniques. These measures ultimately weaken privacy, due process, and human rights standards.


We look forward to continuing the constructive discussion about these issues.

