Tag Archives: Public Policy

Applications are open for the Google North America Public Policy Fellowship

Starting today, we’re accepting applications for the 2019 North America Google Policy Fellowship. Our fellowship gives undergraduate and graduate students a paid opportunity to spend 10 weeks diving headfirst into internet policy at leading nonprofits, think tanks and advocacy groups. In addition to opportunities in Washington, D.C. and California, we’ve expanded our program to include academic institutions and advocacy groups in New York and Utah, where students will have the chance to be at the forefront of debates on internet freedom and economic opportunity. We’re looking for students from all majors and degree programs who are passionate about technology and want to gain hands-on experience exploring important issues at the intersection of technology and policy.

The application period opens today for the North America region, and all applications must be received by 12:00 p.m. ET / 9:00 a.m. PT on Friday, February 15. This year's program will run from early June through early August, with regular programming throughout the summer. More specific information, including a list of this year’s hosts and locations, can be found on our site.

You can learn about the program, application process and host organizations on the Google Public Policy Fellowship website.

Principles for evolving technology policy in 2019

The past year has seen a range of public debates about the roles and responsibilities of technology companies. As 2019 begins, I’d like to share my thoughts on these important discussions and why Google supports smart regulation and other ways to address emerging issues.

We’ve always been and still are fundamentally optimistic about the power of innovative technology. We’re proud that Google’s products and services empower billions of people, drive economic growth and offer important tools for your everyday life. This takes many forms, whether it’s instant access to the world’s information, an infinite gallery of sortable photos, tools that let you share documents and calendars with friends, directions that help you avoid traffic jams, or whatever Google tool you find most helpful.

But this optimism doesn’t obscure the challenges we face—including those posed by misuse of new technologies. New tools inevitably affect not just the people and businesses who use them, but also cultures, economies and societies as a whole. We’ve come a long way from our days as a scrappy startup, and with billions of people using our services every day, we recognize the need to confront tough issues regarding technology's impacts.

The scrutiny of lawmakers and others often improves our products and the policies that govern them. It’s sometimes claimed that the internet is an unregulated “wild west,” but that's not the case. Many laws and regulations have contributed to the internet’s vitality: competition and consumer protection laws, advertising regulations, and copyright, to name just a few. Existing legal frameworks reflect trade-offs that help everyone reap the benefits of modern technologies, minimize social costs, and respect fundamental rights. As technology evolves, we need to stay attuned to how best to improve those rules.

In some cases, laws do need updates, as we laid out in our recent post on data protection and our proposal regarding law enforcement access to data. In other cases, collaboration among industry, government, and civil society may lead to complementary approaches, like joint industry efforts to fight online terrorist content, child sexual abuse material and copyright piracy. Shared concerns can also lead to ways to empower people with new tools and choices, like helping people control and move their data—that’s why we have been a leader since 2007 in developing data portability tools and last year helped launch the cross-company Data Transfer Project.

We don’t see smart regulation as a singular end state; it must develop and evolve. In an era (and a sector) of rapid change, one-size-fits-all solutions are unlikely to work out well. Instead, it's important to start with a focus on a specific problem and seek well-tailored and well-informed solutions, thinking through the benefits, the second-order impacts, and the potential for unintended side effects.

Efforts to address illegal and harmful online content illustrate how tech companies can play a supportive role in this process:

  • First, to support constructive transparency, we launched our Transparency Report more than eight years ago, and we have continued to extend our transparency efforts over time, most recently with YouTube’s Community Guidelines enforcement report.

  • Second, to cultivate best practices for responsible content removals, we’ve supported initiatives like the Global Internet Forum to Counter Terrorism, where tech companies, governments and civil society have worked together to stop exploitation of online services.

  • Finally, we have participated in government-overseen systems of accountability. For instance, the EU’s Hate Speech Code of Conduct includes an audit process to monitor how platforms are meeting their commitments. And in the recent EU Code of Practice on Disinformation, we agreed to help researchers study this topic and to regularly report on and assess our next steps in this fight.

While the world is no longer at the start of the Information Revolution, the most important and exciting chapters are still to come. Google has pioneered a number of new artificial intelligence (AI) tools, and published a set of principles to guide our work and inform the larger public debate about the use of these remarkable technologies. We’ll have more to say about issues in AI governance in the coming weeks. Of course, every new breakthrough will raise its own set of new issues—and we look forward to hearing from others and sharing our own thoughts and ideas.

To stop terror content online, tech companies need to work together

Wherever we live, whatever our background, we’ve all seen the pain caused by senseless acts of terrorism. Just last week, the tragic murder of Christmas shoppers in Strasbourg was a sobering reminder that terrorist attacks can happen at any time.

What is clear from such attacks is that we all—government, industry, and civil society—have to remain vigilant and work together to address this continuing threat. While governments and civil society groups face a complex challenge in deterring terrorist violence, collaboration across the industry to responsibly address terrorist content online is delivering progress. And more tech companies must join the fight against terrorist content online.

In June 2017, senior representatives from Facebook, Microsoft, Twitter and YouTube came together to form the Global Internet Forum to Counter Terrorism (GIFCT), a coalition to share information on how best to curb the spread of terrorism online. I’ve had the responsibility of chairing this Forum for its initial year and a half, and I’m pleased to report that the Forum has helped deliver significant results across a number of areas.

In September 2017 at the United Nations General Assembly, I joined the leaders of the United Kingdom, France, and Italy to discuss what more the tech industry could do to combat terrorist content. I was there on behalf of the GIFCT member companies to present our commitments to tackle terrorism online: We collectively pledged to develop and share technology to responsibly address terrorist content across the industry; to fund research and share good practices that help all companies stay abreast of the latest trends; and to elevate positive counter messages.  

We understand that we must responsibly lead the way in developing new technologies and standards for identifying and removing harmful terrorist content. As EU Commissioner Avramopoulos said: “The tools you are developing yourselves on your platforms are the most effective counter-measures we all have. That is why I am a strong supporter of your efforts under the Global Internet Forum to Counter Terrorism.” A key pillar of GIFCT’s work to drive progress is maintaining a shared database of digital fingerprints (hashes) of known terrorist content that lets any member of the coalition automatically find and remove identical terrorist content on their platforms. In 2018, we set—and achieved—an ambitious goal of depositing 100,000 new hashes in the database.
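The hash-sharing mechanism described above can be sketched in a few lines of Python. This is purely illustrative: the function and variable names are invented, and real systems of this kind typically use perceptual hashing to catch near-duplicate media, whereas the cryptographic hash here matches only byte-identical files.

```python
import hashlib

# A toy shared hash database: each member deposits digests of files it
# has already identified and removed. (Illustrative only -- production
# systems use perceptual hashes to catch near-duplicates, not just
# byte-identical copies.)
shared_hashes = set()

def fingerprint(content: bytes) -> str:
    """Compute a digest that serves as the file's fingerprint."""
    return hashlib.sha256(content).hexdigest()

def deposit(content: bytes) -> str:
    """A member adds a known bad file's hash to the shared database."""
    digest = fingerprint(content)
    shared_hashes.add(digest)
    return digest

def is_known(content: bytes) -> bool:
    """Any member checks an upload against the shared database."""
    return fingerprint(content) in shared_hashes

# One member deposits a fingerprint; another platform can now flag an
# identical upload automatically, without ever seeing the first copy.
deposit(b"example flagged video bytes")
print(is_known(b"example flagged video bytes"))  # True
print(is_known(b"some other upload"))            # False
```

The point of the design is that members share only fingerprints, never the underlying content, so each platform can act on the collective knowledge independently.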

Over the past year and a half, we’ve also engaged smaller businesses around the world to discuss their unique needs and to share ways to responsibly address terrorist content online. With the UN’s counterterrorism directorate and the UN-initiated TechAgainstTerrorism program, we’ve worked with more than 100 tech companies on four continents. We also convened forums in Europe, the Asia Pacific region, and Silicon Valley for companies, civil society groups, and governments to share experiences and get suggestions for further efforts.

To enhance our understanding of the latest trends in online terrorist propaganda, GIFCT has been working with a research network led by the Royal United Services Institute. We are speaking with its network of eight think tanks around the world about how terrorist networks operate online, the ethics of content moderation, and the interplay between online content and offline actions. That network will publish ten academic papers over the next six months to benefit everyone working on the problem of terrorist content online.

We’ve also worked successfully alongside governments and Internet Referral Units, like Europol’s, to get terrorist content down even more quickly. With civil society organizations, we’ve developed a tool that will help them mount counter-extremism campaigns across many online platforms at once. And together with Google.org, we launched a $5 million innovation fund to counter hate and extremism. The fund gives grants to nonprofits that are countering hate, both online and off. Our £1 million pilot program in the UK received over 230 applications, and we awarded grants to 22 initiatives.

These are significant developments for the industry, but we know we have much more to do. The Forum will continue to expand our membership, vastly increase the size of our database of hashes, and do even more to help small companies and academic websites responsibly address terrorist content.

We can never be complacent against the continuing threat of terrorism. The work being done today by our coalition members has helped limit the use of our platforms by terrorist organizations, and we have extended an open invitation to others in the industry to join with us in this effort. Working together, we will continue to develop and implement solutions across the industry to protect our users, our societies, and a free and open internet.

A “First Step” towards criminal justice reform

For the first time in 22 years, Alice Johnson will be home for the holidays. Now a great-grandmother, Johnson was sentenced to life in prison without parole for a first-time, non-violent drug felony in 1997. She had spent over two decades behind bars when her story gained national attention—prompting President Trump to officially reduce Johnson’s sentence and send her home free earlier this year. Johnson’s case set off a long-overdue debate around the country about harsh sentencing laws and the need to reform our criminal justice system.

We’ve long supported efforts to end mass incarceration and help individuals like Johnson get a second chance. In 2017, we collaborated on a YouTube video in which Johnson urged the public from her prison cell to advocate for the release of those serving life sentences for nonviolent offenses. We later partnered with Mic.com to produce a digital op-ed, which caught the attention of Kim Kardashian West and inspired her to take up Johnson’s cause.

America’s thirty-year experiment with mandatory minimum sentences and sweeping criminalization has too often imposed unfair and disproportionate penalties on people across the country. As a former prosecutor, I have witnessed many individuals and families bear the consequences of these policies—policies that haven’t made us any safer, but have cost millions in taxpayer dollars and cast a pall over many lives.

This week, Congress—in a rare show of bipartisan consensus—passed the First Step Act, changing these policies and reforming our criminal justice system. The legislation lowers mandatory minimum sentences for drug felonies, reduces the disparity in sentencing guidelines between crack and powder cocaine offenses, and gives judges the discretion to shorten mandatory minimum sentences for low-level crimes. President Trump has already expressed support for the bill, and we look forward to him quickly signing it into law.

The Act marks an important step forward in restoring equal justice and due process, and promoting consistency and fairness in sentencing. Moreover, the Act includes measures that will bolster rehabilitation programs in prisons across the country to help incarcerated women and men successfully re-enter society, reduce recidivism rates, and make our communities safer.

Google.org has long backed these kinds of efforts to improve our criminal justice system. We’ve supported work by non-profits promoting reform and by police departments working to improve interactions with their communities. We have promoted the use of data to increase the transparency of our criminal justice system. And we have launched programs like our digital LoveLetters initiative, which supports children with imprisoned parents.  

While we’re encouraged by the passage of the First Step Act, there is still more work to be done at the federal, state, and local level to improve our criminal justice system. And we all have a part to play. As an example, our company policies seek to promote fair hiring by “banning the box” (requiring job applicants to disclose criminal history only once they get a chance to interview) and encouraging our suppliers to do the same. And we don’t accept ads for bail bonds, an industry with an unfortunate history of predatory practices.

We look forward to continuing to work with people from many backgrounds and across a spectrum of views, united in our belief that America’s legal and criminal justice systems can and should be an example to the world.

From soil to supper: How technology influences your dinner

What does artificial intelligence have to do with the dinner on your plate? It might seem like they’re unrelated, but technology like AI contributes to your everyday life, including the food you eat. Farmers around the world are using AI-infused apps like FARMWAVE and PlantVillage to diagnose and treat pests that might otherwise destroy an entire harvest.

We’re working to make it easier for anyone to manage and learn from data—like allowing developers to use machine learning with TensorFlow and models we’ve released on GitHub. Today, Google has over 2,000 open source projects. In the food space, our open source tools have supported the development of programs that sort cucumbers and track and monitor the health of dairy cows.

But we wanted to learn even more about how technology is influencing our food. That's why, over the last year, we’ve met with experts across the nation’s food supply chain to discuss and co-author the Refresh: Food + Tech, from Soil to Supper report. The report covers the important role technology plays in the production, distribution and consumption of our food. As we worked on the report, two major themes became clear.

Watch the full recording of the Refresh launch event.

Technological breakthroughs happen everywhere

We recently hosted a launch event in Chicago where we were joined by former Secretary of Agriculture Tom Vilsack, who worked at the USDA during a dynamic time for technology in agriculture. At the time, precision farming and other smart-farm tech were changing the agricultural landscape—but in many ways, these emerging technologies are a natural fit with life on the farm. Secretary Vilsack observed parallels between the farm and Silicon Valley, noting that innovation and risk-taking are inherent in the business of farming as well.


Former Secretary of Agriculture Tom Vilsack talks about technology in agriculture with Danielle Nierenberg of Food Tank

Examples of using technology on a farm run the gamut, from automated data analytics to wearables. For example, Refresh Working Group member Melissa Brandao created HerdDogg, a health tracking wearable for cows, to help dairy farmers monitor the day-to-day health of each individual cow in their herd. And working group member Amanda Ramcharan uses plant-spotting drones to evaluate soil health and predict weather patterns that help to inform standards for agricultural sustainability.

Diverse perspectives are crucial

For new technologies to improve the food supply chain, we have to make sure that a variety of communities and stakeholders are involved in the process from the start. If AI is meant to be a useful tool for food production, distribution and consumption, these systems must be designed to meet the day-to-day needs of farmers, retailers and food workers.


Panel discussion with me, representing Google; Don Bustos of Santa Cruz Farm; Danielle Nierenberg of Food Tank; and Ankita Raturi of the USDA’s Agricultural Research Service

For example, technology has to be reliable in order for it to be useful for smaller operators. Don Bustos of Santa Cruz Farm in Española, New Mexico was an early adopter of solar technologies for year-round food production. His community depends on his harvest as a critical food source, so he pointed out that any AI applications on his farm must be both affordable and reliable, or he risks a poor crop yield.

And it has to bring people with diverse perspectives together to work toward new solutions. Refresh Working Group member Craig Ganssle, who created the communication platform FARMWAVE, understands that community is critical to the success of any technology. This platform provides farmers with a real-time community of peers worldwide, so that they can pool their expertise and learn from one another as they make critical decisions on their farms.

We couldn’t have learned about the complexities of our nation’s food system without the participation of this engaged group of farmers, small business owners, researchers, nonprofit leaders and community organizers. Sign up for updates to join the conversation next year, and be the first to learn about our next event at SXSW 2019.

Protecting what we love about the internet: our efforts to stop online piracy

The internet has enabled people worldwide to connect, create and distribute new works of art like never before. A key part of preserving this creative economy is ensuring creators and artists have a way to share and make money from their content—and preventing the flow of money to those who seek to pirate that content. Today, we're releasing our latest update on those efforts.

Our 2018 "How Google Fights Piracy" report explains the programs, policies, and technology we put in place to combat piracy online and ensure continued opportunities for creators around the world.

We invest significantly in the technology, tools and resources that prevent copyright infringement on our platforms. We also work with others across the industry on efforts to combat piracy. These efforts appear to be having an effect: around the world, online piracy has been decreasing, while spending on legitimate content is rising across content categories.

Here are a few of our findings from this year's Piracy report:

  • $3 billion+: The amount YouTube has paid to rights holders who have monetized the use of their content in other videos through Content ID, our industry-leading rights management tool.
  • $100 million+: The amount we’ve invested in building Content ID, including staffing and computing resources.
  • $1.8 billion+: The amount YouTube paid to the music industry from October 2017 to September 2018 in advertising revenue alone.
  • 3 billion+: The number of URLs removed from Search for copyright infringement since we launched a submission tool for copyright owners and their agents.
  • 10 million+: The number of ads Google disapproved in 2017 on suspicion of copyright infringement or for linking to infringing sites.

As we continue our work in the years ahead, five principles guide our substantial investments in fighting piracy:

Create more and better legitimate alternatives: Piracy often arises when it's difficult for consumers to access legitimate content. By developing products that make it easy for users to access legitimate content, like Google Play Music and YouTube, Google helps drive revenue for creative industries and gives consumers choice.

Follow the money: As the vast majority of sites dedicated to online piracy exist to make money, one way to combat them is to cut off their revenue. We prevent actors that engage in copyright infringement from using our ads and monetization systems, and we enforce these policies rigorously.

Be efficient, effective, and scalable: We strive to implement anti-piracy solutions that work at scale. For example, as early as 2010, we began making substantial investments in streamlining the copyright removal process for search results. As a result, these improved procedures allow us to process copyright removal requests for search results at the rate of millions per week.

Guard against abuse: Some actors will make false copyright infringement claims in order to have content they don't want online taken down. We’re committed to detecting and rejecting bogus infringement allegations, such as removals for political or competitive reasons.

Provide transparency: We’re committed to providing transparency. In our Transparency Report, we disclose the number of requests we receive from copyright owners and governments to remove information from our services.

Today, our services are generating more revenue for creators and rights holders, connecting more people with the content they love, and doing more to fight back against online piracy than ever before. We’re proud of the progress this report represents. Through continued innovation and partnership, we’re committed to curtailing infringement by bad actors while empowering the creative communities who make many of the things we love about the internet today.


Parent helpline answers: How do I keep my family safe from opioid addiction?

Editor’s Note: This Saturday, October 27 is National Prescription Drug Take Back Day. Across the nation, people are disposing of their leftover, unneeded prescription drugs at local Take Back events to prevent drug misuse. Google has partnered with the DEA to make these locations easier to find. Visit g.co/rxtakeback to find a location near you and make a plan to bring back your prescriptions.

Earlier this year, Google.org gave $750,000 to the Partnership for Drug-Free Kids to expand and improve our Parent Helpline that supports parents and other caregivers of young people struggling with substance use. As the mother of a child in recovery, I’ve seen firsthand how opioid addiction hurts our loved ones, families and our communities. I also work with the Partnership to help educate about opioid use and addiction, and I volunteer as a Parent Coach – providing peer-to-peer support to other families.

Today I'm sharing some of the most frequently asked questions I hear from parents about opioid addiction.

Aren’t opioids legally prescribed by doctors, and therefore safe?

Opioid pain relievers can be prescribed by doctors to manage pain, but they carry high risks of addiction and dependence. Taken as prescribed for short periods of time, opioid pain relievers are generally safe for most adults, though other pain relief options should be explored first. But because opioid pain relievers (which have the same properties as heroin) can produce a sensation of euphoria in addition to pain relief, some people take them for longer stretches and increase the dosage over time—which can lead to addiction.

What can I do, right now, to keep my family safe?

  1. Ask your doctor about alternatives to opioids to manage pain.
  2. Secure all of the medication in your home.
  3. Make sure that medications for you and your loved ones are used only as prescribed, and not shared with anyone else.
  4. Dispose of unused or expired medications at a Take Back location this weekend. Enter your zip code or address into the map here and find a local take-back facility.

But my child isn’t using opioid drugs – why do I need to clean my medicine cabinet?

When surveyed, more than half of teens say that it’s easy to get prescription drugs from their parents’ medicine cabinet, and two-thirds of teens who report misusing Rx medication get it from friends, family and acquaintances. While it’s tempting to keep old prescriptions around “in case you need them later,” it’s safer to dispose of them when the immediate need is over. Proper medication storage and disposal can help prevent misuse even beyond your own family.

How can I talk to my child about drug misuse?

While a majority of kids report that their parents have talked to them about avoiding alcohol (81%) or marijuana (80%), only 18% of kids say that their parents have talked to them about prescription drug use. Kids who learn about the dangers of drug use early and often are much less likely to develop addiction than those who do not receive these important messages at home. Conversations about the importance of using medications as prescribed, including not sharing medications or taking anything that hasn’t been prescribed to oneself, are critical messages to convey. Learn more tips for talking about medication misuse.

What signs should I be on the lookout for?

Signals range from the obvious, like missing prescriptions and empty pill bottles, to subtler signs like sudden mood changes, isolating from family or friends, and losing interest in hobbies that used to bring joy. Early use can sometimes bring about positive behavior and moods, like being overly motivated or having lively conversations.

Opioid addiction can also manifest in physical ways: Look for signs of fatigue and drowsiness, pinpoint pupils and dark circles under the eyes, and rapid weight loss. Learn more about opioid medication, including common signs of misuse.  

What do I do if I find out my child is misusing or abusing opioids?

It can be scary to learn that your child is misusing opioids, but there are steps you can take to help:

  • Learn about tools to help motivate your child to get treatment.
  • Start a conversation, not a confrontation, and always remember to listen.
  • Consider your treatment options, including medications that can help reduce cravings associated with opioids.
  • As a safety precaution, you can talk to your doctor or pharmacist about getting naloxone (known by the brand name Narcan), which is used to reverse an opioid overdose.

When I found out my child was misusing opioids, I was scared and alone—and felt like I had nowhere to turn. But parents and families don’t have to face this alone. Compassionate, one-on-one support and guidance are within reach. You can connect with a Helpline Specialist at the Partnership for Drug-Free Kids by calling 1-855-DRUGFREE. You can also reach us by text (send a message to 55753) or by email through our website, drugfree.org.

If you are an adult who is personally struggling with addiction, or you’d like information on how to help a loved one, you can find opioid addiction resources through the Federal SAMHSA Helpline: 1-800-662-HELP. 

Coming together to create a prior art archive

Patent quality is a two-way street. Patent applicants should submit detailed disclosures describing their inventions and actively participate in the examination process to define clear distinctions between their inventions and existing technology. Examiners reviewing patent applications should conduct thorough searches of existing technology, reject any attempts to patent existing technology, and develop a clear record of the differences between the patent claims and what came before. The more that the patent system supports and incentivizes these activities, the more reliable the rights that issue from patent offices will be, and the more those patents will promote innovation.

A healthy patent system requires that patent applicants and examiners be able to find and access the best documentation of state-of-the-art technology. This documentation is often found in sources other than patents. Non-patent literature can be particularly hard to find and access in the software field, where it may take the form of user manuals, technical specifications, or product marketing materials. Without access to this information, patent offices may issue patents covering existing technology, or fail to recognize trivial extensions of published research, removing the public’s right to use it and calling the reliability of patent rights into question.

To address this problem, academia and industry have worked together to launch the Prior Art Archive, created through a collaboration between the MIT Media Lab, Cisco and the USPTO, and hosted by MIT. The Prior Art Archive is a new, open access system that allows anyone to upload those hard-to-find technical materials and make them easily searchable by everyone.

We’re proud to support the Prior Art Archive, and have devoted significant resources to this and other important quality initiatives. The Prior Art Archive is searchable through Google Patents, and all of the documents in the Archive have been labeled with Cooperative Patent Classification (CPC) codes using Google’s machine learning models. The labels are a feature we rolled out in Google Patents to help make the most relevant technical materials easier to find. We’ve also recently launched TDCommons, a site accessible to the public and to examiners where companies can publish, free of charge, technical information they don’t want to patent.
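To give a flavor of what automatically labeling documents with classification codes involves, here is a toy text classifier. It is a minimal naive Bayes sketch with invented training snippets and only two CPC class labels—nothing like the production models described above, just an illustration of the underlying idea of learning labels from text.

```python
from collections import Counter
import math

# Toy training data: text snippets paired with CPC class labels.
# (Invented examples; real training sets are patents and technical docs.)
TRAIN = [
    ("neural network training gradient model", "G06N"),  # computing/AI
    ("machine learning classifier inference", "G06N"),
    ("antenna signal transmitter frequency", "H04B"),    # transmission
    ("wireless receiver modulation signal", "H04B"),
]

def tokenize(text):
    return text.lower().split()

# Count word frequencies per class and documents per class.
class_words = {}
class_docs = Counter()
for text, label in TRAIN:
    class_docs[label] += 1
    class_words.setdefault(label, Counter()).update(tokenize(text))

def classify(text):
    """Assign the most likely CPC code via naive Bayes with add-one smoothing."""
    vocab = {w for counts in class_words.values() for w in counts}
    best_label, best_logprob = None, float("-inf")
    for label, counts in class_words.items():
        total = sum(counts.values())
        logprob = math.log(class_docs[label] / sum(class_docs.values()))
        for word in tokenize(text):
            logprob += math.log((counts[word] + 1) / (total + len(vocab)))
        if logprob > best_logprob:
            best_label, best_logprob = label, logprob
    return best_label

print(classify("training a neural network model"))  # G06N
```

Even this crude sketch shows why machine-labeled classification helps searchers: a document that never states its CPC code can still be routed to the right technical area based on the vocabulary it uses.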

We’re also excited to use AI and machine learning to take prior art searching to the next level. To this end, we’ve recently created an open ecosystem, the Google Patents Public Datasets, to make large datasets available for empirical public policy, economics, and machine learning research. We’re committed to developing and making available technology that improves patent quality, and ultimately strengthens our patent system.

Proposing a framework for data protection legislation

For nearly two decades, people around the world have used Google to find answers, communicate, build businesses, and more. Our users have long entrusted us to be responsible with their data and we take that trust and responsibility very seriously.

Our investment in privacy and security is evident in every product we build, including the powerful tools we provide to help our users make decisions about their data like the Privacy Checkup. Google products and features cannot launch until they are approved by the specialists in our Privacy and Data Protection Office, which solicits input from across Google, as well as periodically from users and experts worldwide. And our broad commitment to transparency is evident in our newly-refreshed Privacy Policy, which includes informative videos that explain our practices and settings, as well as tools like My Activity that provide detailed information about the data in a user’s Google Account and options for how to control it. Since 2010, our Transparency Report has provided information on how the policies and actions of governments and corporations affect privacy, security, and access to information.

I’m proud of the work we do at Google. That’s why, after almost a decade leading Google's privacy legal team, I've recently agreed to take on the role of Chief Privacy Officer. In this role, I set the priorities for the privacy program at Google, including continually challenging ourselves to make sure our privacy and security tools, policies, and practices are as user-focused as every other aspect of our business. My team’s goal is to help you enjoy the benefits of technology, while remaining in control of your privacy.

This is an important time to take on this new role. Now, more than at any other time in my work in this field, there is real momentum to develop baseline rules of the road for data protection. Google welcomes this and supports comprehensive, baseline privacy regulation. People deserve to feel confident that every entity using personal information will be held accountable for protecting it. And we believe that regulation can support a dynamic marketplace for businesses of all types and sizes.

Today, we’re sharing our view on the requirements, scope, and enforcement expectations that should be reflected in all responsible data protection laws. This framework is based on established privacy frameworks, as well as our experience providing services that rely on personal data and our work to comply with evolving data protection laws around the world. These principles help us evaluate new legislative proposals and advocate for responsible, interoperable and adaptable data protection regulations. How these principles are put into practice will shape the nature and direction of innovation. You can find more detail in this PDF.

Sound practices combined with strong and balanced regulations can help provide individuals with confidence that they’re in control of their personal information. I look forward to discussing these principles and Google’s work on privacy and security with the U.S. Senate later this week, and to working with policymakers and all stakeholders on regulation that protects consumers and enables innovation.

Introducing a new transparency report for political ads

We first launched our Transparency Report in 2010 with the goal of fostering important conversations about the relationship between governments, companies, and the free flow of information on the internet.

Over the years, we’ve evolved the report, adding sections about content removed from Google Search due to European privacy laws, adoption of encryption on websites (HTTPS), and more. And today, we’re adding another new section to our Transparency Report: Political Advertising on Google.

Earlier this year, we took important steps to increase transparency in political advertising. We implemented new requirements for any advertiser purchasing election ads on Google in the U.S.—these advertisers now have to provide a government-issued ID and other key information that confirms they are a U.S. citizen or lawful permanent resident, as required by law. We also required that election ads incorporate a clear “paid for by” disclosure. Now, we’re continuing to roll out new transparency features with the addition of the political advertising report as well as a new political Ad Library.


The new political advertising section in our Transparency Report shows how much money is spent across states and congressional districts

The new political advertising report shows who buys federal election ads in the U.S., how much money is spent across states and congressional districts on such ads, and who the top advertisers are overall. We designed this report for anyone interested in transparency—the information is searchable and downloadable, so that you can easily access and sort through the data. We’re updating the report every week, so as we head into election season, anyone can see new ads that get uploaded or new advertisers that decide to run Google ads.


The new political advertising section in our Transparency Report

Meanwhile, our new, searchable election Ad Library shows things like which ads had the highest views, what the latest election ads running on our platform are, and deep dives into specific advertisers’ campaigns. In addition, the data from the report and Ad Library is publicly available on Google Cloud’s BigQuery. Using BigQuery’s API, anyone can write code and run their own unique queries on this data set. Researchers, political watchdog groups and private citizens can use our data set to develop charts, graphs, tables or other visualizations of political advertising on Google Ads services. Together with the Transparency Report, we hope this provides unprecedented, data-driven insights into election ads on our platform.
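As a sketch of the kind of query a researcher might run against the public political-ads data on BigQuery: the example below ranks advertisers by total spend. The table path `bigquery-public-data.google_political_ads.advertiser_stats` and the `advertiser_name`/`spend_usd` columns are assumptions about the published schema, so check the dataset's listing in the BigQuery console before using them.

```python
# Sketch: ranking political advertisers by spend using the public
# BigQuery political-ads data set (table and column names assumed).

def build_top_spenders_query(limit: int = 10) -> str:
    """Return a SQL query ranking advertisers by total spend.

    The table path and columns are hypothetical placeholders for the
    published schema, not confirmed by this post.
    """
    return f"""
        SELECT advertiser_name, spend_usd
        FROM `bigquery-public-data.google_political_ads.advertiser_stats`
        ORDER BY spend_usd DESC
        LIMIT {limit}
    """

def print_top_spenders(limit: int = 10) -> None:
    """Execute the query. Requires GCP credentials, so it is not run here."""
    from google.cloud import bigquery  # pip install google-cloud-bigquery
    client = bigquery.Client()
    for row in client.query(build_top_spenders_query(limit)):
        print(row.advertiser_name, row.spend_usd)
```

Because BigQuery serves the data through standard SQL, the same pattern extends to charts or exports: swap the query for one that groups spend by state or congressional district, then feed the rows to a plotting or spreadsheet tool.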

Even though the political advertising report and Ad Library provide many new insights, we know there is more work to be done. We’re working with experts in the U.S. and around the world to explore tools that capture a wider range of political ads—including ads about political issues (beyond just candidate ads), state and local election ads, and political ads in other countries. We’re also continuing to share our Protect Your Election tools to safeguard campaigns from digital attacks. As we approach the 2018 midterm elections in the U.S., we’ve introduced new tools to help protect political campaigns, provide voters with accurate information, and increase transparency on our platforms, and we’ll continue to do more.