Tag Archives: Data Centers and Infrastructure

How we’re minimizing AI’s carbon footprint

A photograph of a textbook about computer architecture.

The book that led to my visit to Google.

When I first visited Google back in 2002, I was a computer science professor at UC Berkeley. My colleague John Hennessy and I were updating our textbook on computer architecture, and Larry Page — who rode a hot-rodded electric scooter at the time — agreed to show me how his then three-year-old company designed its computing for Search. I remember the setup was lean yet powerful: just 6,000 low-cost PC servers and 12,000 PC disks answering 70 million queries around the world, every day. It was my first real look at how Google built its computer systems from the ground up, optimizing for efficiency at every level.

When I joined the company in 2016, it was with the goal of helping research how to maximize the efficiency of computer systems built specifically for artificial intelligence. Last year, Google set an ambitious goal of operating on 24/7 carbon-free energy, everywhere, by the end of the decade. But at the same time, machine learning systems are quickly becoming larger and more capable. What will be the environmental impact of those systems — and how can we neutralize that impact going forward? 

Today, we’re publishing a detailed analysis that addresses both of those questions. It’s an account of the energy and carbon costs of training six state-of-the-art ML models, including five of our own. (Training a model is like building infrastructure: You spend the energy to train the model once, after which it’s used and reused many times, possibly by hundreds of millions of people.) To our knowledge, it’s the most thorough evaluation of its kind yet published. And while we had reason to believe our systems were efficient, we were encouraged by just how efficient they turned out to be.

For instance, we found that developing the Evolved Transformer model, a more efficient version of the popular Transformer architecture for ML, emitted nearly 100 times less carbon dioxide equivalent than a widely cited estimate. Of the roughly 12.7 terawatt-hours of electricity that Google uses every year, less than 1/200th of a percent of it was spent training our most computationally demanding models.  

What’s more, our analysis found that there already exist many ways to develop and train ML systems even more efficiently: Specially designed models, processors and data centers can dramatically reduce energy requirements, while the right selection of energy sources can go a long way to reduce the carbon that’s emitted during training. In fact, the right combination of model, processor, data center and energy source can reduce the carbon footprint of training an ML system by 1000 times. 

There’s no one easy trick for achieving a reduction that large, so let’s unpack that figure.  Minimizing a system’s carbon footprint is a two-part problem: First you want to minimize the energy the system consumes, then you have to supply that energy from the cleanest source possible.
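That two-part framing reduces to a simple formula: emissions equal the energy consumed times the carbon intensity of the grid supplying it. Here is a minimal sketch in Python; the training-run size is hypothetical, the 0.080 kg CO2e/kWh figure is the Iowa number quoted later in this post, and 0.429 is an assumed round number near the U.S. grid average.

```python
# Carbon footprint of a training run: energy used times grid carbon intensity.
def training_footprint_kg(energy_kwh: float, kg_co2e_per_kwh: float) -> float:
    return energy_kwh * kg_co2e_per_kwh

# Hypothetical 10 MWh training run on two different grids.
us_avg = training_footprint_kg(10_000, 0.429)   # assumed ~U.S. average intensity
iowa   = training_footprint_kg(10_000, 0.080)   # Google's Iowa energy mix
print(f"{us_avg:.0f} kg vs {iowa:.0f} kg CO2e")  # siting alone cuts emissions ~5.4x
```

The same multiplication applies to each factor below: halve the energy, or halve the carbon intensity, and the footprint halves with it, which is why the four factors multiply out to such a large combined reduction.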

Our analysis took a closer look at GShard and Switch Transformer, two models recently developed at Google Research. They’re the largest models we’ve ever created, but they both use a technique called sparse activation that enables them to only use a small fraction of their total architecture for a given task. It’s a bit like how your brain uses a small fraction of its 100 billion neurons to help you read this sentence. The result is that these sparse models consume less than one tenth the energy that you’d expect of similarly sized dense models — without sacrificing accuracy.
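To make the idea of sparse activation concrete, here is a toy sketch in the spirit of top-1 routing: a small gating network scores every expert cheaply, but only the single winning expert's weights actually process the input. This is an illustration of the concept only, not Google's GShard or Switch Transformer code; all names and sizes are made up.

```python
import random

random.seed(42)

# Toy sparse activation: n_experts weight vectors exist, but each input is
# routed to just one, so only 1/n_experts of the parameters do work per input.
n_experts, dim = 8, 4
experts = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_experts)]
gate    = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_experts)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sparse_forward(x):
    # The gating network scores every expert cheaply...
    scores = [dot(g, x) for g in gate]
    best = scores.index(max(scores))
    # ...but only the winning expert's weights touch the input.
    return best, dot(experts[best], x)

expert_used, output = sparse_forward([0.5, -0.2, 0.1, 0.9])
# One of 8 experts ran; a dense model of the same size would have run all 8.
```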

But to minimize ML’s energy use, you need more than just efficient models — you also need efficient processors and data centers to train and serve them. Google’s Tensor Processing Units (TPUs) are specifically designed for machine learning, which makes them up to five times more efficient than off-the-shelf processors. And the cloud computing data centers that house those TPUs are up to twice as efficient as typical enterprise data centers. 

Once you’ve minimized your energy requirements, you have to think about where that energy originates. The electricity a data center consumes is determined by the grid where it’s located. And depending on what resources were used to generate the electricity on that grid, this may emit carbon. 

The carbon intensity of grids varies greatly across regions, so it really matters where models are trained. For instance, the mix of energy supplying Google’s Iowa data center produces 0.080 kg of CO2e per kilowatt hour of electricity, when combining the electricity supplied by the grid and produced by Google’s wind farms in Iowa. That’s 5.4 times less than the U.S. average. 

Any one of these four factors — models, chips, data centers and energy sources — can have a sizable effect on the costs associated with developing an ML system. But their cumulative impact can be enormous.

When John and I updated our textbook with what we’d learned on our visit to Google back in 2002, we wrote that “reducing the power per PC [server]” presented “a major opportunity for the future.” Nearly 20 years later, Google has found many opportunities to streamline its systems — but plenty remain to be seized. As a result of our analysis, we’ve already begun shifting where we train our computationally intensive ML models. We’re optimizing data center efficiency by shifting compute tasks to times when low-carbon power sources are most plentiful. Our Oklahoma data center, in addition to receiving its energy from cleaner sources, will house many of our next generation of TPUs, which are even more efficient than their predecessors. And sparse activation is just one example of the algorithmic ingenuity Google is using to design ML models that work smarter, not harder.

Cleaner data centers, batteries included

On the rare occasions when a Google data center is affected by a power outage, we have to be ready to ramp up millions of watts of backup electricity in seconds. This is a daunting challenge, which our industry has typically met using diesel generators. But now we’re aiming to demonstrate that a better, cleaner solution has advanced far enough to keep the internet up and running. 

In Belgium, we’ll soon install the first ever battery-based system for replacing generators at a hyperscale data center. In the event of a power disruption, the system will help keep our users’ searches, e-mails, and videos on the move—without the pollution associated with burning diesel. 

But even more important is what will happen when Google doesn’t need emergency power. Whereas diesel generators sit idle most of the year, batteries are multi-talented team players: when we’re not using them, they’ll be available as an asset that strengthens the broader electric grid. 

Worldwide, we estimate there are over 20 gigawatts of backup diesel generators in service across the data center industry, representing a massive opportunity to deploy cleaner solutions. Our project in Belgium is a first step that we hope will lay the groundwork for a big vision: a world in which backup systems at data centers go from climate change problems to critical components in carbon-free energy systems. 

How data centers can anchor carbon-free electric grids

Wind and solar power are currently booming around the world, but sunny days and breezy hours don’t always align with a community's energy demand. Large-scale batteries at data centers can address this problem by banking renewable power when it’s abundant, and discharging it when it’s needed. Batteries can also help balance other kinds of variability on power grids, allowing for more cost-effective and efficient operations. Working in partnership with ELIA, the local transmission system operator in Belgium, we’ll strive to make our project a model for how data centers can become anchors for carbon-free electric grids.

Gif demonstrating data center energy storage

In fact, one reason we chose Belgium as the site for our project is because the local team already has a track record of implementing novel energy ideas. It was the first facility in our global fleet to run entirely without mechanical chillers—one of many reasons that the European Commission recognized it as a top performer for energy efficiency. It’s also the place where we’ve integrated our largest on-site renewable energy installation—more than 10,000 solar panels strong.

Toward a carbon-free world

We’ve been working for years to push Google toward a zero-carbon future: from our achievement of carbon neutrality since 2007, to reaching 100 percent renewable energy every year since 2017, and now pursuing our most ambitious goal yet, 24/7 carbon-free electricity by 2030. Our new battery project will help us operate more cleanly when the power grid goes down, and help the grid itself move towards a carbon-free future.

You can hear more about our broader efforts in Episode 4 of Google’s just-released podcast, "Where the Internet Lives," which gives an inside look at how data centers can lead on clean energy in a world confronting climate change.

Our data centers support Europe’s green economic recovery

In 2020, families, schools and businesses moved online more than ever due to the pandemic. All the Google services you rely on are powered by our data centers, and we’ve had to ensure this infrastructure works for everyone as demand increased—for businesses using Google Cloud and Google Meet, and for anyone who asks a question on Search, watches a YouTube video, or uses Google Maps to get from A to B. 

In the last few weeks, we’ve added new infrastructure to Europe that supports the continent’s digital growth. Last month in Hamina, Finland, we were delighted to welcome Prime Minister Sanna Marin as she visited the construction site of our sixth data center building. Last week, we opened a new data center in Fredericia, Denmark. And just this week in the Netherlands, our second Dutch data center began operating in Middenmeer.

A European green transition, powered by sustainable infrastructure

We’re proud that our data centers operate the cleanest cloud in the industry. They are on average twice as energy efficient as a typical enterprise data center. Compared to five years ago, we now deliver around seven times as much computing power with the same amount of electrical power. 

Last week, Europe announced its ambitious 55 percent reduction target for CO2 emissions by 2030, in addition to its 32 percent renewable energy target. Google is helping to accelerate this transition, having supported nearly 1,700 megawatts of new renewable energy projects in Belgium, Sweden, Denmark, Finland and the Netherlands. And we are committed to supporting the EU Climate Pact, as technology will have a critical role to play in making the EU Green Deal vision a reality.

Taking the world’s greenest data center fleet to the next level

Our AI technology helps reduce the energy we use to cool our data centers by 30 percent, and we make it available for use by airports, shopping malls, hospitals, data centers and other commercial buildings and industrial facilities. 

But we’re not stopping there. A few months ago, we announced our Third Decade of Climate Action: an ambitious plan to help build a carbon-free future and operate on clean energy around the clock. This is far more challenging than the traditional approach of matching energy usage with renewable energy, but we’re working to get this done in the next nine years.

Contributing to European growth with our (new) data centers

In addition to enabling the greenest, cleanest cloud, all these sites bring economic growth and employment to local communities and to Europe. In Finland, our data center has brought €1.2 billion in investment and supported 1,700 jobs every year since 2009. During construction of our Denmark data center, we spent over €600 million and supported 2,600 jobs. And in the Netherlands, we’ve directly invested €2.5 billion since 2014.

In the next five years, we expect to anchor €2 billion in new carbon-free energy generation projects and green infrastructure in Europe, helping to develop new technologies to make round-the-clock carbon-free energy cheaper and more widely available. 

Investing in our local communities

Partnerships at the local level make all the difference to communities. We have long worked with local NGOs in our data center communities and have donated millions to important initiatives in Europe, including skills training in cooperation with local colleges and universities. 

We have supported multiple education programmes focused on STEM (science, technology, engineering and maths), as well as environmental and cultural projects. For example, in Denmark we recently supported two projects with the Museum Fredericia that will promote local history through virtual experiences. In the Netherlands, we’ve helped with the preservation of local bee and butterfly populations. And in Ireland, during COVID-19, we’ve assisted vulnerable communities, and have given grants to local schools to provide students with laptops and enable home schooling.

We are proud to invest in Europe’s digital infrastructure, contribute to the local communities we operate in and support Europe’s green transition. This will be a decisive decade, and we are committed to leading by example.

A new podcast explores the unseen world of data centers

Do you ever wonder where it all comes from? The words you’re reading right now, the music you stream or the program your kids use to do their homework? All that stuff can’t be just floating around in space ... can it? The internet has to live somewhere, right? 

Right. Every click you make online reaches across vast distances to retrieve information from racks of powerful computers inside some of the most secure buildings in the world. And then whatever you’re seeking appears in an instant. Even for the people who keep the machines running, the process feels like nothing short of magic. These buildings—where the Internet lives—are called data centers. Each data center exists in a real place, operated by real people in communities like Bridgeport, Alabama and Changhua County, Taiwan.

An animated GIF showing the logo of Where the Internet Lives.

Even at Google, only about one percent of employees ever get to set foot inside a data center. So to demystify these warehouse-scale computing facilities, a small team of Googlers and I spent the last year exploring them. Through the process, we got to know the people who design, build, operate and secure these buildings. We connected with outside experts and community members whose lives intersect with this infrastructure that keeps the digital economy moving. And today, we’re releasing the result of all this work: a new six-episode podcast called Where the Internet Lives.

As you listen, you’ll get a rare glimpse behind the walls and through multiple layers of security, literally going inside the machines that power the internet, guided by the people who keep them humming.

Along the way, you’ll learn how data centers work, what they mean to the communities that host them, the reasons data centers are some of the most secure buildings in the world and how efforts to operate data centers on 24/7 clean energy are transforming electrical grids across the globe.

Subscribe to the podcast now to be transported—at nearly the speed of light—to Where the Internet Lives. 


One percent of Googlers get to visit a data center, but I did

For years I’ve wondered what it’s like behind the protected walls of a Google data center, and I’m not alone. In my job at Google, I spend my days working with developers. Our data centers are crucial to the work that they do, but most have never actually set foot inside a data center. And until recently, neither had I. I went on a mission to find answers to common questions like: Why are visits so tightly restricted? How secure is a Google data center? How do we meet regulatory requirements? Here's what I found out.

To keep our customers' data safe, we need to make sure the physical structure of the data center is absolutely secure. Each data center is protected with six layers of physical security designed to thwart unauthorized access. Watch the video above to follow my journey through these layers to the core of a data center, and read on to learn even more.

“Least privilege” is the rule to live by


There are two rules strictly enforced at all Google data centers. The first is the “least privilege” protocol: the idea that someone should have only the bare minimum privileges necessary to perform their job. If your least privilege gets you into Layer 2, you won’t have any luck moving on to Layer 3. Each person’s access permissions are checked at badge readers at every access point in the facility, so the protocol is enforced everywhere.
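The least-privilege check at each badge reader boils down to a single comparison. This hypothetical sketch is only an illustration of the idea; the layer numbers and function name are made up, not Google's actual access-control system.

```python
# Hypothetical "least privilege" check: each badge is cleared up to some
# maximum layer, and a badge reader admits the holder only within that clearance.
def badge_reader_allows(badge_max_layer: int, requested_layer: int) -> bool:
    return requested_layer <= badge_max_layer

assert badge_reader_allows(2, 2)        # cleared through Layer 2
assert not badge_reader_allows(2, 3)    # ...but no luck moving on to Layer 3
```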

The second rule prevents tailgating: a vehicle or individual closely following another to gain entry into a restricted area without a badge swipe. Any gate or door must close before the next vehicle or person can badge in, and if the system detects a door open for too long, it immediately alerts security personnel.

Two security checks: badge first, then circle lock


You’ve probably seen two-factor authentication when you sign in to an account and a one-time password is sent to your phone. We take a similar approach at the data centers to verify a person’s identity and access. At some layers in the data center, you’re required to swipe your badge, then enter a circle lock, or tubular doorway. You walk into a special “half portal” that checks your badge and scans your eyes to grant access to the next layer of the data center. It prevents tailgating because only one person is allowed in the circle lock at a time.

Shipments are received through a secure loading dock

The facility loading docks are a special section of Layer 3, used to receive and send shipments of materials, such as new hardware. Truck deliveries must be approved for access to Layer 3 to enter the dock. For further security, the loading dock room is physically isolated from the rest of the data center, and guard presence is required when a shipment is received or sent.

All hard drives are meticulously tracked


Hard drive tracking is important to the security of your data because hard drives contain encrypted sensitive information. Google meticulously tracks the location and status of every hard drive within our data centers—from acquisition to destruction—using barcodes and asset tags. These asset tags are scanned throughout a hard drive's lifecycle in a data center from the time it’s installed to the time it's removed from circulation. Tracking hard drives closely ensures they don’t go missing or end up in the wrong hands.

We also make sure hard drives are properly functioning by doing frequent performance tests. If a component fails to pass a performance test, it’s deemed no longer usable. To prevent any sensitive information from living on that disk, we remove it from inventory to be erased and destroyed in Layer 6, Disk Erase. There, the disk erase formatter uses a multi-step process that wipes the disk data and replaces each bit of data with zeros. If the drive can’t be erased for any reason, it’s stored securely until it can be physically destroyed. 

Layered security extends into the tech itself

Our layered security approach isn’t just a physical safeguard for entering our data centers. It’s also how we protect the hardware and software that live in our data centers. At the deepest layer, most of our server boards and networking equipment are custom-designed by Google. For example, we design chips, such as the Titan hardware security chip, to securely identify and authenticate legitimate Google hardware. 

At the storage layer, data is encrypted while it travels in and out of the data center and while it’s stored there. This means that whether data is traveling over the internet, moving between Google’s facilities, or sitting on our servers, it’s protected. Google Cloud customers can even supply their own encryption keys and manage them in a third-party key management system deployed outside Google’s infrastructure. This defense-in-depth approach helps expand our ability to mitigate potential vulnerabilities at every point.

To learn more about our global data centers, visit our Data and Security page. We will also be sharing more about our security best practices during the upcoming Google Cloud Next ’20: OnAir event.

Our data centers now work harder when the sun shines and wind blows

Addressing the challenge of climate change demands a transformation in how the world produces and uses energy. Google has been carbon neutral since 2007, and 2019 marks the third year in a row that we’ve matched our energy usage with 100 percent renewable energy purchases. Now, we’re working toward 24x7 carbon-free energy everywhere we have data centers, which deliver our products to billions of people around the world. To achieve 24x7 carbon-free energy, our data centers need to work more closely with carbon-free energy sources like solar and wind. 

New carbon-intelligent computing platform

Our latest advancement in sustainability, developed by a small team of engineers, is a new carbon-intelligent computing platform. We designed and deployed this first-of-its kind system for our hyperscale (meaning very large) data centers to shift the timing of many compute tasks to when low-carbon power sources, like wind and solar, are most plentiful. This is done without additional computer hardware and without impacting the performance of Google services like Search, Maps and YouTube that people rely on around the clock. Shifting the timing of non-urgent compute tasks—like creating new filter features on Google Photos, YouTube video processing, or adding new words to Google Translate—helps reduce the electrical grid’s carbon footprint, getting us closer to 24x7 carbon-free energy.

Visualization of how we shift compute tasks to different times of day to align with the availability of lower-carbon energy. In this illustration, wind energy in the evening and solar energy during the day.

Each day, at every Google data center, our carbon-intelligent platform compares two types of forecasts for the following day. One of the forecasts, provided by our partner Tomorrow, predicts how the average hourly carbon intensity of the local electrical grid will change over the course of a day. A complementary Google internal forecast predicts the hourly power resources that a data center needs to carry out its compute tasks during the same period. Then, we use the two forecasts to optimize hour-by-hour guidelines to align compute tasks with times of low-carbon electricity supply. Early results demonstrate carbon-aware load shifting works. Results from our pilot suggest that by shifting compute jobs we can increase the amount of lower-carbon energy we consume. 
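The forecast-driven scheduling idea can be sketched in a few lines: given an hourly carbon-intensity forecast and a set of deferrable jobs, run each job in the cleanest hour that still has capacity. This greedy toy is only an illustration of the concept, not Google's actual platform; the job names, capacity model and intensity numbers are all made up.

```python
# Hedged sketch of carbon-aware load shifting: prefer low-carbon hours first.
def schedule(intensity_by_hour, jobs, capacity_per_hour):
    # Rank hours from cleanest to dirtiest according to the forecast.
    hours = sorted(range(len(intensity_by_hour)), key=lambda h: intensity_by_hour[h])
    load = [0] * len(intensity_by_hour)
    plan = {}
    for job in jobs:
        for h in hours:  # greedily place each job in the cleanest hour with room
            if load[h] < capacity_per_hour:
                load[h] += 1
                plan[job] = h
                break
    return plan

# 4-hour toy forecast (gCO2e/kWh): hour 2 is windiest, hour 1 is dirtiest.
plan = schedule([300, 450, 120, 200],
                ["photos-filter", "yt-transcode"],
                capacity_per_hour=1)
print(plan)  # both jobs land in the two cleanest hours (2, then 3)
```

A real system must also respect deadlines, job priorities and service-level guarantees, which is why the production platform sets hour-by-hour guidelines rather than hard placements.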


Data from our pilot illustrates how the new system shifts compute from our baseline (dashed line) to better align with less carbon-intensive times of the day—such as early morning and late evening (solid line)—when wind energy is most plentiful. Gray shading represents times of day when more carbon-intensive energy is present on the grid.

What’s next

The first version of this carbon-intelligent computing platform focuses on shifting tasks to different times of the day, within the same data center. But, it’s also possible to move flexible compute tasks between different data centers, so that more work is completed when and where doing so is more environmentally friendly. Our plan for the future is to shift load in both time and location to maximize the reduction in grid-level CO2 emissions. Our methodology, including performance results of our global rollout, will be shared in upcoming research publications. We hope that our findings inspire other organizations to deploy their own versions of a carbon-intelligent platform, and together, we can continue to encourage the growth of carbon-free electricity worldwide. Learn more about Google’s progress toward a carbon-free future on our Sustainability site.

Keeping our network infrastructure strong amid COVID-19

Google's network supports products that people around the world rely on every day, like YouTube, Search, Maps and Gmail. It also connects Google Cloud customers to their employees and users. As the coronavirus pandemic spreads and more people move to working or learning from home, it’s natural to wonder whether the Google network can handle the load. The short answer is yes. 

We’ve designed our network to perform during times of high demand. The same systems we built to handle peaks like the Cyber Monday online shopping surge, or to stream the World Cup finals, support increased traffic as people turn to Google to find news, connect with others, and get work done during this pandemic. And while we’re seeing more usage for products like Hangouts Meet, and different usage patterns in products like YouTube, peak traffic levels are well within our ability to handle the load. 

Google’s network consists of a system of high-capacity fiber optic cables that encircle the globe, under both land and sea, connecting our data centers to each other, and to you. Traffic flows over our dedicated network, optimized for speed and reliability, until we hand it off to more than 3,000 internet service providers (ISPs) in 200+ countries and territories for local delivery—the “last mile”—using hundreds of points of presence and thousands of edge locations around the world.

Handling traffic on Google’s infrastructure and bringing it close to people helps limit the burden on operators—whose networks have different levels of reserve capacity—to allow them to focus on delivering that last mile. Together, we work to provide the best possible experience for browsing, video-conferencing, streaming, making purchases online, and more to people around the world. We’re continuing to work with governments and network operators around the globe as we do our part to minimize stress on the system. As part of this, we recently announced that we are temporarily defaulting all videos on YouTube to standard definition.  

We also recognize the importance of Google services at a time like this and continue to add capacity to stay ahead of demand. Our dedicated global network deployment and operations team is increasing capacity wherever needed, and, in the event of a disruption, recovers service as quickly as possible. 

This may be a time of global uncertainty, but we're working hard to ensure the Google network is there for everyone, business or consumer, day and night.

Data centers are more energy efficient than ever

While Google is the world’s largest corporate purchaser of renewable energy, we’re also taking action on climate change by minimizing the amount of energy we need to use in the first place. For more than a decade, we’ve worked to make our data centers as energy efficient as possible. Today, a new paper in Science validated our efforts and those of other leaders in our industry. It found that efficiency improvements have kept energy usage almost flat across the globe’s data centers—even as demand for cloud computing has skyrocketed.

The new study shows that while the amount of computing done in data centers increased by about 550 percent between 2010 and 2018, the amount of energy consumed by data centers only grew by six percent during the same time period. The study’s authors note that these energy efficiency gains outpaced anything seen in other major sectors of the economy. As a result, while data centers now power more applications for more people than ever before, they still account for about 1 percent of global electricity consumption—the same proportion as in 2010. 
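Those two growth figures imply a large drop in energy used per unit of computing, which is easy to check:

```python
# Implied change in energy per unit of computing, 2010 -> 2018, using the
# Science paper's figures quoted above.
compute_growth = 6.5    # computing increased by about 550%, i.e. 6.5x
energy_growth = 1.06    # energy consumption grew by about 6%
energy_per_compute = energy_growth / compute_growth
print(round(1 / energy_per_compute, 1))  # each unit of compute now uses ~6x less energy
```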

What's more, research has consistently shown that hyperscale (meaning very large) data centers are far more energy efficient than smaller, local servers. That means that a person or company can immediately reduce the energy consumption associated with their computing simply by switching to cloud-based software. As the data center industry continues to evolve its operations, this efficiency gap between local computing and cloud computing will continue to grow.

Searching for efficiency

How are data centers squeezing more work out of every electron, year after year? For Google, the answer comes down to a relentless quest to eliminate waste, at every level of our operations. We designed highly efficient Tensor Processing Units (the AI chips behind our advances in machine learning) and outfitted all of our data centers with high-performance servers. Starting in 2014, we even began using machine learning to automatically optimize cooling in our data centers. At the same time, we’ve deployed smart temperature, lighting, and cooling controls to further reduce the energy used at our data centers. 

Our efforts have yielded promising results: Today, on average, a Google data center is twice as energy efficient as a typical enterprise data center. And compared with five years ago, we now deliver around seven times as much computing power with the same amount of electrical power. 

By directly controlling data center cooling, our AI-powered recommendation system is already delivering consistent energy savings of around 30 percent on average. And the average annual power usage effectiveness for our global fleet of data centers in 2019 hit a new record low of 1.10, compared with the industry average of 1.67—meaning that Google data centers use about six times less overhead energy for every unit of IT equipment energy.
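That "six times less overhead energy" figure follows directly from the definition of PUE (total facility energy divided by IT equipment energy): the overhead per unit of IT energy is the PUE minus 1.

```python
# PUE = total facility energy / IT equipment energy, so the overhead burned on
# cooling, power conversion, etc. per unit of IT energy is simply PUE - 1.
google_pue, industry_pue = 1.10, 1.67
google_overhead = google_pue - 1      # 0.10 units of overhead per unit of IT energy
industry_overhead = industry_pue - 1  # 0.67 units
print(round(industry_overhead / google_overhead, 1))  # ~6.7: "about six times less"
```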

Leading by example

So where do we go from here? We’ll continue to deploy new technologies and share the lessons we learn in the process, design the most efficient data centers possible, and disclose data on our progress. To learn about our efforts to power the internet using as little power as possible—and how we’re ensuring that the energy we use is carbon-free, around the clock—check out our latest Environment Report or visit our data center efficiency site.

Continuing to grow and invest across America in 2020

Today I’m pleased to announce that Google will invest more than $10 billion in offices and data centers across the United States in 2020. 

Google has a presence in 26 states across the country and our new investments will be focused in 11 of them: Colorado, Georgia, Massachusetts, Nebraska, New York, Oklahoma, Ohio, Pennsylvania, Texas, Washington and California. 

Everywhere we invest, we strive to create meaningful opportunities for local communities. A powerful example is our data center in Pryor, a town in Mayes County, Oklahoma. Last year, I visited Pryor to announce a $600 million investment, our fourth expansion there since 2007. It felt like the whole community came out to welcome us, from small business owners to teachers to Google employees. Pryor Mayor Larry Lees told the crowd that Google’s investments have helped provide local schools with the resources they need—including the latest textbooks and STEM courses—to offer a world-class education. He talked about the small businesses we have helped train and the mentorship Googlers have provided to Pryor’s students. 

This is exactly the kind of difference we hope to make with our new office and data center projects in 2020. These investments will create thousands of jobs—including roles within Google, construction jobs in data centers and renewable energy facilities, and opportunities in local businesses in surrounding towns and communities. 

This effort builds on the momentum of the $13 billion we invested in communities from South Carolina to Nevada in 2019. Counting these and other R&D investments, Google’s parent company Alphabet was the largest investor in the U.S. last year, according to a report from the Progressive Policy Institute.  

We look forward to continuing this progress in the year ahead. Here’s a look at our 2020 investments by region:

2020 investments by region


South

Google continues to invest in Atlanta, and we will be welcoming new engineering teams to our growing office there this year. We will also invest in expanded offices and data centers in Texas, Alabama, South Carolina, Virginia and Tennessee. Plus, we’ll open a Google Operations Center in Mississippi to improve our customer support for users and partners around the world. 

Breaking ground at our office development in Atlanta in 2019. We’re expanding our space in Atlanta this year.


Midwest

We recently opened a new Google Cloud office in Chicago and expanded our office in Madison, Wisconsin. We’ll make additional investments in our offices in Detroit, open a new data center in Ohio, and complete the expansion of our data center in Iowa.

Ribbon cutting at our new Google Cloud office in Chicago, Ill., in 2019.



Central

In Colorado, we have the capacity to double our workforce over the next few years, in part by expanding our presence in Boulder. We’ll also invest further in growing data centers in Nebraska and Oklahoma. 

Sundar Pichai speaking at Google’s Mayes County, Okla., data center expansion event.



Northeast

We’re opening our new Hudson Square campus in New York City, where we have the capacity to double our local workforce by 2028. We’re also expanding our office in Pittsburgh, and a bigger office in Cambridge, Massachusetts, is under development. 


West

We are expanding our Google Cloud campus in Seattle and undertaking a major development in Kirkland to open later this year. We’re making office and data center investments in Oregon. In California, we continue to invest in new locations in the Bay Area and Los Angeles. 

We’ll also accelerate our work with businesses, governments, and community organizations to distribute the $1 billion we committed for Bay Area housing. In the first six months of this commitment, we’ve helped to create more than 380 new affordable housing units in the Bay Area, including an investment in a development focused on affordable and inclusive housing for adults with disabilities. There’s more to come in 2020.

In addition to these investments in infrastructure and jobs, we’ll continue our work nationally with local startups, entrepreneurs and small business owners to help Americans access new digital opportunities. Already, Grow with Google and Google for Startups have trained more than 4 million Americans in hundreds of communities across all 50 states. Looking ahead, we're especially excited about our work creating pathways to jobs in the fast-growing field of IT through our two Grow with Google certificate programs.

Our growth is made possible only with the help of our local Googlers, partners and communities who have welcomed Google with open arms. Working together, we will continue to grow our economy, create good jobs for more Americans and make sure everyone can access the opportunities that technology creates.

Bringing Wi-Fi to the residents of Celilo Village

For the past seven years, I have spent time visiting students in rural communities across Washington State, where I live. I share information about science, engineering, technology and math, and specifically talk about software engineering and the projects Google has launched. It’s a true joy of mine to see students excited about technology, and see their young minds thinking about the possibilities ahead of them. 

When I visit students, I get to combine my experience as an engineer at Google, and as a member of the Google American Indian Network, to bring access to technology to those who may not otherwise have it. As an Elder and an Enrolled Member of the Confederated Tribes of Siletz Indians of Oregon, I was honored to take part in Google’s latest initiative to bring Wi-Fi and Chromebooks to Celilo Village, a Native American community on the Columbia River. This project will give residents and students access to the abundance of information found online, and help bridge the digital divide between urban and rural communities.

The village has historical significance to this part of the country, dating back over 11,000 years. Today, it’s home to nearly 100 Native Americans from many tribes, four of which are the Confederated Tribes of Warm Springs, the Confederated Tribes of Yakama, the Confederated Tribes of Umatilla and the Nez Perce Tribe. And until now, the 16 homes in the village had sporadic or no access to Wi-Fi.

Celilo Village schoolhouse

Distributing Chromebooks to village residents in their renovated schoolhouse.

Thanks to a grant from Google, participation from the Google American Indian Network and collaboration with Dufur School, village residents and The Dalles Data Center, all homes now have access to Wi-Fi, and so do their schoolhouse and longhouse. Residents will have access to Chromebooks, and I put together a booklet with instructions on getting online and accessing Google apps.

Daydream VR in Celilo Village

Karen Whitford, a resident and Elder of Celilo Village, tries out the Google Daydream View VR headset.

The idea for the partnership came from Celilo Village resident Bobby Begay, who talked to the Columbia Gorge Discovery Center about funding connectivity for the village. The Discovery Center then worked with Googlers across the company to get the project started, including the Google American Indian Network. We celebrated this special gift with a community event in Celilo Village over the weekend, where we were joined by tribal leaders, policymakers and community members.

My fellow Googlers and I worked directly with the community to get this done, and we plan to keep our partnership going. “I’m excited to see the project come to fruition, but I think even more I’m excited at the opportunity to foster a longer-term relationship with residents of Celilo,” says my colleague Tria Bullard, one of the first Googlers to get involved with the project. We plan to provide more trainings and other computer science-related activities in the future. 

My hope is that with this new window into technology, Celilo Village will continue to grow and thrive for years to come. And who knows: Maybe kids growing up there will become part of the next generation of scientists and engineers.