Tag Archives: Data Centers and Infrastructure

Our new animated series brings data centers to life

If you rely on the internet to search for the answer to a burning question, access work documents or stream your favorite TV show, you may have wondered how you can get the content you want so easily and quickly. You can thank a data center for that. 

Which may make you wonder: What exactly is a data center, and what is its purpose?

Google’s Discovering Data Centers series of short animated videos has the answers. As host of this series, I invite you to join us and learn about these expansive, supercomputer-filled warehouses that we all rely on, yet may know little about.

A loop of an animated video showing a data center campus surrounded by trees, blue sky, power lines, and wind turbines. Three small bubbles appear over the data center with images in each: a computer server to represent storage, wires to represent the power supply, and a fan to represent the cooling infrastructure.

Each video in this series peels back the layers on what makes data centers so fascinating: design, technology, operations and sustainability. Whenever you start navigation in Google Maps, edit a Google Doc or watch a YouTube video on how to fix something, a data center is at work. By watching this series, you’ll better understand how Google’s data centers get you and billions of other users to that content quickly, securely and sustainably. 

Discovering Data Centers will help you understand: 

  • How data centers play a critical role in organizing your and the world’s information.
  • Data center design and how data centers are built to be sustainable. 
  • Our core principles, which show you can depend on us to be available 24/7. 

As the second season of our series gets underway, upcoming topics include: 

  • How hundreds of machines at a data center store data.
  • How our network allows data to travel through and between data centers within seconds. 
  • How encryption of data works to help secure every packet of data stored in our data centers.

To watch this series and see how data centers benefit you, visit our website. Check back monthly for new episodes where I’ll continue to reveal all the layers that make a data center hum. 

Click through the images below to read episode descriptions and take a peek at the engineering marvels that are today’s data centers.


How Joy Jackson prepared for her Google interview

Welcome to the latest edition of “My Path to Google,” where we talk to Googlers, interns and alumni about how they got to Google, what their roles are like and even some tips on how to prepare for interviews.

Today’s post is all about Joy Jackson, a data center technician on the global server operations team, who shares how she went from studying to be a graphic designer to discovering a passion for IT and joining the Google data center team.

What’s your role at Google?

I am currently a data center technician on the Global Server Operations team, leading local projects as well as working with our team to deploy and maintain Google's advanced data center servers and network infrastructure. What I love most about my role is working with a diverse team and seeing how passionate each of us is to make sure that our network is up and running, ensuring users have the best and fastest experience possible.

What does your typical day look like right now?

A typical work day for me right now includes many different duties: physical deployments in the data center, maintaining server and networking infrastructure, and working closely with various partner teams to ensure our goals, missions and projects are successfully delivered.

Tell us about yourself.

I grew up in Charleston, South Carolina, and after graduating high school I left Charleston and went to The Art Institute of Charlotte, where I received my associate’s degree in graphic design. When I am not working, I like to spend my time on graphic projects and photography. Some of my hobbies outside of designing and photos are hiking, doing yoga and most importantly, traveling. I love to meet new people, explore new areas and learn about different cuisines and cultures. 

Can you tell us about your decision to apply to work at Google?

I was interested in Google because of how innovative the company is. I had never applied before and was intimidated because of how huge the company is. When I applied and heard back about interviews, I was extremely nervous because I did not think I would be a good fit due to being at the very early stages of my career.

Joy stands in front of a Google logo across a piece of wood cut in the shape of Virginia.

Joy works at one of Google’s Virginia data centers.

How would you describe your path to your current role at Google?

When I went off to college, I thought my heart was set on becoming a graphic designer and opening my own agency. But as I progressed in life and my career, I found myself more interested in working in IT. I worked hard to transition from what I thought I wanted to do to where I am now. And I am happy I did – I love the work we do. I have had opportunities to work in different data center locations and in different roles, just by learning new skills and opening myself up to reach out to other site locations and their teams.

What inspires you to come in every day?

I am inspired each day to come into work because of the millions of lives we are able to touch. It's just a great feeling knowing that, by the work we are doing, we are able to help so many people stay connected with friends and loved ones.

How did the recruitment process go for you?

I was referred to apply, and I was nervous about not being the right fit. But after my phone interview, I decided to stay open-minded about the process, because I knew I could do the job and it was a perfect fit.

What's one thing you wish you could go back and tell yourself before applying?

I wish I could go back to the moment before I applied and tell myself that it is okay to ask questions! I was so nervous and scared to ask any questions.

What resources did you find most helpful when preparing for the interview?

The resources I found most helpful were sites like LinkedIn Learning, where I took the time to do online courses and training classes and watch tutorials.

Do you have any tips you’d like to share with aspiring Googlers?

Never doubt your abilities to achieve anything you put your mind to. With education, drive and determination, you can reach your goals.


Graphic with a photo of Joy wearing an Android t-shirt on the right, and on the left, text that reads: “My Path to Google, Data Center Technician.”

We now do more computing where there’s cleaner energy

At Google, we care about the energy use of our data centers. In fact, we were the first major company to be carbon-neutral way back in 2007, and we’ve been matching 100% of our annual electricity use with renewable energy purchases since 2017. But we want to go even further. By 2030, we plan to completely decarbonize our electricity use for every hour of every day. One way we can do this is by adjusting our operations in real time so that we get the most out of the clean energy that’s already available. 

And that’s exactly what our newest milestone in carbon-intelligent computing does: Google can now shift moveable compute tasks between different data centers, based on regional hourly carbon-free energy availability. This includes both variable sources of energy such as solar and wind, as well as “always-on” carbon-free energy such as our recently announced geothermal project. This moves us closer to our goal of operating on carbon-free energy everywhere, at all times, by 2030.  

Animated GIF illustrating how Google shifts compute tasks between data centers to better use carbon-free energy, and hence minimize the fleet’s carbon footprint.

Shifting compute tasks across location is a logical progression of our first step in carbon-aware computing, which was to shift compute across time. By enabling our data centers to shift flexible tasks to different times of the day, we were able to use more electricity when carbon-free energy sources like solar and wind are plentiful. Now, with our newest update, we’re also able to shift more electricity use to where carbon-free energy is available.

The amount of computing going on at any given data center varies throughout the day, and differs from region to region. Our carbon-intelligent platform uses day-ahead predictions of how heavily a given grid will be relying on carbon-intensive energy in order to shift computing across the globe, favoring regions where there’s more carbon-free electricity. The new platform does all this while still getting everything that needs to get done, done — meaning you can keep on streaming YouTube videos, uploading Photos, finding directions or whatever else.
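As a rough sketch of the idea (not Google's actual scheduler; the region names, forecast numbers and greedy policy are all hypothetical), a carbon-aware placer can sort region-hours by predicted carbon intensity and fill the cleanest slots first:

```python
# Hypothetical sketch of carbon-aware load shifting. Day-ahead forecasts
# give predicted grid carbon intensity (gCO2e/kWh) per region and hour;
# each flexible job is placed in the cleanest slot with free capacity.

def schedule_flexible_jobs(jobs, forecasts, capacity):
    """jobs: list of job names (one capacity unit each).
    forecasts: {(region, hour): predicted gCO2e per kWh}.
    capacity: {(region, hour): free compute units}."""
    placement = {}
    slots = sorted(forecasts, key=forecasts.get)  # cleanest slots first
    for job in jobs:
        for slot in slots:
            if capacity.get(slot, 0) > 0:
                capacity[slot] -= 1
                placement[job] = slot
                break
    return placement

forecasts = {("iowa", 13): 80, ("iowa", 2): 300,
             ("finland", 13): 120, ("finland", 2): 90}
capacity = {("iowa", 13): 1, ("iowa", 2): 5,
            ("finland", 13): 5, ("finland", 2): 5}
plan = schedule_flexible_jobs(["encode_a", "encode_b"], forecasts, capacity)
# encode_a lands in the cleanest slot ("iowa", 13); once that slot is
# full, encode_b falls back to the next-cleanest, ("finland", 2).
```

A production scheduler also has to weigh latency, data-residency rules and reserved capacity, as the post notes, but the core trade is the same: prefer the region-hours where the grid is predicted to be cleanest.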

We’re applying this first to our media processing efforts, which encode, analyze and process millions of multimedia files like videos uploaded to YouTube, Photos and Drive. Like many computing jobs at Google, these can technically run in many places (of course, limitations like privacy laws apply). Now, Google's global carbon-intelligent computing platform will increasingly reserve and use hourly compute capacity on the cleanest grids available worldwide for these compute jobs — meaning it moves as much energy consumption as possible to times and places where energy is cleaner, minimizing carbon-intensive energy consumption.

Google Cloud’s developers and customers can also prioritize cleaner grids, and maximize the proportion of carbon-free energy that powers their apps by choosing regions with better carbon-free energy (CFE) scores.

To learn more, tune in to the livestream of our carbon-aware computing workshop on June 17 at 8:00 a.m. PT. And for more information on our journey towards 24/7 carbon-free energy by 2030, read CEO Sundar Pichai’s latest blog post.


How we’re minimizing AI’s carbon footprint

A photograph of a textbook about computer architecture.

The book that led to my visit to Google.

When I first visited Google back in 2002, I was a computer science professor at UC Berkeley. My colleague John Hennessy and I were updating our textbook on computer architecture, and Larry Page — who rode a hot-rodded electric scooter at the time — agreed to show me how his then three-year-old company designed its computing for Search. I remember the setup was lean yet powerful: just 6,000 low-cost PC servers and 12,000 PC disks answering 70 million queries around the world, every day. It was my first real look at how Google built its computer systems from the ground up, optimizing for efficiency at every level.

When I joined the company in 2016, it was with the goal of helping research how to maximize the efficiency of computer systems built specifically for artificial intelligence. Last year, Google set an ambitious goal of operating on 24/7 carbon-free energy, everywhere, by the end of the decade. But at the same time, machine learning systems are quickly becoming larger and more capable. What will be the environmental impact of those systems — and how can we neutralize that impact going forward? 

Today, we’re publishing a detailed analysis that addresses both of those questions. It’s an account of the energy- and carbon-costs of training six state-of-the art ML models, including five of our own. (Training a model is like building infrastructure: You spend the energy to train the model once, after which it’s used and reused many times, possibly by hundreds of millions of people.) To our knowledge, it’s the most thorough evaluation of its kind yet published. And while we had reason to believe our systems were efficient, we were encouraged by just how efficient they turned out to be.

For instance, we found that developing the Evolved Transformer model, a more efficient version of the popular Transformer architecture for ML, emitted nearly 100 times less carbon dioxide equivalent than a widely cited estimate. Of the roughly 12.7 terawatt-hours of electricity that Google uses every year, less than 1/200th of a percent of it was spent training our most computationally demanding models.  

What’s more, our analysis found that there already exist many ways to develop and train ML systems even more efficiently: Specially designed models, processors and data centers can dramatically reduce energy requirements, while the right selection of energy sources can go a long way to reduce the carbon that’s emitted during training. In fact, the right combination of model, processor, data center and energy source can reduce the carbon footprint of training an ML system by 1000 times. 

There’s no one easy trick for achieving a reduction that large, so let’s unpack that figure.  Minimizing a system’s carbon footprint is a two-part problem: First you want to minimize the energy the system consumes, then you have to supply that energy from the cleanest source possible.

Our analysis took a closer look at GShard and Switch Transformer, two models recently developed at Google Research. They’re the largest models we’ve ever created, but they both use a technique called sparse activation that enables them to only use a small fraction of their total architecture for a given task. It’s a bit like how your brain uses a small fraction of its 100 billion neurons to help you read this sentence. The result is that these sparse models consume less than one tenth the energy that you’d expect of similarly sized dense models — without sacrificing accuracy.
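To make the sparse-activation idea concrete, here is a toy sketch (not GShard or Switch Transformer themselves; the router and "experts" are illustrative stand-ins) in which each input activates only one of eight expert sub-networks:

```python
# Toy illustration of sparse activation: route each input to one of
# several "experts", so only a fraction of the model runs per input.
NUM_EXPERTS = 8

# Stand-in experts: expert k just multiplies its input by k + 1.
experts = [lambda x, k=k: x * (k + 1) for k in range(NUM_EXPERTS)]

def route(x):
    # A real router is a small learned network; a modulo stands in here.
    return x % NUM_EXPERTS

def sparse_forward(x):
    # Only 1 of 8 experts executes: roughly 1/8 the compute of a dense
    # model that would run all of them and combine the results.
    return experts[route(x)](x)

# Input 10 routes to expert 2, which computes 10 * 3 = 30.
```

The total parameter count can keep growing (more experts) while the per-input compute stays nearly flat, which is why these models can be the largest Google has built yet consume a fraction of the expected energy.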

But to minimize ML’s energy use, you need more than just efficient models — you also need efficient processors and data centers to train and serve them. Google’s Tensor Processing Units (TPUs) are specifically designed for machine learning, which makes them up to five times more efficient than off-the-shelf processors. And the cloud computing data centers that house those TPUs are up to twice as efficient as typical enterprise data centers. 

Once you’ve minimized your energy requirements, you have to think about where that energy originates. The electricity a data center consumes is determined by the grid where it’s located. And depending on what resources were used to generate the electricity on that grid, this may emit carbon. 

The carbon intensity of grids varies greatly across regions, so it really matters where models are trained. For instance, the mix of energy supplying Google’s Iowa data center produces 0.080kg of CO2e per kilowatt hour of electricity, when combining the electricity supplied by the grid and produced by Google’s wind farms in Iowa. That’s 5.4 times less than the U.S. average. 
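The arithmetic behind that comparison is simple: emissions are energy consumed multiplied by the grid's carbon intensity. A quick sketch using the figures above (the training-energy number is invented purely for illustration):

```python
# Emissions = energy consumed x carbon intensity of the supplying grid.
IOWA_INTENSITY = 0.080                    # kg CO2e per kWh (from the post)
US_AVG_INTENSITY = IOWA_INTENSITY * 5.4   # ~0.432 kg CO2e per kWh

def emissions_kg(energy_kwh, intensity_kg_per_kwh):
    return energy_kwh * intensity_kg_per_kwh

energy = 100_000  # kWh, a hypothetical training run
iowa = emissions_kg(energy, IOWA_INTENSITY)        # ~8,000 kg CO2e
us_avg = emissions_kg(energy, US_AVG_INTENSITY)    # ~43,200 kg CO2e
```

The same training run emits over five times less carbon simply by virtue of where it happens, before any model or hardware efficiency gains are counted.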

Any one of these four factors — models, chips, data centers and energy sources — can have a sizable effect on the costs associated with developing an ML system. But their cumulative impact can be enormous.

When John and I updated our textbook with what we’d learned on our visit to Google back in 2002, we wrote that “reducing the power per PC [server]” presented “a major opportunity for the future.” Nearly 20 years later, Google has found many opportunities to streamline its systems — but plenty remain to be seized. As a result of our analysis, we’ve already begun shifting where we train our computationally intensive ML models. We’re optimizing data center efficiency by shifting compute tasks to times when low-carbon power sources are most plentiful. Our Oklahoma data center, in addition to receiving its energy from cleaner sources, will house many of our next generation of TPUs, which are even more efficient than their predecessors. And sparse activation is just one example of the algorithmic ingenuity Google is using to design ML models that work smarter, not harder.

Cleaner data centers, batteries included

On the rare occasions when a Google data center is affected by a power outage, we have to be ready to ramp up millions of watts of backup electricity in seconds. This is a daunting challenge, which our industry has typically met using diesel generators. But now we’re aiming to demonstrate that a better, cleaner solution has advanced far enough to keep the internet up and running. 

In Belgium, we’ll soon install the first ever battery-based system for replacing generators at a hyperscale data center. In the event of a power disruption, the system will help keep our users’ searches, e-mails, and videos on the move—without the pollution associated with burning diesel. 

But even more important is what will happen when Google doesn’t need emergency power. Whereas diesel generators sit idle most of the year, batteries are multi-talented team players: when we’re not using them, they’ll be available as an asset that strengthens the broader electric grid. 

Worldwide, we estimate there are over 20 gigawatts of backup diesel generators in service across the data center industry, representing a massive opportunity to deploy cleaner solutions. Our project in Belgium is a first step that we hope will lay the groundwork for a big vision: a world in which backup systems at data centers go from climate change problems to critical components in carbon-free energy systems. 

How data centers can anchor carbon-free electric grids

Wind and solar power are currently booming around the world, but sunny days and breezy hours don’t always align with a community's energy demand. Large-scale batteries at data centers can address this problem by banking renewable power when it’s abundant, and discharging it when it’s needed. Batteries can also help balance other kinds of variability on power grids, allowing for more cost-effective and efficient operations. Working in partnership with ELIA, the local transmission system operator in Belgium, we’ll strive to make our project a model for how data centers can become anchors for carbon-free electric grids.
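The banking-and-discharging behavior described above can be sketched as a simple dispatch rule (hypothetical numbers and a one-hour time step; a real grid-scale battery also accounts for round-trip losses, degradation and market signals):

```python
# Charge the battery when renewable supply exceeds demand; discharge it
# to cover the shortfall when demand exceeds supply.
def dispatch(supply_mw, demand_mw, stored_mwh, capacity_mwh):
    surplus = supply_mw - demand_mw  # one-hour step, so MW ~ MWh here
    if surplus > 0:
        charge = min(surplus, capacity_mwh - stored_mwh)
        return stored_mwh + charge, 0.0          # bank the excess
    discharge = min(-surplus, stored_mwh)
    return stored_mwh - discharge, discharge     # cover the shortfall

# Breezy hour: surplus wind tops the battery up to its 30 MWh capacity.
state, sent = dispatch(supply_mw=100, demand_mw=60,
                       stored_mwh=10, capacity_mwh=30)
# Calm evening: the battery empties, supplying 30 MWh back to the grid.
state2, sent2 = dispatch(supply_mw=20, demand_mw=50,
                         stored_mwh=30, capacity_mwh=30)
```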

Gif demonstrating data center energy storage

In fact, one reason we chose Belgium as the site for our project is because the local team already has a track record of implementing novel energy ideas. It was the first facility in our global fleet to run entirely without mechanical chillers—one of many reasons that the European Commission recognized it as a top performer for energy efficiency. It’s also the place where we’ve integrated our largest on-site renewable energy installation—more than 10,000 solar panels strong.

Toward a carbon-free world

We’ve been working for years to push Google toward a zero-carbon future: from our achievement of carbon neutrality since 2007, to reaching 100 percent renewable energy every year since 2017, and now pursuing our most ambitious goal yet, 24/7 carbon-free electricity by 2030. Our new battery project will help us operate more cleanly when the power grid goes down, and help the grid itself move towards a carbon-free future.

You can hear more about our broader efforts in Episode 4 of Google’s just-released podcast, "Where the Internet Lives," which gives an inside look at how data centers can lead on clean energy in a world confronting climate change.

Our data centers support Europe’s green economic recovery

In 2020, families, schools and businesses moved online more than ever due to the pandemic. All the Google services you rely on are powered by our data centres, and we’ve had to ensure this infrastructure works for everyone as demand increased—for businesses using Google Cloud and Google Meet, and for anyone who asks a question on Search, watches a YouTube video, or uses Google Maps to get from A to B. 

In the last few weeks, we’ve added new infrastructure in Europe that supports the continent’s digital growth. Last month in Hamina, Finland, we were delighted to welcome Prime Minister Sanna Marin as she visited the construction site of our sixth data center building. Last week, we opened a new data center in Fredericia, Denmark. And just this week in the Netherlands, our second Dutch data center began operating in Middenmeer.

A European green transition, powered by sustainable infrastructure

We’re proud that our data centers operate the cleanest cloud in the industry. They are on average twice as energy efficient as a typical enterprise data center. Compared to five years ago, we now deliver around seven times as much computing power with the same amount of electrical power. 

Last week Europe announced its ambitious 55 percent reduction target for CO2 emissions by 2030, in addition to its 32 percent renewable energy target. Google is helping to accelerate this transition, having supported nearly 1,700 megawatts of new renewable energy projects in Belgium, Sweden, Denmark, Finland and the Netherlands. And we are committed to supporting the EU Climate Pact, as technology will have a critical role to play in making the EU Green Deal vision a reality.

Taking the world’s greenest data center fleet to the next level

Our AI technology helps reduce the energy we use to cool our data centers by 30 percent, and we make it available for use by airports, shopping malls, hospitals, data centers and other commercial buildings and industrial facilities. 

But we’re not stopping there. A few months ago, we announced our Third Decade of Climate Action: an ambitious plan to help build a carbon-free future and operate on clean energy around the clock. This is far more challenging than the traditional approach of matching energy usage with renewable energy, but we’re working to get this done in the next nine years.

Contributing to European growth with our (new) data centers

In addition to enabling the greenest, cleanest cloud, all these sites bring economic growth and employment to local communities and to Europe. In Finland, our data center has brought €1.2 billion in investment and supported 1,700 jobs every year since 2009. During construction of our Denmark data center, we spent over €600 million and supported 2,600 jobs. And in the Netherlands, we’ve directly invested €2.5 billion since 2014.

In the next five years, we expect to anchor €2 billion in new carbon-free energy generation projects and green infrastructure in Europe, helping to develop new technologies to make round-the-clock carbon-free energy cheaper and more widely available. 

Investing in our local communities

Partnerships at the local level make all the difference to communities. We have long worked with local NGOs in our data center communities and have donated millions to important initiatives in Europe, including skills training in cooperation with local colleges and universities. 

We have supported multiple education programmes focused on STEM (science, technology, engineering and maths), as well as environmental and cultural projects. For example, in Denmark we recently supported two projects with the Museum Fredericia that will promote local history through virtual experiences. In the Netherlands, we’ve helped with the preservation of local bee and butterfly populations. And in Ireland, during COVID-19, we’ve assisted vulnerable communities, and have given grants to local schools to provide students with laptops and enable home schooling.

We are proud to invest in Europe’s digital infrastructure, contribute to the local communities we operate in and support Europe’s green transition. This will be a decisive decade, and we are committed to leading by example.

A new podcast explores the unseen world of data centers

Do you ever wonder where it all comes from? The words you’re reading right now, the music you stream or the program your kids use to do their homework? All that stuff can’t be just floating around in space ... can it? The internet has to live somewhere, right? 

Right. Every click you make online reaches across vast distances to retrieve information from racks of powerful computers inside some of the most secure buildings in the world. And then whatever you’re seeking appears in an instant. Even for the people who keep the machines running, the process feels like nothing short of magic. These buildings—where the Internet lives—are called data centers. Each data center exists in a real place, operated by real people in communities like Bridgeport, Alabama and Changhua County, Taiwan.

An animated GIF showing the logo of Where the Internet Lives.

Even at Google, only about one percent of employees ever get to set foot inside a data center. So to demystify these warehouse-scale computing facilities, a small team of Googlers and I spent the last year exploring them. Through the process, we got to know the people who design, build, operate and secure these buildings. We connected with outside experts and community members whose lives intersect with this infrastructure that keeps the digital economy moving. And today, we’re releasing the result of all this work: a new six-episode podcast called Where the Internet Lives.

As you listen, you’ll get a rare glimpse behind the walls and through multiple layers of security, literally going inside the machines that power the internet, guided by the people who keep them humming.

Along the way, you’ll learn how data centers work, what they mean to the communities that host them, the reasons data centers are some of the most secure buildings in the world and how efforts to operate data centers on 24/7 clean energy are transforming electrical grids across the globe.

Subscribe to the podcast now to be transported—at nearly the speed of light—to Where the Internet Lives. 

Click through the images below to read episode descriptions and take a peek at the engineering marvels that are today’s data centers.

One percent of Googlers get to visit a data center, but I did

For years I’ve wondered what it’s like behind the protected walls of a Google data center, and I’m not alone. In my job at Google, I spend my days working with developers. Our data centers are crucial to the work that they do, but most have never actually set foot inside a data center. And until recently, neither had I. I went on a mission to find answers to common questions like: Why are visits so tightly restricted? How secure is a Google data center? How do we meet regulatory requirements? Here's what I found out.

To keep our customers' data safe, we need to make sure the physical structure of the data center is absolutely secure. Each data center is protected with six layers of physical security designed to thwart unauthorized access. Watch the video above to follow my journey through these layers to the core of a data center, and read on to learn even more.

“Least privilege” is the rule to live by

badge swipe

There are two rules strictly enforced at all Google data centers. The first, the “least privilege” protocol, is the idea that someone should have only the bare minimum privileges necessary to perform their job. If your privileges only permit entry to Layer 2, you won’t be able to move on to Layer 3. Each person’s access permissions are checked at badge readers stationed at every access point in the facility, so this protocol is enforced everywhere.
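A minimal sketch of that least-privilege lookup (the badge IDs and layer assignments are hypothetical, not Google's actual access-control system):

```python
# Each badge maps to the exact set of layers its holder may enter;
# anything not explicitly granted is denied by default.
BADGE_ACCESS = {
    "tech-042": {1, 2, 3},   # data center technician: Layers 1-3
    "visitor-007": {1},      # visitor: lobby only
}

def may_enter(badge_id, layer):
    """Grant entry only if this layer is in the badge's permission set."""
    return layer in BADGE_ACCESS.get(badge_id, set())

# A Layer 1-3 badge gets you to Layer 3 and no further.
```

Because every badge reader repeats the same check, access at each point never exceeds what the holder's role requires.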


The second rule prevents “tailgating”: a vehicle or individual closely following another into a restricted area without a badge swipe. If the system detects a door held open for too long, it immediately alerts security personnel. Any gate or door must close before the next vehicle or person can badge in and gain access.

Two security checks: badge first, then circle lock

circle lock

You’ve probably seen two-factor authentication when you try to sign into an account and a one-time password is sent to your phone. We take a similar approach at the data centers to verify a person’s identity and access. At some layers in the data center, you’re required to swipe your badge, then enter a circle lock, or tubular doorway. You walk into a special "half portal" that checks your badge and scans your eyes to gain access to the next layer of the data center. It prevents tailgating because only one person is allowed in the circle lock at a time.

Shipments are received through a secure loading dock

The facility loading docks are a special section of Layer 3, used to receive and send shipments of materials, such as new hardware. Truck deliveries must be approved for access to Layer 3 to enter the dock. For further security, the loading dock room is physically isolated from the rest of the data center, and guard presence is required when a shipment is received or sent.

All hard drives are meticulously tracked

hard drive

Hard drive tracking is important to the security of your data because hard drives contain encrypted sensitive information. Google meticulously tracks the location and status of every hard drive within our data centers—from acquisition to destruction—using barcodes and asset tags. These asset tags are scanned throughout a hard drive's lifecycle in a data center from the time it’s installed to the time it's removed from circulation. Tracking hard drives closely ensures they don’t go missing or end up in the wrong hands.


We also make sure hard drives are properly functioning by doing frequent performance tests. If a component fails a performance test, it’s deemed no longer usable. To prevent any sensitive information from remaining on that disk, we remove it from inventory to be erased and destroyed in Layer 6, Disk Erase. There, the disk erase formatter uses a multi-step process that wipes the disk, overwriting each bit of data with zeros. If the drive can’t be erased for any reason, it’s stored securely until it can be physically destroyed.
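The overwrite-and-verify idea can be illustrated with a toy version that operates on an in-memory buffer standing in for a disk (a real erase tool works at the block-device level and follows formal sanitization procedures):

```python
# Toy illustration of a multi-pass wipe: overwrite every byte with zeros,
# repeat, then verify. The bytearray stands in for a real disk; the pass
# count is an illustrative assumption.

def wipe(disk: bytearray, passes: int = 2) -> bool:
    for _ in range(passes):
        for i in range(len(disk)):
            disk[i] = 0  # replace each byte of data with zeros
    # Verification pass: every byte must now read back as zero.
    return all(b == 0 for b in disk)

disk = bytearray(b"sensitive data")
print(wipe(disk))              # True: wipe verified
print(disk == bytearray(14))   # True: only zeros remain
```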

Layered security extends into the tech itself

Our layered security approach isn’t just a physical safeguard for entering our data centers. It’s also how we protect the hardware and software that live in our data centers. At the deepest layer, most of our server boards and networking equipment are custom-designed by Google. For example, we design chips, such as the Titan hardware security chip, to securely identify and authenticate legitimate Google hardware. 

At the storage layer, data is encrypted while it travels in and out of the data center and when it’s stored at the data center. This means that whether data is traveling over the internet between Google’s facilities or stored on our servers, it’s protected. Google Cloud customers can even supply their own encryption keys and manage them in a third-party key management system deployed outside Google’s infrastructure. This defense-in-depth approach helps to expand our ability to mitigate potential vulnerabilities at every point.
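The customer-supplied-key pattern is a form of envelope encryption: data is encrypted with a per-object data key, and that data key is wrapped with the customer's key, so the provider never needs to hold the customer key at rest. The sketch below shows only the structure; the XOR "cipher" is a stand-in for real cryptography (such as AES) and must never be used for actual security:

```python
# Conceptual sketch of envelope encryption with a customer-supplied key.
# WARNING: XOR here is a toy stand-in for a real cipher like AES; this is
# structure only, not a secure implementation.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt(plaintext: bytes, customer_key: bytes):
    data_key = secrets.token_bytes(32)          # fresh per-object data key
    ciphertext = xor(plaintext, data_key)       # encrypt data with data key
    wrapped_key = xor(data_key, customer_key)   # wrap data key with customer key
    return ciphertext, wrapped_key

def decrypt(ciphertext: bytes, wrapped_key: bytes, customer_key: bytes) -> bytes:
    data_key = xor(wrapped_key, customer_key)   # unwrap using customer key
    return xor(ciphertext, data_key)
```

Because only the wrapped key is stored alongside the ciphertext, losing or revoking the customer key renders the stored data unreadable, which is what gives the customer control.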

To learn more about our global data centers, visit our Data and Security page. We will also be sharing more about our security best practices during the upcoming Google Cloud Next ’20: OnAir event.

Our data centers now work harder when the sun shines and wind blows

Addressing the challenge of climate change demands a transformation in how the world produces and uses energy. Google has been carbon neutral since 2007, and 2019 marks the third year in a row that we’ve matched our energy usage with 100 percent renewable energy purchases. Now, we’re working toward 24x7 carbon-free energy everywhere we have data centers, which deliver our products to billions of people around the world. To achieve 24x7 carbon-free energy, our data centers need to work more closely with carbon-free energy sources like solar and wind. 

New carbon-intelligent computing platform

Our latest advancement in sustainability, developed by a small team of engineers, is a new carbon-intelligent computing platform. We designed and deployed this first-of-its-kind system for our hyperscale (meaning very large) data centers to shift the timing of many compute tasks to when low-carbon power sources, like wind and solar, are most plentiful. This is done without additional computer hardware and without impacting the performance of Google services like Search, Maps and YouTube that people rely on around the clock. Shifting the timing of non-urgent compute tasks—like creating new filter features on Google Photos, YouTube video processing, or adding new words to Google Translate—helps reduce the electrical grid’s carbon footprint, getting us closer to 24x7 carbon-free energy.

Visualization of how we shift compute tasks to different times of day to align with the availability of lower-carbon energy. In this illustration, wind energy in the evening and solar energy during the day.

Each day, at every Google data center, our carbon-intelligent platform compares two types of forecasts for the following day. One of the forecasts, provided by our partner Tomorrow, predicts how the average hourly carbon intensity of the local electrical grid will change over the course of a day. A complementary Google internal forecast predicts the hourly power resources that a data center needs to carry out its compute tasks during the same period. Then, we use the two forecasts to set hour-by-hour guidelines that align compute tasks with times of low-carbon electricity supply. Early results from our pilot suggest that carbon-aware load shifting works: by shifting compute jobs, we can increase the amount of lower-carbon energy we consume.
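The core of the comparison can be sketched as a simple scheduling step: given an hourly carbon-intensity forecast and a budget of flexible task-hours, pack the flexible work into the cleanest hours. The forecast numbers and the greedy packing below are invented for illustration and are not Google's actual algorithm:

```python
# Hypothetical sketch of day-ahead carbon-aware scheduling: choose the
# lowest-carbon hours for flexible compute. Forecast values are invented.

def schedule_flexible_hours(carbon_forecast: list[float],
                            flexible_hours: int) -> list[int]:
    """Return the indices of the lowest-carbon hours for flexible tasks."""
    ranked = sorted(range(len(carbon_forecast)),
                    key=lambda h: carbon_forecast[h])
    return sorted(ranked[:flexible_hours])

# 24 hourly grid carbon-intensity values (gCO2/kWh); lower is cleaner.
forecast = [300, 280, 250, 240, 230, 260, 320, 400, 450, 420, 380, 350,
            330, 340, 360, 390, 430, 470, 440, 410, 370, 330, 310, 290]
print(schedule_flexible_hours(forecast, 4))  # [2, 3, 4, 5]
```

In this made-up forecast the cleanest window falls in the early morning hours 2 through 5, so that is where the flexible work lands; the urgent, around-the-clock workload is untouched, matching the description above.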

Baseline vs carbon-aware load

Data from our pilot illustrates how the new system shifts compute from our baseline (dashed line) to better align with less carbon-intensive times of the day—such as early morning and late evening (solid line)—when wind energy is most plentiful. Gray shading represents times of day when more carbon-intensive energy is present on the grid.

What’s next

The first version of this carbon-intelligent computing platform focuses on shifting tasks to different times of the day, within the same data center. But, it’s also possible to move flexible compute tasks between different data centers, so that more work is completed when and where doing so is more environmentally friendly. Our plan for the future is to shift load in both time and location to maximize the reduction in grid-level CO2 emissions. Our methodology, including performance results of our global rollout, will be shared in upcoming research publications. We hope that our findings inspire other organizations to deploy their own versions of a carbon-intelligent platform, and together, we can continue to encourage the growth of carbon-free electricity worldwide. Learn more about Google’s progress toward a carbon-free future on our Sustainability site.