Tag Archives: Infrastructure

DeepMind AI reduces energy used for cooling Google data centers by 40%

From smartphone assistants to image recognition and translation, machine learning already helps us in our everyday lives. But it can also help us to tackle some of the world’s most challenging physical problems -- such as energy consumption. Large-scale commercial and industrial systems like data centers consume a lot of energy, and while much has been done to stem the growth of energy use, there remains a lot more to do given the world’s increasing need for computing power.

Reducing energy usage has been a major focus for us over the past 10 years: we have built our own super-efficient servers at Google, invented more efficient ways to cool our data centers and invested heavily in green energy sources, with the goal of being powered 100 percent by renewable energy. Compared to five years ago, we now get around 3.5 times the computing power out of the same amount of energy, and we continue to make many improvements each year.

Major breakthroughs, however, are few and far between -- which is why we are excited to share that by applying DeepMind’s machine learning to our own Google data centers, we’ve managed to reduce the amount of energy we use for cooling by up to 40 percent. In any large scale energy-consuming environment, this would be a huge improvement. Given how sophisticated Google’s data centers are already, it’s a phenomenal step forward.

The implications are significant for Google’s data centers, given the technology’s potential to greatly improve energy efficiency and reduce emissions overall. It will also help other companies that run on Google’s cloud to improve their own energy efficiency. While Google is only one of many data center operators in the world, many others are not powered by renewable energy as we are. Every improvement in data center efficiency reduces total emissions into our environment, and with technology like DeepMind’s, we can use machine learning to consume less energy and help address one of the biggest challenges of all -- climate change.

One of the primary sources of energy use in the data center environment is cooling. Just as your laptop generates a lot of heat, our data centers -- which contain servers powering Google Search, Gmail, YouTube, etc. -- also generate a lot of heat that must be removed to keep the servers running. This cooling is typically accomplished via large industrial equipment such as pumps, chillers and cooling towers. However, dynamic environments like data centers make it difficult to operate optimally for several reasons: 

  1. The equipment, how we operate that equipment, and the environment interact with each other in complex, nonlinear ways. Traditional formula-based engineering and human intuition often do not capture these interactions.
  2. The system cannot adapt quickly to internal or external changes (like the weather). This is because we cannot come up with rules and heuristics for every operating scenario.
  3. Each data center has a unique architecture and environment. A custom-tuned model for one system may not be applicable to another. Therefore, a general intelligence framework is needed to understand the data center’s interactions.
To address this problem, we began applying machine learning two years ago to operate our data centers more efficiently. And over the past few months, DeepMind researchers began working with Google’s data center team to significantly improve the system’s utility. Using a system of neural networks trained on different operating scenarios and parameters within our data centers, we created a more efficient and adaptive framework to understand data center dynamics and optimize efficiency.

We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data center -- data such as temperatures, power, pump speeds, setpoints, etc. -- and using it to train an ensemble of deep neural networks. Since our objective was to improve data center energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data center over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints.
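As a rough illustration of the training setup described above, the sketch below pairs each snapshot of sensor readings with the average PUE over the following snapshots -- the regression target for the PUE ensemble. The sensor schema, window size, and values here are invented for illustration; the real system trains ensembles of deep neural networks over data from thousands of sensors.

```python
# Hypothetical sketch of building (features, target) pairs for PUE prediction.
# Sensor names, horizons and numbers are illustrative, not Google's schema.

def pue(total_building_kw, it_kw):
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_building_kw / it_kw

def make_training_pairs(readings, horizon=3):
    """Pair each snapshot's sensor features with the average PUE over the
    next `horizon` snapshots -- the 'average future PUE' regression target."""
    pairs = []
    for i in range(len(readings) - horizon):
        features = readings[i]["sensors"]  # temps, pump speeds, setpoints...
        future = readings[i + 1 : i + 1 + horizon]
        target = sum(pue(r["total_kw"], r["it_kw"]) for r in future) / horizon
        pairs.append((features, target))
    return pairs

# Toy snapshots: [inlet temp (C), pump speed (fraction)], plus power meters.
readings = [
    {"sensors": [22.1, 0.61], "total_kw": 1120.0, "it_kw": 1000.0},
    {"sensors": [22.4, 0.63], "total_kw": 1130.0, "it_kw": 1000.0},
    {"sensors": [22.0, 0.60], "total_kw": 1100.0, "it_kw": 1000.0},
    {"sensors": [21.8, 0.58], "total_kw": 1090.0, "it_kw": 1000.0},
]
pairs = make_training_pairs(readings, horizon=2)
```

The same pattern, with temperature or pressure in place of PUE as the target, gives the two auxiliary ensembles used to check that recommended actions stay within operating constraints.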

We tested our model by deploying it on a live data center. The graph below shows a typical day of testing, including when we turned the machine learning recommendations on and when we turned them off.

Google DeepMind graph showing results of machine learning test on power usage effectiveness in Google data centers

Our machine learning system was able to consistently achieve a 40 percent reduction in the amount of energy used for cooling, which equates to a 15 percent reduction in overall PUE overhead after accounting for electrical losses and other non-cooling inefficiencies. It also produced the lowest PUE the site had ever seen. 
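To see how a 40 percent cut in cooling energy can translate into a 15 percent cut in PUE overhead, consider the back-of-the-envelope arithmetic below. The overhead split is hypothetical, chosen only to be consistent with the percentages above; it is not Google's actual breakdown.

```python
# Illustrative arithmetic only -- the overhead figures are hypothetical.

it_power = 1.0            # normalize IT load to 1
cooling_overhead = 0.045  # cooling energy per unit of IT energy (assumed)
other_overhead = 0.075    # electrical losses, other non-cooling overhead (assumed)

pue_before = it_power + cooling_overhead + other_overhead  # 1.12

cooling_after = cooling_overhead * (1 - 0.40)              # 40% cooling reduction
pue_after = it_power + cooling_after + other_overhead      # 1.102

# PUE overhead is everything above 1.0, so the overhead shrinks by 15%.
overhead_reduction = (pue_before - pue_after) / (pue_before - it_power)
```

With these assumed numbers, a 40 percent cooling cut moves PUE from 1.12 to 1.102, a 15 percent reduction in the overhead above the IT load itself.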

Because the algorithm is a general-purpose framework to understand complex dynamics, we plan to apply this to other challenges in the data center environment and beyond in the coming months. Possible applications of this technology include improving power plant conversion efficiency (getting more energy from the same unit of input), reducing semiconductor manufacturing energy and water usage, or helping manufacturing facilities increase throughput.

We are planning to roll out this system more broadly and will share how we did it in an upcoming publication, so that other data center and industrial system operators -- and ultimately the environment -- can benefit from this major step forward.

More Nordic wind power for our European data centers

At the end of last year, we announced that we were purchasing a whopping 842 megawatts (MW) of additional renewable energy to power our operations and take us one step closer to running 100 percent of our operations on clean energy. Today, we walked further down that path by agreeing to purchase an additional 236 MW of energy from two new wind farms in Norway and Sweden.

These new Nordic power purchase agreements complement our three other Swedish wind deals and enable us to power even more of our European operations with renewable energy. In total, we now have seven purchase agreements in Europe totaling more than 500 MW, and 18 such deals globally, which means we’ve now purchased nearly 2.5 gigawatts (GW) worldwide — the equivalent of taking over 1 million cars off the road.

Photo of wind turbine in Sweden by BMJ via Shutterstock
As with our other power purchase agreements, we’re buying the entire production of these new wind farms, situated in two great areas for onshore wind in Europe. In Norway, power will be generated by a 50-turbine project near Stavanger, which is set to be completed in late 2017. In Sweden, we’re buying power from a 22-turbine project near Mariestad and Töreboda, which will be completed by early 2018. In both cases, we’ve signed long-term contracts that give us price certainty and help wind farm developers secure construction financing, in these cases from companies like BlackRock and Ardian.
One of our key goals is to enable the addition of new renewable energy generation capacity to the grid, rather than drawing power from existing facilities. And thanks to Europe’s increasingly integrated energy market, we’re able to buy wind energy in Norway and Sweden, and consume it elsewhere in Europe.

We’ve known for a long time that reducing energy usage and using renewables makes good business sense — we signed our first major power purchase agreement for 114 MW of Iowa wind in 2010. Others are discovering the benefits of renewables too — in the U.S. alone, companies bought almost 3.5 GW of renewable energy last year. We’re pleased to have played a part in stimulating the market for corporate renewable energy purchasing and doing our share in the effort to mitigate climate change.

Data centers get fit on efficiency

Google’s efforts to build the world’s most efficient data centers are beginning to give back -- in energy. A study just released by the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) shows that in the last five years, data center efficiency has kept energy usage almost flat despite a huge growth in demand for computing power.

In fact, compared to five years ago, we can now deliver over 3.5 times as much computing power for the same amount of electrical power. That means that even though we’re sending more email, watching more YouTube videos, and saving more digital photos, we’re using the same amount of energy.

Let’s dig into some numbers from the report:

  • In 2014, U.S. data centers used 70 billion kWh of energy -- equal to powering more than six million homes for a year.
  • This is a big shift in energy consumption:
       •  From 2000 to 2005, usage grew 90 percent.
       •  From 2005 to 2010, usage grew 24 percent.
       •  From 2010 to 2014, usage grew 4 percent.
  • Energy use is expected to grow at that same 4 percent rate from 2014 to 2020.
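Because the report states growth as a total per period, the periods are easier to compare as average annual rates. A small sketch, assuming compound growth within each period:

```python
# Convert the per-period totals above into average annual growth rates,
# assuming compound growth within each period.

def annualized(total_growth, years):
    """Average annual growth rate implied by total growth over `years`."""
    return (1 + total_growth) ** (1 / years) - 1

periods = {
    "2000-2005": annualized(0.90, 5),  # ~13.7% per year
    "2005-2010": annualized(0.24, 5),  # ~4.4% per year
    "2010-2014": annualized(0.04, 4),  # ~1.0% per year
}
```

Seen this way, annual growth fell from roughly 14 percent to about 1 percent, even as demand for computing kept climbing.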

Stabilizing data center energy usage is great, but at Google, we believe we can go further than simply stopping the growth. As more IT users transition to public clouds and mobile use increases, total energy usage will likely fall even further. On the server side, ultra-efficient cloud capacity replaces older, less efficient corporate data centers, and on the client side, battery-life pressures ensure that mobile devices use much less energy than desktops.

The cloud supports many products at a time, so it can more efficiently distribute resources among many users. That means we can do more with less energy—and businesses can too. In 2013, the Berkeley Lab published research we helped support, indicating that moving all office workers in the United States to the cloud could reduce the energy used by information technology by up to 87 percent. That’s equal to powering the city of Los Angeles for one year.
2013 U.S. Case Study: Energy Efficiency Potential of Cloud-based Software (Berkeley Lab)

Efficiency in data center operations like Google’s comes from shifting to super-efficient computing; improving storage, network and infrastructure; employing more advanced cooling strategies; using better power management software; and consolidating servers.

We are focused on creating platforms where everyone can benefit. Google builds hyperscale data centers that are designed to maximize infrastructure efficiency. We also began publishing our efficiency data in 2008 and have been promoting techniques for more efficient energy use to leaders in the IT industry, starting with the first data center efficiency summit in 2009 and our continued advances with machine learning.

These results show the rapid impact efficiency can have on the industry and the persistent opportunity we have to reduce energy use while creating a more powerful web.

Announcing Google Cloud Platform Education Grants for computer science

While university students are on their summer holidays, internships or jobs, their professors are already hard at work planning for fall courses. These course maps will be at the center of student learning, research and academic growth. Google was founded on the basis of the work that Larry and Sergey did as computer science students at Stanford, and we understand the critical role that teachers play in fostering and inspiring the innovation we see today and will see in the years to come. That’s why we’re excited to offer Google Cloud Platform Education Grants for computer science.

Starting today, university faculty in the United States who teach courses in computer science or related subjects can apply for free credits for their students to use across the full suite of Google Cloud Platform tools, like App Engine and the Cloud Machine Learning Platform. These credits can be used any time during the 2016-17 academic year and give students access to the same tools and infrastructure used by Google engineers.
Students like Duke University undergrad Brittany Wenger are already taking advantage of cloud computing. After watching several women in her family suffer from breast cancer, Brittany used her knowledge of artificial intelligence to create Cloud4Cancer, an artificial neural network built on top of Google App Engine. By analyzing uploaded scans of benign and malignant breast cancer tumors, Cloud4Cancer has learned to distinguish between healthy and unhealthy tissue. It’s providing health care professionals with a powerful diagnostic tool in the fight against cancer.

Google Cloud Platform offers a range of tools and services that are unique among cloud providers. The tool that Brittany used -- Google App Engine -- lets you simply build and run an application without having to configure custom infrastructure. Our Machine Learning platform allows you to build models for any type of data, at any size, and TensorFlow, our open-source machine learning library, lets students experiment with the same techniques directly. Students will also be able to get their hands on one of Cloud Platform’s most popular new innovations: the Cloud Vision API, which allows you to incorporate Google’s state-of-the-art image recognition capabilities into even the most basic web or mobile app.

We look forward to seeing the creative ways that computer science students will use their Google Cloud Platform Education Grants, and will share stories along the way on this blog.

Computer science faculty in the United States can apply here for Education Grants. Students and others interested in Cloud Platform for Higher Education should complete this form to register and stay up to date with the latest from Cloud Platform. For more information on Cloud Platform and its uses for higher education, visit our Google Cloud Platform for Higher Education site.

Source: Education


Powering the Internet with renewable energy

Today we're announcing the largest, and most diverse, purchase of renewable energy ever made by a non-utility company. Google has already committed to purchase more renewable energy than any other company. Now, through a series of new wind and solar projects around the world, we’re one step closer to our commitment to triple our purchases of renewable energy by 2025 and our goal of powering 100% of our operations with clean energy.

842 MW of renewable energy around the world

Today’s agreements will add an additional 842 megawatts of renewable energy capacity to power our data centers. Across three countries, we’re nearly doubling the amount of renewable energy we’ve purchased to date. We’re now up to 2 gigawatts—the equivalent of taking nearly 1 million cars off the road.

These additional 842 megawatts represent a range of locations and technologies, from a wind farm in Sweden to a solar plant in Chile.


These long-term contracts range from 10 to 20 years and provide projects with the financial certainty and scale necessary to build these wind and solar facilities—thus bringing new renewable energy onto the grid in these regions. For our part, these contracts not only help minimize the environmental impact of our services—they also make good business sense by securing good prices.

Our commitment to a sustainable energy future

Since we opened our very first owned data center in 2006, we’ve been working to promote renewable and sustainable energy use in several ways:

  • First, we’re building the world’s most efficient computer infrastructure by designing our data centers to use as little energy as possible.
  • Second, we're driving the renewables industry forward by fully committing to renewable sources. In 2010, we entered our first large-scale renewable power purchase agreement with a wind farm in Iowa, and we subsequently completed a number of similar large-scale energy purchases over the past five years. Today’s announcement is another milestone in this area.
  • Third, we've worked with our utility partners to help promote transformation in the utility sector. In 2013 we created a new program that enables customers like Google to buy large amounts of renewable energy directly from their utilities. Today's announcement includes the first solar project enrolled under that program. And this past summer we announced that our newest data center will be located on the site of a retiring coal plant and will be 100% renewable powered from day one.
  • Fourth, beyond our efforts to power our own operations with renewables, we’ve made separate agreements to invest $2.5 billion in 22 large-scale renewable energy projects over the last five years, from Germany to Kansas to Kenya. These investments are in some of the largest and most transformative renewable energy projects in the world, with the goal of driving renewable energy development not only as a customer but as an investor, and bringing down costs for everyone.

And we’re also working on new technologies and ideas—ranging from Project Sunroof to Makani Power to air quality monitoring—that we hope can make a cleaner energy future an option for many more people.

With world leaders coming together at the COP21 UN conference on climate change in Paris this week, there's no better time to focus on renewable energy. We hope that our efforts play a small part in moving all of us forward in the race to solve climate change.