Author Archives: Urs Hölzle

Keeping our network infrastructure strong amid COVID-19

Google's network supports products that people around the world rely on every day, like YouTube, Search, Maps and Gmail. It also connects Google Cloud customers to their employees and users. As the coronavirus pandemic spreads and more people move to working or learning from home, it’s natural to wonder whether the Google network can handle the load. The short answer is yes. 

We’ve designed our network to perform during times of high demand. The same systems we built to handle peaks like the Cyber Monday online shopping surge, or to stream the World Cup finals, support increased traffic as people turn to Google to find news, connect with others, and get work done during this pandemic. And while we’re seeing more usage for products like Hangouts Meet, and different usage patterns in products like YouTube, peak traffic levels are well within our ability to handle the load. 

Google’s network consists of a system of high-capacity fiber optic cables that encircle the globe, under both land and sea, connecting our data centers to each other, and to you. Traffic flows over our dedicated network, optimized for speed and reliability until we hand it off to more than 3,000 internet service providers (ISPs) in 200+ countries and territories for local delivery—the “last mile”—using hundreds of points of presence and thousands of edge locations around the world.

Handling traffic on Google’s infrastructure and bringing it close to people helps limit the burden on operators—whose networks have different levels of reserve capacity—allowing them to focus on delivering that last mile. Together, we work to provide the best possible experience for browsing, video conferencing, streaming, making purchases online, and more to people around the world. We’re continuing to work with governments and network operators around the globe as we do our part to minimize stress on the system. As part of this, we recently announced that we are temporarily defaulting all videos on YouTube to standard definition.

We also recognize the importance of Google services at a time like this and continue to add capacity to stay ahead of demand. Our dedicated global network deployment and operations team is increasing capacity wherever needed, and, in the event of a disruption, recovers service as quickly as possible. 

This may be a time of global uncertainty, but we're working hard to ensure the Google network is there for everyone, business or consumer, day and night.

Data centers are more energy efficient than ever

While Google is the world’s largest corporate purchaser of renewable energy, we’re also taking action on climate change by minimizing the amount of energy we need to use in the first place. For more than a decade, we’ve worked to make our data centers as energy efficient as possible. Today, a new paper in Science validated our efforts and those of other leaders in our industry. It found that efficiency improvements have kept energy usage almost flat across the globe’s data centers—even as demand for cloud computing has skyrocketed.

The new study shows that while the amount of computing done in data centers increased by about 550 percent between 2010 and 2018, the amount of energy consumed by data centers only grew by six percent during the same time period. The study’s authors note that these energy efficiency gains outpaced anything seen in other major sectors of the economy. As a result, while data centers now power more applications for more people than ever before, they still account for about 1 percent of global electricity consumption—the same proportion as in 2010. 
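
As a rough back-of-the-envelope check on those figures (treating “increased by about 550 percent” as an overall factor of 6.5 and “grew by six percent” as a factor of 1.06; the study’s exact definitions may differ), the energy used per unit of computing fell by roughly a factor of six over the period:

# Back-of-the-envelope check using only the figures quoted above; treat it as
# illustrative rather than a restatement of the study's methodology.
compute_growth = 5.50   # "increased by about 550 percent" -> x6.5 overall
energy_growth = 0.06    # "grew by six percent" -> x1.06 overall

compute_factor = 1 + compute_growth
energy_factor = 1 + energy_growth

energy_per_unit_compute = energy_factor / compute_factor
print(f"Energy per unit of compute, 2018 vs 2010: {energy_per_unit_compute:.2f}x "
      f"(about a {compute_factor / energy_factor:.0f}-fold efficiency gain)")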

What's more, research has consistently shown that hyperscale (meaning very large) data centers are far more energy efficient than smaller, local servers. That means that a person or company can immediately reduce the energy consumption associated with their computing simply by switching to cloud-based software. As the data center industry continues to evolve its operations, this efficiency gap between local computing and cloud computing will continue to grow.

Searching for efficiency

How are data centers squeezing more work out of every electron, year after year? For Google, the answer comes down to a relentless quest to eliminate waste at every level of our operations. We designed highly efficient Tensor Processing Units (the AI chips behind our advances in machine learning) and outfitted all of our data centers with high-performance servers. Starting in 2014, we even began using machine learning to automatically optimize cooling in our data centers. At the same time, we’ve deployed smart temperature, lighting, and cooling controls to further reduce the energy used at our data centers.
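
To make that concrete, here is a minimal, hypothetical sketch of model-based cooling recommendations: it fits a simple model that predicts PUE (power usage effectiveness; see below) from weather and candidate control settings using synthetic data, then picks the settings with the lowest predicted PUE. Every name, number, and the model itself are illustrative; this is not Google’s actual system.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic telemetry: columns are outside temperature (°C), cold-aisle setpoint (°C),
# and fan speed (fraction of maximum). All ranges are invented for illustration.
X = rng.uniform([5.0, 18.0, 0.3], [35.0, 27.0, 1.0], size=(500, 3))
# Synthetic "observed" PUE, used only to give the sketch something to fit.
pue = 1.05 + 0.004 * X[:, 0] - 0.003 * X[:, 1] + 0.08 * X[:, 2] + rng.normal(0, 0.005, 500)

# Fit a simple linear model pue ~ w . [features, 1] by least squares.
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, pue, rcond=None)

def predicted_pue(outside_temp, setpoint, fan_speed):
    return np.array([outside_temp, setpoint, fan_speed, 1.0]) @ w

# Given today's weather, search the allowed settings for the lowest predicted PUE.
outside_temp = 22.0
candidates = [(s, f) for s in np.arange(18.0, 27.5, 0.5) for f in np.arange(0.3, 1.01, 0.05)]
setpoint, fan_speed = min(candidates, key=lambda c: predicted_pue(outside_temp, *c))
print(f"recommended setpoint {setpoint:.1f} °C, fan speed {fan_speed:.0%}, "
      f"predicted PUE {predicted_pue(outside_temp, setpoint, fan_speed):.3f}")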

Our efforts have yielded promising results: Today, on average, a Google data center is twice as energy efficient as a typical enterprise data center. And compared with five years ago, we now deliver around seven times as much computing power with the same amount of electrical power. 

By directly controlling data center cooling, our AI-powered recommendation system is consistently delivering energy savings of around 30 percent. And the average annual power usage effectiveness (PUE) for our global fleet of data centers in 2019 hit a new record low of 1.10, compared with the industry average of 1.67—meaning that Google data centers use roughly one-sixth as much overhead energy for every unit of IT equipment energy.
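
For readers who want the arithmetic behind that comparison: PUE is total facility energy divided by the energy delivered to IT equipment, so everything above 1.0 is overhead such as cooling and power distribution. A quick calculation with the two figures above:

def overhead_per_it_unit(pue):
    """Overhead energy (cooling, power conversion, lighting) per unit of IT energy.
    PUE = (IT energy + overhead energy) / IT energy, so overhead = PUE - 1."""
    return pue - 1.0

google_fleet_pue = 1.10   # 2019 fleet-wide annual average, from the paragraph above
industry_avg_pue = 1.67   # industry average cited above

ratio = overhead_per_it_unit(industry_avg_pue) / overhead_per_it_unit(google_fleet_pue)
print(f"Industry overhead vs. Google overhead: {ratio:.1f}x")   # ~6.7x, i.e. roughly one-sixth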

Leading by example

So where do we go from here? We’ll continue to deploy new technologies and share the lessons we learn in the process, design the most efficient data centers possible, and disclose data on our progress. To learn about our efforts to power the internet using as little power as possible—and how we’re ensuring that the energy we use is carbon-free, around the clock—check out our latest Environment Report or visit our data center efficiency site.

Measuring our impact in data center communities

Over 10 years ago, we built our first data center in Oregon. And a few weeks ago we broke ground on what will be our eighth data center in the U.S., helping to run Google’s products across the country and the world.

These data centers contribute significantly to job growth and income gains at both the national and state level. Even more important are the economic contributions that Google data centers make to the communities they call home.

Today, we’re releasing a report prepared by Oxford Economics that details the economic impact our data centers have had in their local communities. The report concludes that, as of 2016, Google data centers generated $1.3 billion in economic activity across the U.S. and created over 11,000 jobs.

Those 11,000 jobs create a ripple effect: people with greater financial flexibility can support the local economy, which in turn has created an additional 4,700 jobs. In fact, when direct, indirect and induced jobs are considered, the report finds that each Google data center job supports an additional 4.9 jobs throughout the U.S.

Last year, we became the first company of our size to purchase enough energy from sources like wind and solar to exceed the amount of electricity used by our operations around the world, including offices and data centers. This commitment to renewables has economic and environmental benefits. Oxford’s report shows that eight U.S. renewable energy generation projects—most of which are located in states where we have data centers—resulted in over $2 billion of investments, created 2,800 direct jobs, and supported 520 ongoing jobs in maintenance and operations.

What we’re most proud of, however, are the ways we invest in our local communities through workforce development and education. Our community grants program supports important local initiatives, like installing Wi-Fi on school buses for kids with long commutes, and partnering with school districts to develop student STEM programs.

We are proud of our economic impact in communities across the country, but here at Google, it’s about more than just the numbers. It’s about the people we hire and the communities where we live and work.

Meeting our match: Buying 100 percent renewable energy

A little over a year ago, we announced that we were on track to purchase enough renewable energy to match all the electricity we would consume over the following year. We just completed the accounting for Google’s 2017 energy use and it’s official—we met our goal. Google’s total purchase of energy from sources like wind and solar exceeded the amount of electricity used by our operations around the world, including offices and data centers.


What do we mean by “matching” renewable energy? Over the course of 2017, across the globe, for every kilowatt-hour of electricity we consumed, we purchased a kilowatt-hour of renewable energy from a wind or solar farm built specifically for Google. This makes us the first public cloud provider, and the first company of our size, to achieve this feat.


Today, we have contracts to purchase three gigawatts (3GW) of output from renewable energy projects; no corporate purchaser buys more renewable energy than we do. To date, our renewable energy contracts have led to over $3 billion in new capital investment around the world.

The road to 100 percent

We've been working toward this goal for a long time. At the outset of last year, we felt confident that 2017 was the year we'd meet it. Every year, we sign contracts for new renewable energy generation projects in markets where we have operations. From the time we sign a contract, it takes one to two years to build the wind farm or solar field before it begins producing energy. In 2016, our operational projects produced enough renewables to cover 57 percent of the energy we used from global utilities. That same year, we signed a record number of new contracts for wind and solar developments that were still under construction. Those projects began operating in 2017—and that additional output of renewable energy was enough to cover more than 100 percent of what we used during the whole year.


We say that we “matched” our energy usage because it’s not yet possible to “power” a company of our scale by 100 percent renewable energy. It’s true that for every kilowatt-hour of energy we consume, we add a matching kilowatt-hour of renewable energy to a power grid somewhere. But that renewable energy may be produced in a different place, or at a different time, from where we’re running our data centers and offices. What’s important to us is that we are adding new clean energy sources to the electrical system, and that we’re buying that renewable energy in the same amount as what we’re consuming, globally and on an annual basis.
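
As a minimal sketch of that annual, global accounting (every number below is made up, and the real methodology is far more detailed), “matching” means the yearly totals balance even though individual months and locations do not:

# Hypothetical illustration of annual renewable-energy matching: purchases and
# consumption are totaled globally over the year, not matched hour by hour or
# site by site.
monthly_consumption_mwh = [820, 790, 840, 810, 830, 860, 900, 910, 870, 850, 830, 880]  # made up
monthly_purchases_mwh   = [700, 760, 900, 950, 980, 990, 940, 880, 820, 760, 720, 810]  # made up

total_consumed = sum(monthly_consumption_mwh)
total_purchased = sum(monthly_purchases_mwh)

# In some months purchases fall short and in others they exceed use; the goal
# described above is that the annual totals balance (or purchases exceed use).
print(f"Annual match: {100 * total_purchased / total_consumed:.1f}%")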

Google's data center in Eemshaven, The Netherlands.

Looking ahead

We’re building new data centers and offices, and as demand for Google products grows, so does our electricity load. We need to be constantly adding renewables to our portfolio to keep up. So we’ll keep signing contracts to buy more renewable energy. And in those regions where we can’t yet buy renewables, we’ll keep working on ways to help open the market. We also think every energy buyer—individuals and businesses alike—should be able to choose clean energy. We’re working with groups like the Renewable Energy Buyers Alliance and Re-Source Platform to facilitate greater access to renewably sourced energy.


Reaching this goal is an important milestone, but for us it has always been just a first step in our race to a carbon-free future. We want to get to a point where renewables and other carbon-free energy sources actually power our operations every hour of every day. It will take a combination of technology, policy and new deal structures to get there, but we’re excited for the challenge. We can’t wait to get back to work.

Source: Google Cloud


Security in the cloud

Security is one of the biggest issues of our time. Countless companies and governments have lost data because of security incidents. And just one breach could cost millions in fines and lost business—and most importantly, lose customer trust.

As a result, security is increasingly top of mind for CEOs and Boards of Directors. That’s why, this week, I’ll join Google Cloud CEO Diane Greene and many of our colleagues in New York, where we’ll meet with more than 100 CEOs to discuss security in the cloud.

At its most basic level, security is a human issue. Regardless of motive, and whether the attacker is an individual or an organization, cybersecurity attacks are ultimately carried out by people.

Often these attacks rely on exploiting human nature, such as through phishing emails. And it’s people that they ultimately affect. By some accounts, 179 million personal records were exposed through data breaches in 2017 alone.

And as a human issue, security is something we can tackle together.

Leveraging the cloud to protect against threats

Cloud providers offer a vast army of experts to protect against threats—one far larger than almost any internal team a company could invest in. In fact, if businesses were to go it alone, there wouldn’t be enough security professionals in the world to adequately protect every single company and their users.

In industries from financial services to healthcare to retail, companies are relying on the automation and scale offered by the cloud to protect their data and that of their customers—allowing their employees to focus on building their business. Many are coming to the same conclusion we have: In many cases, if you’re not moving to the cloud, you’re risking your business.

Take the CPU vulnerabilities that were disclosed in January, for example. These were major discoveries; they rocked the tech industry. But for the most part, cloud customers could go about their business. Here at Google Cloud, we updated our infrastructure through Live Migration, which required no reboots, no customer downtime, and did not materially impact performance. In fact, we got calls from customers asking if we had updated our systems to protect against the vulnerabilities—because they experienced no impact.
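
For readers curious how a running VM can move to a freshly patched host without a reboot, here is a toy sketch of the general pre-copy idea behind live migration. It is a simplified illustration, not Google’s implementation, and every number in it is invented.

# Toy model of pre-copy live migration: memory is copied to the destination host
# while the VM keeps running, and each pass re-copies only the pages dirtied
# during the previous copy, so the final pause is too short to notice.
PAGES = 100_000            # hypothetical VM memory size, in pages
DIRTY_PER_COPIED = 0.05    # assumed fraction of copied pages re-dirtied during a pass

def live_migrate(pages=PAGES, dirty_per_copied=DIRTY_PER_COPIED,
                 max_passes=30, stop_copy_threshold=50):
    to_copy = pages                         # the first pass copies all memory
    for round_no in range(1, max_passes + 1):
        copied = to_copy
        # While these pages are copied, the still-running VM dirties a number of
        # pages roughly proportional to how long the copy takes.
        to_copy = int(copied * dirty_per_copied)
        print(f"pass {round_no}: copied {copied} pages; {to_copy} dirtied meanwhile")
        if to_copy <= stop_copy_threshold:
            break
    # Brief final pause: copy the last few dirty pages, then resume on the new host.
    print(f"stop-and-copy: {to_copy} pages, then the VM resumes on the patched host")

live_migrate()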

These won’t be the last security vulnerabilities to be uncovered; humans will never write perfect code. But the cloud makes it much easier to stay on top of them. The scale of the cloud security teams that find and mitigate emerging threats, the ability to update many systems at scale, and the automation to scan, update and protect users all contribute to cloud’s unique position to keep information and people secure.


Security at Google Cloud

Security has been paramount to Google from the very beginning. (I would know!) We’ve been operating securely in the cloud for almost 20 years, and we have seven apps with more than a billion users that we protect from threats every single day, and GCP itself connects to more than a billion IPs every day. We believe that security empowers innovation—that if you put security first, everything else will follow.

Security is in the details—and we pay attention at the most granular level. We were the first to introduce SSL email by default in 2010, we created the U2F security token standard in 2014, Chrome was the first browser to support post-quantum crypto in 2016, and in 2017 we introduced Titan, a purpose-built chip to establish hardware root of trust for both machines and peripherals on cloud infrastructure. These examples show the level of depth that we go into when thinking about security, and the role we take in pushing the industry forward to stay on top of evolving threats.

In addition, Google’s Project Zero team hunts for vulnerabilities across the internet, and Google security researchers have been behind the discoveries of “Heartbleed” as well as the recently discovered “Spectre” and “Meltdown.” We also provide incentives to the security community to find and report security bugs through our Vulnerability Reward Program.

We know how complex the security landscape is, and we’ve spent a lot of time thinking about how to solve this tough challenge. We’ve developed principles around security that define how we build our infrastructure, how we build our products, and how we operate.

For example, we believe it’s not enough to build something and try to make it secure after the fact. Security should be fundamental to all design, not bolted on to an old paradigm. That’s why we build security through progressive layers that deliver true defense in depth, meaning our cloud infrastructure doesn’t rely on any one technology to make it secure.

Now more than ever, it’s important for companies to make security an utmost priority and take responsibility for protecting their users. That’s true for Google too. At the end of the day, any organization is accountable to people above all, and user trust is crucial to business. If we don’t get security right, we don’t have a business.

That’s one of the reasons why I’m so passionate about cloud as a means to improve security. Google has always worked to protect users across the internet. With Google Cloud, we’re extending those capabilities to help businesses protect their users as well.

In the coming days, we'll share more about how we're pushing cloud security forward. Stay tuned.

Source: Google Cloud


Freedom of data movement in the cloud era

In January, we joined an amicus brief with other technology companies in a case pending before the Supreme Court involving Microsoft and the U.S. Department of Justice. The companies that joined the brief argue that Congress must act to resolve the complicated policy questions raised by the case, as Congress is best-suited to weigh the important interests of law enforcement, foreign countries, service providers and, of course, the people who use the services.

Pending legislation in the U.S. Congress—the Clarifying Lawful Overseas Use of Data (CLOUD) Act—would make important strides in addressing the issues raised in the Microsoft case by updating the decades-old Electronic Communications Privacy Act. Notably, the bill clarifies that the physical location of data is not a relevant criterion in determining the data disclosure obligations of U.S. service providers.

We wanted to share a little more information on why we think this is important and what it means for our customers and users. Modern distributed networks function in ways that do not focus on data location. As more people and businesses turn to the cloud to keep their data secure and ensure their services are dependable, infrastructure has had to grow and evolve to meet those demands. Global networks offer end users a level of dependability that previously required the most sophisticated backup technologies and significant individual hardware investment. Understanding how a global distributed network like ours works is key to understanding the benefits it offers and the challenges that are presented by laws that focus on where data is stored.

Growth of the public cloud

It’s been an important goal of Internet companies like ours to offer services that can be accessed by hundreds of millions of users no matter where they are. These services have to be fast, reliable, robust, and resilient. From our earliest days, it was essential that our index, with its links to vast swaths of content, be as comprehensive as possible. But beyond that, it was also critical that the service be fast. Increasing the speed of search meant a vastly improved experience for users otherwise accustomed to long load times over slow internet connections.

Through the years, we’ve worked hard to continually improve how we serve users in all corners of the world. From an infrastructure perspective, this has meant focusing on how best to route data securely, balance processing loads and storage needs, and prevent data loss, corruption, and outages.

Public cloud services operate on a global basis, using geographically distributed infrastructure to ensure that the services that run on them have maximum availability and uptime. Data typically no longer resides on a single hard drive or server rack, or even in a single data center. Instead, it must be stored, secured, and made available in a way that allows it to be accessed by the users who depend on it just as easily in India as in Germany.


Focus on the user

The way we handle data is driven by what’s best for our users, regardless of whether that user is an individual or a large enterprise. To provide them with the reliability, efficiency, resiliency, and speed they depend on, data might need to be stored in many different configurations across a global network.

Cloud infrastructure also offers business customers more control over where and how their data is stored, depending on their needs. These customers may choose to store their data in a country or data center near their corporate headquarters, or as close to their users as possible.

With customer needs in mind, cloud providers balance factors ranging from internet bandwidth and network throughput to the likelihood of power outages on available networks. This short video explains how these considerations come to life on a distributed network, using the photo a Gmail user attaches to a message as an example.

Enhancing the security and integrity of your data

As this video explains, individual data files may be broken up into smaller pieces, stored, or moved to keep them safe and accessible. Modern internet networks increasingly transmit and store data intelligently, often moving and replicating data seamlessly between data centers and across borders in order to protect the integrity of the data and maximize efficiency and security for users.
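
As a rough sketch of the chunk-and-replicate idea described above (a simplified illustration, not Google’s actual storage system; the chunk size, replication factor, and region names are all hypothetical):

import hashlib
import itertools

CHUNK_SIZE = 4                      # bytes; deliberately tiny here, real systems use megabytes
REPLICAS = 3                        # hypothetical replication factor
DATA_CENTERS = ["us-east", "europe-west", "asia-south", "us-central"]  # invented names

def store(data: bytes):
    """Split data into chunks and assign each chunk to several separate locations,
    so no single data center holds the only copy and reads can be served from
    whichever replica is closest or healthiest."""
    placement = {}
    dc_cycle = itertools.cycle(DATA_CENTERS)
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        chunk_id = hashlib.sha256(chunk).hexdigest()[:8]   # content-derived chunk name
        placement[chunk_id] = [next(dc_cycle) for _ in range(REPLICAS)]
    return placement

# Example: the photo attached to a Gmail message ends up as several chunks,
# each held in three different regions.
for chunk_id, locations in store(b"photo attached to a Gmail message").items():
    print(chunk_id, "->", locations)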

This technological reality underscores why it’s important that legislative solutions not use data location as a way of determining whether a particular country can exercise jurisdiction over a service provider. As internet providers continue to improve their global networks to better serve their users—whether they’re individuals, businesses, educational institutions or others—it’s important that the law reflects an understanding of technological innovation, and how modern distributed systems function.  

Source: Google Cloud


Our 2017 environmental report

Today, we published our updated Environmental Report, which provides data on Google's environmental sustainability programs. This report closes out 2016, a landmark year marked by three major milestones: 10 years of carbon neutrality, 10 years of the Earth Outreach program, and reaching 100 percent renewable energy for our operations.

Last year, we marked 10 years of operating as a carbon neutral company. In 2007, we committed to aggressively pursuing energy efficiency, renewable energy, and high-quality carbon offsets. Since then, our carbon footprint has grown more slowly than our business. We’ve learned and advanced across these areas in ways we couldn’t have imagined a decade ago—and the work has proven that we can serve a growing number of users while using fewer natural resources.

Most notably, in 2017 Google will reach 100 percent renewable energy for our global operations—including both our data centers and offices. That means that we will directly purchase enough wind and solar electricity annually to account for every unit of electricity we consume, globally. This shift in our energy strategy didn’t just significantly reduce our environmental impact. By pioneering new energy purchasing models that others can follow, we’ve helped drive widescale global adoption of clean energy.

Also marking 10 years is the Earth Outreach program, which gives nonprofit groups resources, tools, and inspiration to leverage the power of Google Earth and other mapping tools for their causes. Earth Outreach is now combining machine learning and cloud computing to build a living, breathing dashboard of the planet. By turning the mountains of geo-data we have into insights and knowledge, we can help guide better decision-making in local communities and at global scale.

A major consequence of society’s “take-make-waste” economic model is climate change, one of the most significant challenges of our time. We believe Google can build tools to improve people’s lives while reducing our dependence on natural resources and fossil fuels. And we’re committed to working with others to empower everyone—businesses, governments, nonprofit organizations, communities, and individuals—to create a more sustainable world.

We’ve shared some new stories on our environment website about renewable energy in Europe and our healthy building materials tool. We also describe how these efforts can positively impact the millions of customers using Google Cloud.

Google is moving in the right direction when it comes to environmental stewardship—but there’s a lot more work to do. We’re looking ahead to the next 10 years of decreasing our impact on the earth while building technology that helps as many people as possible.