Author Archives: Urs Hölzle

Measuring our impact in data center communities

Over 10 years ago, we built our first data center in Oregon. And a few weeks ago we broke ground on what will be our eighth data center in the U.S., helping to run Google’s products across the country and the world.

These data centers contribute significantly to job growth and income gains at both the national and state level. Even more important are the economic contributions that Google data centers make to the communities they call home.

Today, we’re releasing a report, prepared by Oxford Economics, which details the economic impact our data centers have had in their local communities. The report concludes that, as of 2016, Google data centers had generated $1.3 billion in economic activity and over 11,000 jobs across the U.S.

Those 11,000 jobs create a ripple effect: people with greater financial flexibility support the local economy, which has led to an additional 4,700 jobs. In fact, when direct, indirect and induced jobs are considered, the report finds that each Google data center job supports an additional 4.9 jobs throughout the U.S.
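
To make the multiplier arithmetic concrete, here is a back-of-the-envelope sketch in Python. The 4.9 figure is the report's; the direct-job count used here is purely illustrative, not a number from the report.

```python
# Back-of-the-envelope sketch of the jobs multiplier described above.
# MULTIPLIER is the Oxford Economics figure; DIRECT_JOBS is an invented
# illustrative input, not a figure from the report.
DIRECT_JOBS = 2_000   # hypothetical count of direct data center jobs
MULTIPLIER = 4.9      # indirect + induced jobs supported per direct job

supported = DIRECT_JOBS * MULTIPLIER
total = DIRECT_JOBS + supported
print(f"{DIRECT_JOBS:,} direct jobs support ~{supported:,.0f} more, "
      f"for ~{total:,.0f} jobs in total")
```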

Last year, we became the first company of our size to purchase enough energy from sources like wind and solar to exceed the amount of electricity used by our operations around the world, including offices and data centers. This commitment to renewables has economic and environmental benefits. Oxford’s report shows that eight U.S. renewable energy generation projects—most of which are located in states where we have data centers—resulted in over $2 billion of investments, created 2,800 direct jobs, and supported 520 ongoing jobs in maintenance and operations.

What we’re most proud of, however, are the ways we invest in our local communities through workforce development and education. Our community grants program supports important local initiatives, like installing Wi-Fi on school buses for kids with long commutes, and partnering with school districts to develop student STEM programs.

We are proud of our economic impact in communities across the country, but here at Google, it’s about more than just the numbers. It’s about the people we hire and the communities where we live and work.

Meeting our match: Buying 100 percent renewable energy

A little over a year ago, we announced that we were on track to purchase enough renewable energy to match all the electricity we consumed over the next year. We just completed the accounting for Google’s 2017 energy use and it’s official—we met our goal. Google’s total purchase of energy from sources like wind and solar exceeded the amount of electricity used by our operations around the world, including offices and data centers.


What do we mean by “matching” renewable energy? Over the course of 2017, across the globe, for every kilowatt-hour of electricity we consumed, we purchased a kilowatt-hour of renewable energy from a wind or solar farm that was built specifically for Google. This makes us the first public cloud provider, and the first company of our size, to achieve this feat.


Today, we have contracts to purchase three gigawatts (3GW) of output from renewable energy projects; no corporate purchaser buys more renewable energy than we do. To date, our renewable energy contracts have led to over $3 billion in new capital investment around the world.

The road to 100 percent

We've been working toward this goal for a long time. At the outset of last year, we felt confident that 2017 was the year we'd meet it. Every year, we sign contracts for new renewable energy generation projects in markets where we have operations. From the time we sign a contract, it takes one to two years to build the wind farm or solar field before it begins producing energy. In 2016, our operational projects produced enough renewables to cover 57 percent of the energy we used from global utilities. That same year, we signed a record number of new contracts for wind and solar developments that were still under construction. Those projects began operating in 2017—and that additional output of renewable energy was enough to cover more than 100 percent of what we used during the whole year.


We say that we “matched” our energy usage because it’s not yet possible to “power” a company of our scale by 100 percent renewable energy. It’s true that for every kilowatt-hour of energy we consume, we add a matching kilowatt-hour of renewable energy to a power grid somewhere. But that renewable energy may be produced in a different place, or at a different time, from where we’re running our data centers and offices. What’s important to us is that we are adding new clean energy sources to the electrical system, and that we’re buying that renewable energy in the same amount as what we’re consuming, globally and on an annual basis.
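
As a rough illustration of this annual, global accounting, here is a minimal sketch; all figures in it are hypothetical, not Google's actual consumption or purchase numbers.

```python
# Minimal sketch of annual, global renewable "matching" as described above.
# All figures are hypothetical, in gigawatt-hours (GWh).
consumed_gwh = {"data-centers": 6_300, "offices": 700}   # electricity we used
purchased_gwh = {"wind": 5_600, "solar": 1_800}          # renewables we bought

total_consumed = sum(consumed_gwh.values())
total_purchased = sum(purchased_gwh.values())
match_pct = 100 * total_purchased / total_consumed

# 100% or more means every kilowatt-hour consumed was matched by a
# kilowatt-hour of renewable energy purchased somewhere, that year.
print(f"Renewable match: {match_pct:.0f}%")
```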

Google's data center in Eemshaven, the Netherlands.

Looking ahead

We’re building new data centers and offices, and as demand for Google products grows, so does our electricity load. We need to be constantly adding renewables to our portfolio to keep up. So we’ll keep signing contracts to buy more renewable energy. And in those regions where we can’t yet buy renewables, we’ll keep working on ways to help open the market. We also think every energy buyer—individuals and businesses alike—should be able to choose clean energy. We’re working with groups like the Renewable Energy Buyers Alliance and the Re-Source Platform to facilitate greater access to renewably sourced energy.


This program has always been a first step for us, but reaching it is an important milestone in our race to a carbon-free future. We do want to get to a point where renewables and other carbon-free energy sources actually power our operations every hour of every day. It will take a combination of technology, policy and new deal structures to get there, but we’re excited for the challenge. We can’t wait to get back to work.

Source: Google Cloud


Security in the cloud

Security is one of the biggest issues of our time. Countless companies and governments have lost data because of security incidents. And just one breach could cost millions in fines and lost business—and, most importantly, erode customer trust.

As a result, security is increasingly top of mind for CEOs and Boards of Directors. That’s why, this week, I’ll join Google Cloud CEO Diane Greene and many of our colleagues in New York, where we’ll meet with more than 100 CEOs to discuss security in the cloud.

At its most basic level, security is a human issue. Whether performed by individuals or organizations, cybersecurity attacks are ultimately carried out by people, regardless of motive.

Often these attacks rely on exploiting human nature, such as through phishing emails. And it’s people that they ultimately affect. By some accounts, 179 million personal records were exposed through data breaches in 2017 alone.

And as a human issue, security is something we can tackle together.

Leveraging the cloud to protect against threats

Cloud providers offer a vast army of experts to protect against threats—one far larger than almost any internal team a company could invest in. In fact, if businesses were to go it alone, there wouldn’t be enough security professionals in the world to adequately protect every single company and its users.

In industries from financial services to healthcare to retail, companies are relying on the automation and scale offered by the cloud to protect their data and that of their customers—allowing their employees to focus on building their business. Many are coming to the same conclusion we have: In many cases, if you’re not moving to the cloud, you’re risking your business.

Take the CPU vulnerabilities that were disclosed in January, for example. These were major discoveries; they rocked the tech industry. But for the most part, cloud customers could go about their business. Here at Google Cloud, we updated our infrastructure through Live Migration, which required no reboots, no customer downtime, and did not materially impact performance. In fact, we got calls from customers asking if we had updated our systems to protect against the vulnerabilities—because they experienced no impact.

These won’t be the last security vulnerabilities to be uncovered; humans will never write perfect code. But the cloud makes it much easier to stay on top of them. The scale of the cloud security teams that find and mitigate emerging threats, the ability to update many systems at scale, and the automation to scan, update and protect users all contribute to cloud’s unique position to keep information and people secure.


Security at Google Cloud

Security has been paramount to Google from the very beginning. (I would know!) We’ve been operating securely in the cloud for almost 20 years: we protect seven apps with more than a billion users each from threats every single day, and GCP itself connects to more than a billion IPs every day. We believe that security empowers innovation—that if you put security first, everything else will follow.

Security is in the details—and we pay attention at the most granular level. We were the first to introduce SSL email by default in 2010, we created the U2F security token standard in 2014, Chrome was the first browser to support post-quantum crypto in 2016, and in 2017 we introduced Titan, a purpose-built chip to establish hardware root of trust for both machines and peripherals on cloud infrastructure. These examples show the level of depth that we go into when thinking about security, and the role we take in pushing the industry forward to stay on top of evolving threats.

In addition, Google’s Project Zero team hunts for vulnerabilities across the internet, and has been behind the discoveries of “Heartbleed” as well as the recently discovered “Spectre” and “Meltdown.” We also provide incentives to the security community to help us find security bugs through our Vulnerability Reward Program.

We know how complex the security landscape is, and we’ve spent a lot of time thinking about how to solve this tough challenge. We’ve developed principles around security that define how we build our infrastructure, how we build our products, and how we operate.

For example, we believe it’s not enough to build something and try to make it secure after the fact. Security should be fundamental to all design, not bolted on to an old paradigm. That’s why we build security through progressive layers that deliver true defense in depth, meaning our cloud infrastructure doesn’t rely on any one technology to make it secure.

Now more than ever, it’s important for companies to make security a top priority and take responsibility for protecting their users. That’s true for Google too. At the end of the day, any organization is accountable to people above all, and user trust is crucial to business. If we don’t get security right, we don’t have a business.

That’s one of the reasons why I’m so passionate about cloud as a means to improve security. Google has always worked to protect users across the internet. With Google Cloud, we’re extending those capabilities to help businesses protect their users as well.

In the coming days, we'll share more about how we're pushing cloud security forward. Stay tuned.

Source: Google Cloud


Freedom of data movement in the cloud era

In January, we joined an amicus brief with other technology companies in a case pending before the Supreme Court involving Microsoft and the U.S. Department of Justice. The companies that joined the brief argue that Congress must act to resolve the complicated policy questions raised by the case, as Congress is best-suited to weigh the important interests of law enforcement, foreign countries, service providers and, of course, the people who use the services.

Pending legislation in the U.S. Congress—the Clarifying Lawful Overseas Use of Data (CLOUD) Act—would make important strides in addressing the issues raised in the Microsoft case by updating the decades-old Electronic Communications Privacy Act. Notably, the bill clarifies that the physical location of data is not a relevant criterion in determining the data disclosure obligations of U.S. service providers.

We wanted to share a little more information on why we think this is important and what it means for our customers and users. Modern distributed networks function in ways that do not focus on data location. As more people and businesses turn to the cloud to keep their data secure and ensure their services are dependable, infrastructure has had to grow and evolve to meet those demands. Global networks offer end users a level of dependability that previously required the most sophisticated backup technologies and significant individual hardware investment. Understanding how a global distributed network like ours works is key to understanding the benefits it offers and the challenges that are presented by laws that focus on where data is stored.

Growth of the public cloud

It’s been an important goal of Internet companies like ours to offer services that can be accessed by hundreds of millions of users no matter where they are. These services have to be fast, reliable, robust, and resilient. From our earliest days, it was essential that our index, with its links to vast swaths of content, be as comprehensive as possible. But beyond that, it was also critical that the service be fast. Increasing the speed of search meant a vastly improved experience for users otherwise accustomed to long load times over slow internet connections.

Through the years, we’ve worked hard to continually improve how we serve users in all corners of the world. From an infrastructure perspective, this has meant focusing on how best to route data securely, balance processing loads and storage needs, and prevent data loss, corruption, and outages.

Public cloud services operate on a global basis, using geographically distributed infrastructure to ensure that the services that run on them have maximum availability and uptime. Data typically no longer resides on a single hard drive or server rack, or even in a single data center. Instead, it must be stored, secured, and made available in a way that allows it to be accessed by the users who depend on it just as easily in India as in Germany.


Focus on the user

The way we handle data is driven by what’s best for our users, regardless of whether that user is an individual or a large enterprise. To provide them with the reliability, efficiency, resiliency, and speed they depend on, data might need to be stored in many different configurations across a global network.

Cloud infrastructure also offers business customers more control over where and how their data is stored, depending on their needs. These customers may choose to store their data in a country or data center near their corporate headquarters, or as close to their users as possible.

With customer needs in mind, cloud providers balance factors ranging from internet bandwidth and network throughput to the likelihood of power outages on available networks. This short video explains how these considerations come to life on a distributed network, using the photo a Gmail user attaches to a message as an example.

Enhancing the security and integrity of your data

As this video explains, individual data files may be broken up into smaller pieces, stored, or moved to keep them safe and accessible. Modern internet networks increasingly transmit and store data intelligently, often moving and replicating data seamlessly between data centers and across borders in order to protect the integrity of the data and maximize efficiency and security for users.
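
To make the idea concrete, here is a toy sketch of chunking and cross-region replication. It is purely illustrative; the chunk sizes, data center names, and placement scheme are invented and bear no relation to the actual storage systems involved.

```python
# Toy illustration of splitting a file into pieces and replicating them
# across data centers, as described above. Not how Google storage actually
# works; it only makes the concept concrete.
import itertools

CHUNK_SIZE = 8          # bytes per chunk (tiny, for demonstration)
REPLICAS = 3            # copies kept of each chunk
DATA_CENTERS = ["us-east", "eu-west", "asia-south", "us-west"]

def chunk(data: bytes, size: int = CHUNK_SIZE) -> list[bytes]:
    """Split data into fixed-size pieces."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def place(n_chunks: int) -> dict[int, list[str]]:
    """Assign each chunk to several data centers, round-robin."""
    ring = itertools.cycle(DATA_CENTERS)
    return {i: [next(ring) for _ in range(REPLICAS)] for i in range(n_chunks)}

pieces = chunk(b"a photo attached to a Gmail message")
for i, locations in place(len(pieces)).items():
    print(f"chunk {i} ({pieces[i]!r}) -> {locations}")
```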

This technological reality underscores why it’s important that legislative solutions not use data location as a way of determining whether a particular country can exercise jurisdiction over a service provider. As internet providers continue to improve their global networks to better serve their users—whether they’re individuals, businesses, educational institutions or others—it’s important that the law reflects an understanding of technological innovation, and how modern distributed systems function.  

Source: Google Cloud


Our 2017 environmental report

Today, we published our updated Environmental Report, which provides data on Google’s environmental sustainability programs. This report closes out 2016, a landmark year marked by three major milestones: 10 years of carbon neutrality, 10 years of the Earth Outreach program, and reaching 100 percent renewable energy for our operations.

Last year, we marked 10 years of operating as a carbon neutral company. In 2007, we committed to aggressively pursuing energy efficiency, renewable energy, and high-quality carbon offsets. Since then, our carbon footprint has grown more slowly than our business. We’ve learned and advanced across these areas in ways we couldn’t have imagined a decade ago—and the work has proven that we can serve a growing number of users while using fewer natural resources.

Most notably, in 2017 Google will reach 100 percent renewable energy for our global operations—including both our data centers and offices. That means that we will directly purchase enough wind and solar electricity annually to account for every unit of electricity we consume, globally. This shift in our energy strategy didn’t just significantly reduce our environmental impact. By pioneering new energy purchasing models that others can follow, we’ve helped drive widescale global adoption of clean energy.

Also marking 10 years is the Earth Outreach program, which gives nonprofit groups resources, tools, and inspiration to leverage the power of Google Earth and other mapping tools for their causes. Earth Outreach is now combining machine learning and cloud computing to build a living, breathing dashboard of the planet. By turning the mountains of geo-data we have into insights and knowledge, we can help guide better decision-making in local communities and at global scale.

A major consequence of society’s “take-make-waste” economic model is climate change, one of the most significant challenges of our time. We believe Google can build tools to improve people’s lives while reducing our dependence on natural resources and fossil fuels. And we’re committed to working with others to empower everyone—businesses, governments, nonprofit organizations, communities, and individuals—to create a more sustainable world.

We’ve shared some new stories on our environment website about renewable energy in Europe and our healthy building materials tool. We also describe how these efforts can positively impact the millions of customers using Google Cloud.

Google is moving in the right direction when it comes to environmental stewardship—but there’s a lot more work to do. We’re looking ahead at the next 10 years of decreasing our impact on the earth while building technology that helps as many people as possible.

Schlumberger chooses GCP to deliver new oil and gas technology platform

Google Cloud has a simple but steadfast mission: Give companies technology for new and better ways to serve their customers. We handle the network, computing and security chores; you use our software-defined infrastructure, global databases and artificial intelligence to grow your business with speed and at scale.

A great example of this work is our collaboration with Schlumberger, which has selected Google Cloud as its strategic provider for its clients’ digital journey to the cloud.

For over 90 years, Schlumberger has worked with clients in the oil and gas industry. In this work, Schlumberger generates and uses large amounts of data to safely and efficiently manage hydrocarbon exploration and production. Schlumberger has developed DELFI*, a unique cognitive exploration and production (E&P) environment that runs on GCP and spans the entire workflow from exploration to production. Customers can combine DELFI with their own proprietary data and science for new insights and faster results.

Today at the Schlumberger customer event SIS Global Forum, I talked about the new ways Google Cloud and Schlumberger are working together. This unique, multi-year collaboration encompasses a range of technologies:

  • Big data: Schlumberger launched the DELFI cognitive E&P environment and deployed an E&P Data Lake based on Google BigQuery, Cloud Spanner and Cloud Datastore, with more than 100 million data items comprising over 30TB of petrotechnical data (a query sketch follows this list).

  • Software platforms: Schlumberger announced that its petrotechnical software platforms, such as Petrel* E&P and INTERSECT*, now run on Google Cloud Platform and are integrated into DELFI.

  • High performance computing: Since announcing our relationship at Google Cloud Next, we’ve worked together to optimize Schlumberger’s Omega* geophysical data processing platform to run at a scale not possible in traditional data center environments. Using NVIDIA GPUs and Custom Machine Types on Google Cloud, Schlumberger has deployed compute capacity of over 35 petaflops and 10PB of storage on GCP.

  • Artificial intelligence: Schlumberger leverages TensorFlow for complex petrotechnical interpretation of seismic and wellbore data, as well as automation of well-log quality control and 3D seismic interpretation.

  • Extensibility: Schlumberger adopted the Apigee API management platform to provide openness and extensibility, so that its clients and partners can add their own intellectual property and workflows in DELFI.
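
To give a flavor of what querying such a data lake can look like, here is a minimal sketch using the BigQuery Python client. The project, dataset, table, and column names are hypothetical, invented for illustration; they are not part of the actual DELFI environment.

```python
# Hypothetical sketch of querying a petrotechnical data lake on BigQuery.
# The project, dataset, table and column names below are invented.
from google.cloud import bigquery

client = bigquery.Client(project="example-eandp-project")

query = """
    SELECT well_id, AVG(porosity) AS avg_porosity
    FROM `example-eandp-project.well_data.well_logs`
    GROUP BY well_id
    ORDER BY avg_porosity DESC
    LIMIT 10
"""

# client.query() starts the job; .result() waits for and iterates the rows.
for row in client.query(query).result():
    print(row.well_id, row.avg_porosity)
```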

“To improve productivity and performance, DELFI enables our customers to take advantage of our E&P domain science and knowledge, while at the same time fully using disruptive digital technologies from Google Cloud,” said Ashok Belani, Executive Vice President of Technology, Schlumberger. “This approach ensures that all data is considered when making critical decisions.”

By running on GCP, Schlumberger’s customers can supercharge their applications, whether it’s training machine learning models on our infrastructure, or easier software development and deployment via Kubernetes and containers. We’re also building upon new collaborations with other companies like Nutanix to give Schlumberger the flexibility to run its applications wherever they need to be—on-premises and in the cloud.

Our collaboration with Schlumberger is just the beginning. We’re thrilled the team has chosen Google Cloud to help deliver security, accessibility and innovation through their next-generation energy exploration and production technology.

*Mark of Schlumberger

Source: Google Cloud


Bolstering security across Google Cloud

San Francisco — Today at Google Cloud Next ‘17, we launched the following new features for Google Cloud Platform (GCP) and G Suite that are designed to help safeguard your company’s assets and prevent disruption to your business:

  • Identity-Aware Proxy (IAP) for GCP (now in beta) allows you to manage granular access to applications running on GCP based on risk, rather than the “all-or-nothing” approach of VPN access. It provides more secure application access from anywhere, with access determined by user identity and group membership. IAP is easy to deploy, and can be integrated with phishing-resistant security keys.

  • Data Loss Prevention (DLP) API for GCP (now in beta) lets you scan for more than 40 sensitive data types so you can identify and redact sensitive data. DLP does deep content analysis to help ensure that no matter what you want to keep safe, from credit cards to account numbers, you know where it is and that it’s protected at the level you want. DLP API for GCP joins DLP for Gmail and Drive, allowing admins to write policies that manage sensitive data in ways that aren’t possible on any other cloud. (A short example follows this list.)

  • Key Management Service for GCP (now generally available) allows you to generate, use, rotate and destroy symmetric encryption keys for use in the cloud. It gives customers the ability to manage their encryption keys in a multi-tenant cloud service, without the need to maintain an on-premises key management system or hardware security module.

  • Security Key Enforcement (SKE) for GCP and G Suite (now generally available) allows you to require security keys be used as the two-step verification factor for stronger authentication whenever a user signs into G Suite or accesses a GCP resource. SKE is easy on admins, easy on users and hard on phishers.

  • Google Vault for Google Drive, Team Drives and Google Groups (now generally available) is the eDiscovery and compliance solution for G Suite. Vault allows customers to set retention policies, place legal holds, perform searches across Drive, Gmail, Hangouts and Groups, and export search results to support their legal and compliance requirements.

  • Titan is Google's purpose-built chip to establish hardware root of trust for both machines and peripherals on cloud infrastructure, allowing us to more securely identify and authenticate legitimate access at the hardware level. Purpose-built hardware such as Titan is a part of Google’s layered security architecture, spanning the physical security of data centers to secure boot across hardware and software to operational security.
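
To give a sense of what the DLP API looks like in practice, here is a minimal sketch that scans a string for a credit card number using the Python client. The project ID is a placeholder, and the snippet uses the current google-cloud-dlp client library, which postdates the launch described above.

```python
# Minimal sketch: scanning text for sensitive data with the DLP API.
# "example-project" is a placeholder project ID; this uses the modern
# google-cloud-dlp Python client, not the exact API surface at launch.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
response = client.inspect_content(
    request={
        "parent": "projects/example-project",
        "inspect_config": {"info_types": [{"name": "CREDIT_CARD_NUMBER"}]},
        "item": {"value": "Card on file: 4111-1111-1111-1111"},
    }
)

# Each finding names the detected info type and how likely the match is.
for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood.name)
```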

By baking security into everything we do, and by offering innovative capabilities that build on this secure foundation, we create many layers that prevent and defend against attacks and enforce enterprise security policies, so our customers can feel confident partnering with us to achieve their business goals.