Tag Archives: google cloud

Introducing the security center for G Suite—security analytics and best practices from Google

We want to make it easy for you to manage your organization’s data security. A big part of this is making sure you and your admins can access a bird’s eye view of your security—and, more importantly, that you can take action based on timely insights.

Today, we’re introducing the security center for G Suite, a tool that brings together security analytics, actionable insights and best practice recommendations from Google to empower you to protect your organization, data and users.

With the security center, key executives and admins can do things like:

1. See a snapshot of important security metrics in one place. 

Get insights into suspicious device activity, visibility into how spam and malware are targeting users within your organization and metrics to demonstrate security effectiveness—all in a unified dashboard.
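
The dashboard itself lives in the Admin console, but some of the underlying audit signals can also be pulled programmatically through the Admin SDK Reports API. The sketch below is illustrative only, not part of the security center; the service account file, admin address and domain-wide delegation setup are assumptions for the example.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumed setup: a service account with domain-wide delegation and the
# admin.reports.audit.readonly scope; the file name and admin address
# below are placeholders.
SCOPES = ["https://www.googleapis.com/auth/admin.reports.audit.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
).with_subject("admin@example.com")

reports = build("admin", "reports_v1", credentials=creds)

# Pull the ten most recent failed logins across the domain.
events = reports.activities().list(
    userKey="all",
    applicationName="login",
    eventName="login_failure",
    maxResults=10,
).execute()

for activity in events.get("items", []):
    print(activity["id"]["time"], activity["actor"]["email"])
```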


2. Stay ahead of potential threats. 

Admins can now examine security analytics to flag threats. For example, your team can see which users are being targeted by phishing so that you can head off potential attacks, and when Google Drive files trigger DLP rules, you get a heads-up that helps you prevent data exfiltration.
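
Drive DLP rules are configured in the Admin console rather than through code, but the standalone Cloud DLP API gives a feel for the kind of content inspection such rules perform. A minimal sketch, assuming the google-cloud-dlp client library and a placeholder project ID:

```python
from google.cloud import dlp_v2

PROJECT_ID = "my-project"  # placeholder

dlp = dlp_v2.DlpServiceClient()
response = dlp.inspect_content(
    request={
        "parent": f"projects/{PROJECT_ID}",
        "inspect_config": {
            "info_types": [
                {"name": "CREDIT_CARD_NUMBER"},
                {"name": "EMAIL_ADDRESS"},
            ],
            "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
        },
        # In a real pipeline this value would be file content, not a literal.
        "item": {"value": "Reach me at jane@example.com, card 4111-1111-1111-1111"},
    }
)
for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood.name)
```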


3. Reduce risk by adopting security health recommendations.

Security health analyzes your existing security posture and gives you customized advice to secure your users and data. These recommendations cover issues ranging from how your data is stored and how your files are shared to your mobility and communications settings.


Get started

More than 3.5 million organizations rely on G Suite to collaborate securely. If you’re a G Suite Enterprise customer, you’ll be able to access the security center within the Admin console automatically in the next few days. These instructions can help admins get started, and here are some security best practices to keep in mind.

If you’re new to G Suite, learn more about how you can collaborate, store and communicate securely.

Source: Google Cloud


Cloud AutoML: Making AI accessible to every business

When we both joined Google Cloud just over a year ago, we embarked on a mission to democratize AI. Our goal was to lower the barrier of entry and make AI available to the largest possible community of developers, researchers and businesses.

Our Google Cloud AI team has been making good progress towards this goal. In 2017, we introduced Google Cloud Machine Learning Engine to help developers with machine learning expertise easily build ML models that work on any type of data, of any size. We showed how modern machine learning services, delivered as APIs including Vision, Speech, NLP, Translation and Dialogflow, could be built upon pre-trained models to bring unmatched scale and speed to business applications. Kaggle, our community of data scientists and ML researchers, has grown to more than one million members. And today, more than 10,000 businesses are using Google Cloud AI services, including companies like Box, Rolls Royce Marine, Kewpie and Ocado.

But there’s much more we can do. Currently, only a handful of businesses in the world have access to the talent and budgets needed to take full advantage of advancements in ML and AI. There’s a very limited number of people who can create advanced machine learning models. And if you’re one of the companies that has access to ML/AI engineers, you still have to manage the time-intensive and complicated process of building your own custom ML model. While Google has offered pre-trained machine learning models via APIs that perform specific tasks, there's still a long road ahead if we want to bring AI to everyone.

To close this gap, and to make AI accessible to every business, we’re introducing Cloud AutoML. Cloud AutoML helps businesses with limited ML expertise start building their own high-quality custom models by using advanced techniques like learning2learn and transfer learning from Google. We believe Cloud AutoML will make AI experts even more productive, advance new fields in AI and help less-skilled engineers build powerful AI systems they previously only dreamed of.

Our first Cloud AutoML release will be Cloud AutoML Vision, a service that makes it faster and easier to create custom ML models for image recognition. Its drag-and-drop interface lets you easily upload images, train and manage models, and then deploy those trained models directly on Google Cloud. Early results using Cloud AutoML Vision to classify popular public datasets like ImageNet and CIFAR have shown more accurate results with fewer misclassifications than generic ML APIs.

Here’s a little more on what Cloud AutoML Vision has to offer:

  • Increased accuracy: Cloud AutoML Vision is built on Google’s leading image recognition approaches, including transfer learning and neural architecture search technologies. This means you’ll get a more accurate model even if your business has limited machine learning expertise. (A rough sketch of the transfer learning idea follows this list.)

  • Faster turnaround time to production-ready models: With Cloud AutoML, you can create a simple model in minutes to pilot your AI-enabled application, or build out a full, production-ready model in as little as a day.

  • Easy to use: AutoML Vision provides a simple graphical user interface that lets you specify data, then turns that data into a high quality model customized for your specific needs.
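
Google hasn't published AutoML Vision's internals, but the transfer learning idea named above is easy to sketch with an off-the-shelf library: start from a network pre-trained on a large dataset, freeze its feature extractor and train only a small task-specific head. A minimal Keras illustration; the five-class head and the training data are made up for the example:

```python
import tensorflow as tf

# Load a network pre-trained on ImageNet, without its classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)
base.trainable = False  # freeze the pre-trained feature extractor

# Attach a small head for the custom task (say, 5 product categories).
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_images, train_labels, epochs=5)  # train only the new head
```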


"Urban Outfitters is constantly looking for new ways to enhance our customers’ shopping experience," says Alan Rosenwinkel, Data Scientist at URBN. "Creating and maintaining a comprehensive set of product attributes is critical to providing our customers relevant product recommendations, accurate search results and helpful product filters; however, manually creating product attributes is arduous and time-consuming. To address this, our team has been evaluating Cloud AutoML to automate the product attribution process by recognizing nuanced product characteristics like patterns and neckline styles. Cloud AutoML has great promise to help our customers with better discovery, recommendation and search experiences."

Mike White, CTO and SVP, for Disney Consumer Products and Interactive Media, says: “Cloud AutoML’s technology is helping us build vision models to annotate our products with Disney characters, product categories and colors. These annotations are being integrated into our search engine to enhance the impact on Guest experience through more relevant search results, expedited discovery and product recommendations on shopDisney.”

And Sophie Maxwell, Conservation Technology Lead at the Zoological Society of London, tells us: "ZSL is an international conservation charity devoted to the worldwide conservation of animals and their habitats. A key requirement to deliver on this mission is to track wildlife populations to learn more about their distribution and better understand the impact humans are having on these species. In order to achieve this, ZSL has deployed a series of camera traps in the wild that take pictures of passing animals when triggered by heat or motion. The millions of images captured by these devices are then manually analysed and annotated with the relevant species, such as elephants, lions and giraffes, etc., which is a labour-intensive and expensive process. ZSL’s dedicated Conservation Technology Unit has been collaborating closely with Google’s Cloud ML team to help shape the development of this exciting technology, which ZSL aims to use to automate the tagging of these images—cutting costs, enabling wider-scale deployments and gaining a deeper understanding of how to conserve the world’s wildlife effectively."

If you’re interested in trying out AutoML Vision, you can request access via this form.

AutoML Vision is the result of our close collaboration with Google Brain and other Google AI teams, and is the first of several Cloud AutoML products in development. While we’re still at the beginning of our journey to make AI more accessible, we’ve been deeply inspired by what our 10,000+ customers using Cloud AI products have been able to achieve. We hope the release of Cloud AutoML will help even more businesses discover what’s possible through AI.

Source: Google Cloud


Expanding our global infrastructure with new regions and subsea cables

At Google, we've spent $30 billion improving our infrastructure over three years, and we’re not done yet. From data centers to subsea cables, Google is committed to connecting the world and serving our Cloud customers, and today we’re excited to announce that we’re adding three new submarine cables, and five new regions.

We’ll open our Netherlands and Montreal regions in the first quarter of 2018, followed by Los Angeles, Finland, and Hong Kong – with more to come. Then, in 2019 we’ll commission three subsea cables: Curie, a private cable connecting Chile to Los Angeles; Havfrue, a consortium cable connecting the U.S. to Denmark and Ireland; and the Hong Kong-Guam Cable system (HK-G), a consortium cable interconnecting major subsea communication hubs in Asia.  

Together, these investments further improve our network—the world’s largest—which by some accounts delivers 25% of worldwide internet traffic. Companies like PayPal leverage our network and infrastructure to run their businesses effectively.

“At PayPal, we process billions of transactions across the globe, and need to do so securely, instantaneously and economically. As a result, security, networking and infrastructure were key considerations for us when choosing a cloud provider,” said Sri Shivananda, PayPal’s Senior Vice President and Chief Technology Officer. “With Google Cloud, we have access to the world’s largest network, which helps us reach our infrastructure goals and best serve our millions of users.”

Figure 1. Diagram shows existing GCP regions and upcoming GCP regions
Figure 2. Diagram shows three new subsea cable investments, expanding capacity to Chile, Asia Pacific and across the Atlantic

Curie cable

Our investment in the Curie cable (named after renowned scientist Marie Curie) is part of our ongoing commitment to improve global infrastructure. In 2008, we were the first tech company to invest in a subsea cable as a part of a consortium. With Curie, we become the first major non-telecom company to build a private intercontinental cable.

By deploying our own private subsea cable, we help improve global connectivity while providing value to our customers. Owning the cable ourselves has some distinct benefits. Since we control the design and construction process, we can fully define the cable’s technical specifications, streamline deployment and deliver service to users and customers faster. Also, once the cable is deployed, we can make routing decisions that optimize for latency and availability.

Curie will be the first subsea cable to land in Chile in almost 20 years. Once deployed, Curie will be Chile’s largest single data pipe. It will serve Google users and customers across Latin America.

Havfrue cable

To increase capacity and resiliency in our North Atlantic systems, we’re working with Facebook, Aqua Comms and Bulk Infrastructure to build a direct submarine cable system connecting the U.S. to Denmark and Ireland. This cable, called Havfrue (Danish for “mermaid”), will be built by TE SubCom and is expected to come online by the end of 2019. The marine route survey, during which the supplier determines the specific route the cable will take, is already underway.

HK-G cable

In the Pacific, we’re working with RTI-C and NEC on the Hong Kong-Guam cable system. Together with Indigo and other existing subsea systems, this cable creates multiple scalable, diverse paths to Australia, increasing our resilience in the Pacific. As a result, customers will experience improved capacity and latency from Australia to major hubs in Asia. It will also increase our network capacity at our new Hong Kong region.

Figure 3. A complete list of Google’s subsea cable investments. New cables in this announcement are highlighted in yellow. Google subsea cables provide reliability, speed and security not available from any other cloud.

Google has direct investment in 11 cables, including those planned or under construction. The three cables highlighted in yellow are being announced in this blog post. (In addition to these 11 cables where Google has direct ownership, we also lease capacity on numerous additional submarine cables.)

What does this mean for our customers?

These new investments expand our existing cloud network. The Google network has over 100 points of presence (map) and over 7,500 edge caching nodes (map). This investment means faster and more reliable connectivity for all our users.

Simply put, it wouldn’t be possible to deliver products like Machine Learning Engine, Spanner, BigQuery and other Google Cloud Platform and G Suite services at the quality of service users expect without the Google network. Our cable systems provide the speed, capacity and reliability Google is known for worldwide, and at Google Cloud, our customers are able to make use of the same network infrastructure that powers Google’s own services.

While we haven’t hastened the speed of light, we have built a superior cloud network as a result of the well-provisioned direct paths between our cloud and end-users, as shown in the figure below.


Figure 4. The Google network offers better reliability, speed and security performance as compared with the nondeterministic performance of the public internet, or other cloud networks. The Google network consists of fiber optic links and subsea cables between 100+ points of presence, 7500+ edge node locations, 90+ Cloud CDN locations, 47 dedicated interconnect locations and 15 GCP regions.
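
To put the speed-of-light remark in numbers: light in optical fiber travels at roughly the vacuum speed of light divided by the glass's refractive index (about 1.47), which sets a hard floor on latency no matter how well a network is provisioned. A back-of-the-envelope sketch, with a hypothetical 10,000 km route standing in for a long subsea path:

```python
C_KM_PER_S = 299_792        # speed of light in vacuum, km/s
REFRACTIVE_INDEX = 1.47     # typical for optical fiber
fiber_speed = C_KM_PER_S / REFRACTIVE_INDEX  # ~204,000 km/s

route_km = 10_000           # hypothetical long subsea route

one_way_ms = route_km / fiber_speed * 1_000
print(f"one-way: {one_way_ms:.0f} ms, round trip: {2 * one_way_ms:.0f} ms")
# -> one-way: 49 ms, round trip: 98 ms
```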

We’re excited about these improvements. We're increasing our commitment to ensure users have the best connections in this increasingly connected world.

Source: Google Cloud


Protecting our Google Cloud customers from new vulnerabilities without impacting performance

If you’ve been keeping up on the latest tech news, you’ve undoubtedly heard about the CPU security flaw that Google’s Project Zero disclosed last Wednesday. On Friday, we answered some of your questions and detailed how we are protecting Cloud customers. Today, we’d like to go into even more detail on how we’ve protected Google Cloud products against these speculative execution vulnerabilities, and what we did to make sure our Google Cloud customers saw minimal performance impact from these mitigations.

Modern CPUs and operating systems protect programs and users by putting a “wall” around them so that one application, or user, can’t read what’s stored in another application’s memory. These boundaries are enforced by the CPU.

But as we disclosed last week, Project Zero discovered techniques that can circumvent these protections in some cases, allowing one application to read the private memory of another, potentially exposing sensitive information.

The vulnerabilities come in three variants, each of which must be protected against individually. Variant 1 and Variant 2 have also been referred to as “Spectre.” Variant 3 has been referred to as “Meltdown.” Project Zero described these in technical detail, the Google Security blog described how we’re protecting users across all Google products, and we explained how we’re protecting Google Cloud customers and provided guidance on security best practices for customers who use their own operating systems with Google Cloud services.

Surprisingly, these vulnerabilities have been present in most computers for nearly 20 years. Because the vulnerabilities exploit features that are foundational to most modern CPUs—and were previously believed to be secure—they weren’t just hard to find, they were even harder to fix. For months, hundreds of engineers across Google and other companies worked continuously to understand these new vulnerabilities and find mitigations for them.

In September, we began deploying solutions for both Variants 1 and 3 to the production infrastructure that underpins all Google products—from Cloud services to Gmail, Search and Drive—and more-refined solutions in October. Thanks to extensive performance tuning work, these protections caused no perceptible impact in our cloud and required no customer downtime in part due to Google Cloud Platform’s Live Migration technology. No GCP customer or internal team has reported any performance degradation.

While those solutions addressed Variants 1 and 3, it was clear from the outset that Variant 2 was going to be much harder to mitigate. For several months, it appeared that disabling the vulnerable CPU features would be the only option for protecting all our workloads against Variant 2. While that was certain to work, it would also disable key performance-boosting CPU features, thus slowing down applications considerably.

Not only did we see considerable slowdowns for many applications, we also noticed inconsistent performance, since the speed of one application could be impacted by the behavior of other applications running on the same core. Rolling out these mitigations would have negatively impacted many customers.

With the performance characteristics uncertain, we started looking for a “moonshot”—a way to mitigate Variant 2 without hardware support. Finally, inspiration struck in the form of “Retpoline”—a novel software binary modification technique that prevents branch-target-injection, created by Paul Turner, a software engineer who is part of our Technical Infrastructure group. With Retpoline, we didn't need to disable speculative execution or other hardware features. Instead, this solution modifies programs to ensure that execution cannot be influenced by an attacker.

With Retpoline, we could protect our infrastructure at compile-time, with no source-code modifications. Furthermore, testing this feature, particularly when combined with optimizations such as software branch prediction hints, demonstrated that this protection came with almost no performance loss.

We immediately began deploying this solution across our infrastructure. In addition to sharing the technique with industry partners upon its creation, we open-sourced our compiler implementation in the interest of protecting all users.

By December, all Google Cloud Platform (GCP) services had protections in place for all known variants of the vulnerability. During the entire update process, nobody noticed: we received no customer support tickets related to the updates. This confirmed our internal assessment that in real-world use, the performance-optimized updates Google deployed do not have a material effect on workloads.

We believe that Retpoline-based protection is the best-performing solution for Variant 2 on current hardware. Retpoline fully protects against Variant 2 without impacting customer performance on all of our platforms. In sharing our research publicly, we hope that this can be universally deployed to improve the cloud experience industry-wide.

This set of vulnerabilities was perhaps the most challenging and hardest to fix in a decade, requiring changes to many layers of the software stack. It also required broad industry collaboration since the scope of the vulnerabilities was so widespread. Because of the extreme circumstances of extensive impact and the complexity involved in developing fixes, the response to this issue has been one of the few times that Project Zero made an exception to its 90-day disclosure policy.

While these vulnerabilities represent a new class of attack, they're just a few among the many different types of threats our infrastructure is designed to defend against every day. Our infrastructure includes mitigations by design and defense-in-depth, and we’re committed to ongoing research and contributions to the security community and to protecting our customers as new vulnerabilities are discovered.

Source: Google Cloud


Reflecting on 2017: a year in review for G Suite

Before we get into the swing of the new year—which is sure to bring new projects, new teammates and new challenges—let’s take a moment to reflect on highlights from 2017.

Here’s a look at what happened in G Suite last year.

1. Bringing you the power of Google’s artificial intelligence.


Technology continues to change the way we work. This year, we further integrated Google’s artificial intelligence into G Suite so that you can accomplish more in less time. Using machine learning, Gmail suggests email responses. Sheets builds charts, creates pivot tables and suggests formulas. And you can also ask questions in full sentences and get instant answers in Sheets and Cloud Search (in addition to Docs and Slides) thanks to natural language processing.

2. Helping businesses secure their data.


Protecting sensitive data and assets is a constant challenge that businesses face. Now, using contextual intelligence, Gmail can warn you if you’re responding to someone outside of your company domain. We also extended DLP to Google Drive to make it easier to secure sensitive data and control sharing. Google Vault for Drive helps surface information to support legal and compliance requirements. And we made it easier for you to manage which third-party apps can access your G Suite data.

Check out the G Suite website for more information on how you can transform your business to be security-first (or, try passing along these tips to help prevent phishing attempts).


3. Going all in on meetings.

We spend a lot of time on conference calls—for some, 30 percent of their day is spent in meetings—but meetings don’t often reflect how we actually like to work together. To help teams transform how they collaborate, we created a new Hangouts experience for the enterprise, designed cost-effective hardware built for the meeting room, reimagined the traditional whiteboard and introduced an intelligent communication app. Plus, Google Calendar got a makeover and you can use it on your iPad now.

4. Providing enterprise-grade solutions for collaboration and storage.

Large enterprises are often drowning in files—files that represent a company’s collective knowledge. Every strategic plan, brainstorm or financial plan is an opportunity to learn more about your business, which is why you need tools to find, organize, understand and act on that knowledge.

For years, we’ve been working to ensure that Google Drive meets enterprise needs, and last year Google was recognized by Gartner as a Leader in the July 2017 Gartner Magic Quadrant for Content Collaboration Platforms. We were also recognized by Forrester as a Leader in The Forrester Wave™: Enterprise File Sync and Share (EFSS) Cloud Solutions, Q4 2017 report, which was published in December.

5. Building tools for marketing and sales organizations, and even more integrations.


We built tools to help marketing and sales organizations create their best work and collaborate effectively, even with other tools that teams rely on. We launched Jamboard, announced a strategic partnership with Salesforce, opened up Gmail to your favorite business apps and integrated Hire with G Suite.

These are just some of the ways we’re helping businesses transform the way they work every day. We’re excited to see what 2018 has to offer.


Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Source: Google Cloud

