Category Archives: Google Cloud Platform Blog

Product updates, customer stories, and tips and tricks on Google Cloud Platform

HashiCorp and Google expand collaboration, easing secret and infrastructure management



Open source technology encourages collaboration and innovation to address real-world problems, and Google Cloud supports many such projects. As part of our broad engagement with the open source community, we’ve been working with HashiCorp since 2013 to enable customers who use HashiCorp tools to make optimal use of Google Cloud Platform (GCP) services and features.

A longstanding, productive collaboration 

Google and HashiCorp have dedicated engineering teams focused on enhancing and expanding GCP support in HashiCorp products. Together, the two companies collaborate on technical work and shared go-to-market efforts around HashiCorp products in several critical areas of infrastructure.

  • Cloud provisioning: The Google Cloud provider for HashiCorp Terraform allows management of a broad array of GCP resource types, with Bigtable and BigQuery being the most recent additions. Today, HashiCorp also announced support for GCP in the Terraform Module Registry to give users easy access to templates for setting up and running their GCP-based infrastructure. We plan to continue to broaden the number of GCP services that can be provisioned with Terraform, allowing Terraform users to adopt a familiar workflow across multiple cloud and on-premises environments. Using Terraform to move workloads to GCP simplifies the cloud adoption process for Google customers that use Terraform today in cross-cloud environments. 
  • Cloud security and secret management: We're working to enhance the integration between HashiCorp Vault and GCP, including Vault authentication backends for IAM and signed VM metadata. This is in addition to work being done by HashiCorp for Kubernetes authentication. 

Using HashiCorp Vault with Google Cloud and Kubernetes 

Applications often require access to small pieces of sensitive data at build or run time, referred to as secrets. HashiCorp Vault is a popular open source tool for secret management that allows developers to store, manage and control access to tokens, passwords, certificates, API keys and other secrets. Vault has many options for authentication, known as authentication backends, which let developers use many kinds of credentials to access Vault, including tokens or usernames and passwords.

As of today, developers on Google Cloud have two new authentication backends they can use to validate a service’s identity to their instance of Vault:

  • A Google Cloud IAM backend, which authenticates a service using its IAM service account credentials 
  • A Compute Engine signed VM metadata backend, which authenticates a service using cryptographically signed metadata about the VM it runs on 

With these authentication backends, it’s easier for a particular service running on Google Cloud to get access to a secret stored in Vault that it needs at build or run time.
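
To make the flow concrete, here’s a minimal sketch of the login exchange against Vault’s HTTP API, assuming the GCP auth backend is mounted at auth/gcp, a role has already been configured, and you hold a signed service account JWT (the address, role and secret path are placeholders):

```python
import requests

VAULT_ADDR = "https://vault.example.com:8200"  # placeholder Vault address

def vault_login_with_gcp_jwt(role: str, signed_jwt: str) -> str:
    """Exchange a signed service account JWT for a Vault client token."""
    resp = requests.post(
        f"{VAULT_ADDR}/v1/auth/gcp/login",
        json={"role": role, "jwt": signed_jwt},
    )
    resp.raise_for_status()
    return resp.json()["auth"]["client_token"]

def read_secret(client_token: str, path: str) -> dict:
    """Read a secret using the token obtained from the login step."""
    resp = requests.get(
        f"{VAULT_ADDR}/v1/{path}",
        headers={"X-Vault-Token": client_token},
    )
    resp.raise_for_status()
    return resp.json()["data"]

# Hypothetical usage:
# token = vault_login_with_gcp_jwt("my-service-role", signed_jwt)
# api_key = read_secret(token, "secret/my-app/api-key")
```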

Fleetsmith is a secure cloud-based solution for managing a company’s Mac computers that integrates fully with G Suite. The Fleetsmith team has been testing out the new Compute Engine metadata backend, and currently uses Vault on GCP for PKI and secret management. Learn more about how Fleetsmith did this in their blog post.

“Fleetsmith and Google have shared values when it comes to security, and we built our product on Google Cloud Platform in part due to Google's high bar for security. We're excited about this new integration because it strengthens the security model for us as Google Cloud customers using Vault.” 
 Jesse Endahl, CPO and CSO, Fleetsmith 

If you’re using Vault for managing secrets in Kubernetes specifically, today HashiCorp announced a new Kubernetes authentication backend. This uses Kubernetes pod service accounts to authenticate to Vault, providing an alternative to storing secrets directly in `etcd`.
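
For the Kubernetes case, a pod can log in with the service account token that Kubernetes mounts into it by default. Here’s a minimal sketch, assuming the backend is mounted at auth/kubernetes and a role has been configured:

```python
import requests

VAULT_ADDR = "https://vault.example.com:8200"  # placeholder Vault address
SA_TOKEN_PATH = "/var/run/secrets/kubernetes.io/serviceaccount/token"

def vault_login_from_pod(role: str) -> str:
    """Authenticate to Vault with the pod's Kubernetes service account token."""
    with open(SA_TOKEN_PATH) as f:
        jwt = f.read()
    resp = requests.post(
        f"{VAULT_ADDR}/v1/auth/kubernetes/login",
        json={"role": role, "jwt": jwt},
    )
    resp.raise_for_status()
    return resp.json()["auth"]["client_token"]
```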


Running HashiCorp Vault on Google Cloud 


You may already be running your own instance of HashiCorp Vault. You can run Vault on either Compute Engine or Google Container Engine, and then use one of the new authentication backends to authenticate to it.

WePay, an online payment service provider, uses HashiCorp Vault on GCP:
 "Managing usernames, passwords and certificates is a challenge in a microservice world, where we have to securely manage many secrets for hundreds of microservices. WePay chose to use HashiCorp Vault to store secrets because it provides us with rotation, tight control and out-of-the-box audit logging for our secrets and other sensitive data. WePay runs Vault server infrastructure on Google Compute Engine for secret storage, key management and service to service authentication, for use by our microservice architecture based on Google Container Engine."  
 Akshath Kumar, Site Reliability Engineer, WePay 
eBay also uses HashiCorp Vault on GCP:
“As a strong contributor and supporter of free open source software with vital projects such as regressr and datameta, eBay is a user of HashiCorp’s software products, including vaultproject.io on the Google Cloud Platform.”  
 Mitch Wyle, Director of Applied Science and Engineering, eBay 

Today, we’re publishing a solution paper on how best to set up and run HashiCorp Vault on Compute Engine. For best practices, read the solution brief “Using Vault on Compute Engine for Secret Management”.


Using HashiCorp Terraform to manage your resources on Google Cloud 

When you’re testing new code or software, you might want to spin up a test environment to simulate your application. HashiCorp Terraform is an infrastructure management and deployment tool that allows you to programmatically configure infrastructure across a variety of providers, including cloud providers like Google Cloud.

Using Terraform on Google Cloud, you can programmatically manage projects, IAM policies, Compute Engine resources, BigQuery datasets and more. To get started with Terraform for Google Cloud, check out the Terraform Google Cloud provider documentation, take a look at our tutorial for managing GCP projects with Terraform, which you can follow on our community page, or watch our Terraform for Google Cloud demo.
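
As a minimal, hypothetical sketch of that workflow, the snippet below writes a tiny configuration using the Google provider and drives the terraform CLI from Python; the project ID and dataset name are placeholders, and it assumes Terraform and application default credentials are already set up:

```python
import subprocess
import textwrap

# A minimal Terraform configuration using the Google provider.
# The project ID, region and dataset ID below are placeholders.
MAIN_TF = textwrap.dedent("""
    provider "google" {
      project = "my-gcp-project"
      region  = "us-central1"
    }

    resource "google_bigquery_dataset" "example" {
      dataset_id = "example_dataset"
    }
""")

with open("main.tf", "w") as f:
    f.write(MAIN_TF)

# Initialize the working directory, then create the resources it describes.
subprocess.run(["terraform", "init"], check=True)
subprocess.run(["terraform", "apply", "-auto-approve"], check=True)
```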

Google has released a number of Terraform modules that make working with Google Cloud even easier. These modules let you quickly compose your architectures as code and reuse architectural patterns for resources like load balancing, managed instance groups, NAT gateways and SQL databases. The modules can be found on the Terraform Module Registry.

Get involved 

We’re always excited about new contributors to open source projects we support. If you’d like to contribute, please get involved in projects like Kubernetes and Istio, as well as Vault and Terraform. The community is what makes these projects successful. To learn more about open source projects we support, see Open Source at Google.

GCP arrives in South America with launch of São Paulo region!


Read this post in Portuguese. A Nova Região GCP de São Paulo está aberta
Read this post in Spanish. La Nueva Región GCP de San Pablo está abierta

We’re pleased to announce that the São Paulo region is now open to the public as southamerica-east1. This is our first Google Cloud Platform (GCP) region in South America, and it promises to significantly improve latency for GCP customers and end users in the area. Performance testing shows that 80% to 95% reductions in round-trip time (RTT) latency are possible when you serve customers in Chile, Argentina and Brazil, compared to using GCP regions in the U.S. GCP customers are able to build applications and store data* in Brazil, as well as make payments in Brazilian Reais.

Services 

We’ve launched São Paulo with three zones and the following services. You can combine any of the services you deploy in São Paulo with other GCP services around the world such as Data Loss Prevention, Cloud Spanner and BigQuery.

What customers are saying

“With the arrival of the Google Cloud Platform region in Brazil, Dotz sees the potential of boosting its business entirely. We are excited with the new opportunities we can take advantage of with the opening of the new region, leveraging the current use of the tools we are working on in GCP.”  
Cristiano Hyppolito, CTO of Dotz
 "We’re excited that GCP will be offering soon a region in São Paulo. Contabilizei has been a GCP client since 2013, and our fast growth was possible thanks to GCP tools. We believe that this launch will improve our service performance, and will contribute to the growth of the Latin American community of Google Cloud Platform users. The launch of the São Paulo GCP region will continue to support us in order for Contabilizei to continue delivering services 90% cheaper than traditional accountants."  
Fabio Bacarin, CTO Contabilizei
“The majority of our clients and partners are in Brazil. The launch of the Google Cloud Platform region in São Paulo will reduce the latency of its products, and with this take down the last barrier so we can massively use the Google Cloud for services that interface with our clients.”  
Flavio Tooru, Movile

Getting started 

For help migrating to GCP, please contact any of the following local partners: Alest, iPnet, SantoDigital, Safetec, UOL, QiNetwork. For additional details on the São Paulo region, please visit our São Paulo region page, where you’ll get access to free resources, whitepapers, an on-demand video series called "Cloud On-Air" and more. Our locations page provides updates on the availability of additional services and regions. Contact us to request early access to new regions and help us prioritize what we build next.

*Please visit our Service Specific Terms to get detailed information on our data storage capabilities.

*******************************************************************************

The new GCP São Paulo region is open

Google Cloud Platform has arrived in South America! The São Paulo region is now open as southamerica-east1. This is the first Google Cloud Platform (GCP) region in South America, and it promises to significantly reduce latency for GCP customers and end users in the area.

With the São Paulo region, the experience for GCP customers in South America is better than ever. Customers across the continent can now store and process data locally in Brazil, as well as purchase directly from a local entity in Brazilian Reais. Performance testing shows 80% to 95% reductions in round-trip time (RTT) latency for customers in Chile, Argentina and Brazil compared to other GCP regions in the United States.
What customers are saying:

Customers have been eager for the launch of the Google Cloud Platform region in Latin America.
 "With the arrival of the Google Cloud Platform region in Brazil, Dotz sees the potential to boost its business. We are excited about the new possibilities we can take advantage of with the opening of the new region, building on our current use of the GCP tools we work with."  
Cristiano Hyppolito, CTO of Dotz
“We are very happy about the launch of a GCP region in São Paulo. Contabilizei has been a GCP customer since 2013, and our rapid growth was only possible thanks to GCP tools. This launch will certainly improve the performance of our services and contribute to the growth of the Latin American community of Google Cloud Platform users. With the GCP region in São Paulo, Contabilizei will be able to offer services 90% cheaper than traditional accounting firms.” 
 Fabio Bacarin, CTO, Contabilizei
 "Most of our clients and partners are in Brazil. The launch of the Google Cloud Platform region in São Paulo will reduce the latency of our products and, with that, bring down the last barrier to using Google Cloud at scale for the services that interface with our clients." 
Flavio Tooru, Movile

Services

We’ve launched São Paulo with three zones and the following services:
You can also combine any of the services you deploy in São Paulo with other GCP services around the world, such as DLP, Spanner and BigQuery.

Next steps

If you need help deploying GCP, please contact our sales team.

To learn more about the new region, visit the São Paulo region page, which offers free resources, whitepapers, videos from the 'Cloud On-Air' series and more. These materials will help you get started with GCP. You can also find news about upcoming regions on our locations page. Contact us to request early access to new regions and help us prioritize our next steps.

*Please visit our Service Specific Terms for detailed information about our data storage capabilities.

 ********************************************************************************

The new GCP São Paulo region is open

Google Cloud Platform has arrived in South America! The São Paulo region is now open as southamerica-east1. This is our first Google Cloud Platform (GCP) region in South America, and it promises to significantly improve latency for GCP customers and end users in the area.

With the São Paulo region, the experience for GCP customers in South America is better than ever. For the first time, the new region in Brazil offers GCP customers across South America the opportunity to store and process data locally in Brazil. Performance testing shows 80% to 95% reductions in round-trip time (RTT) latency when serving customers in Chile, Argentina and Brazil compared to using other GCP regions in the United States.

What customers are saying

Customers have been eager for the launch of the Google Cloud Platform region in Latin America.
“With the arrival of the Brazil-based region, Dotz sees the potential to grow its entire business. We are excited about the new opportunities we can take advantage of with the opening of the new region, building on the GCP tools we are already using.”  
Cristiano Hyppolito, CTO of Dotz
“We are excited that GCP will offer a region in São Paulo. Contabilizei has been a customer since 2013, and our rapid growth was possible thanks to GCP tools. We believe this launch will improve the performance of our services and let us offer more competitive prices, 90% cheaper than traditional accountants.” 
 Fabio Bacarin, CTO, Contabilizei
“Most of our clients and partners are in Brazil. The launch of the Google Cloud Platform region in São Paulo will reduce the latency of our products, and with that we can bring down the last barrier to using Google Cloud at scale for services that interact with our clients.” 
 Flavio Tooru, Movile

Services

We’ve launched São Paulo with three zones and the following services:


You can also combine any of the services you deploy in São Paulo with other GCP services around the world, such as DLP, Spanner and BigQuery.

Next steps

If you’re looking for help deploying GCP, please get in touch with our sales team.

For more details about the São Paulo region, please visit our São Paulo region page, where you’ll get access to free resources, whitepapers, the 'Cloud On-Air' video series and more. These materials will help you get started with GCP. On our locations page you’ll find updates about other regions. Contact us to request early access to new regions and help us prioritize what we build next.

 *Please visit our Service Specific Terms for detailed information about our data storage capabilities.

Read between the lines with Cloud Natural Language’s new recognition features



From documents to blog posts, emails to social media updates, there have never been more ways to connect via the written word. For businesses, this can present both a challenge and an opportunity. With such a proliferation of communication channels, how do businesses stay responsive? More importantly, how can they derive useful insights from all of their content?

That’s where Google Cloud Natural Language API comes in. Cloud Natural Language enables businesses to extract critical information from their written data. And today we’re launching two new features that can help businesses further organize their content and better understand how their users feel.

Here’s a little more on what these new features can do.

Automatically classify content 


Through predefined content classification, Cloud Natural Language can now automatically sort documents and content into more than 700 different categories, including Arts & Entertainment, Hobbies & Leisure, Law & Government, News, Health, and more. This makes it ideal for industries like media and publishing, which have traditionally had to manually sort, label and categorize content. Through machine learning with Cloud Natural Language, these companies can now automatically parse the meaning of their articles and organize them more efficiently.

To showcase the granularity of content classification, we analyzed top stories from the The New York Times API with Cloud Natural Language. This lobster salad recipe was categorized not only as “Cooking & Recipes” but also as “Meat & Seafood.” You can read more examples on our machine learning blog.
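
For a sense of what a call looks like, here is a minimal sketch using the google-cloud-language Python client (the client surface has evolved since this post, so treat the exact names as illustrative):

```python
from google.cloud import language_v1

def classify(text: str):
    """Sort a block of text into Cloud Natural Language's content categories."""
    client = language_v1.LanguageServiceClient()
    document = {"content": text, "type_": language_v1.Document.Type.PLAIN_TEXT}
    response = client.classify_text(request={"document": document})
    for category in response.categories:
        # e.g. "/Food & Drink/Cooking & Recipes" with a confidence score
        print(f"{category.name}: {category.confidence:.2f}")

classify("Toss the lobster meat with mayonnaise, celery, lemon juice and tarragon...")
```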

Hearst, one of the largest mass media publishers in the world, uses Cloud Natural Language in their content management system to automatically tag entities in articles, and will be using categories such as sports, entertainment and technology. Natural language processing adds an intelligence layer to their newsrooms that allows editors to understand what their audience is reading and how their content is being used. For example, Hearst now has granular visibility into how specific entities (people, places and things) trend across all their properties, including daily newspapers such as the San Francisco Chronicle. This insight helps editors keep a finger on the pulse of their readers and informs their decisions about what or whom to cover in the news.
"In the newsroom, precision and speed are critical to engaging our readers. Google Cloud Natural Language is unmatched in its accuracy for content classification. At Hearst, we publish several thousand articles a day across 30+ properties and, with natural language processing, we're able to quickly gain insight into what content is being published and how it resonates with our audiences." 
Naveed Ahmad, Senior Director of Data, Hearst
Content classification is available in beta for all Cloud Natural Language users.

Analyze sentiment of entities


Sentiment analysis is one of Cloud Natural Language’s most popular features. Now, it offers more granularity with entity sentiment analysis. Rather than analyzing the sentiment of an entire sentence or block of text, users can now parse the sentiment of specific entities such as people, places and things.

Leveraging Entity Sentiment Analysis, Motorola analyzes customer sentiment about its products across multiple sources such as Twitter, online community forums, and customer service emails. The insight helps Motorola quickly turn feedback into actionable results and increase customer satisfaction. Motorola uses Cloud Natural Language alongside its in-house natural language algorithms to get richer, more granular understanding of its customers to better serve them. Cloud Natural Language also offered a short learning curve and was easily integrated within its existing framework, without any downtime.
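
In the same illustrative spirit as the sketch above, an entity sentiment call reports a score and magnitude per entity rather than for the text as a whole:

```python
from google.cloud import language_v1

def entity_sentiment(text: str):
    """Report sentiment per entity instead of for the whole block of text."""
    client = language_v1.LanguageServiceClient()
    document = {"content": text, "type_": language_v1.Document.Type.PLAIN_TEXT}
    response = client.analyze_entity_sentiment(request={"document": document})
    for entity in response.entities:
        # score runs from -1 (negative) to 1 (positive); magnitude is strength
        print(entity.name, entity.sentiment.score, entity.sentiment.magnitude)

entity_sentiment("The battery life is fantastic, but the camera is disappointing.")
```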

Entity sentiment analysis is now generally available for all Cloud Natural Language users.

These new features will help even more businesses use machine learning to get the most from their data. For more information, visit our website or sign up for a trial at no charge.

More secure hybrid cloud deployments with Google Cloud Endpoints



The shift from on-premises to cloud computing is rarely sudden and rarely complete. Workloads move over time; in some cases new workloads get built in the cloud and old workloads stay on-premises. In other cases, organizations lift and shift some services and continue to do new developments on their own infrastructure. And, of course, many companies have deployments in multiple clouds.

When you run services across a wide array of resources and locations, you need to secure communications between them. Networking may be able to solve some issues, but it can be difficult in many cases: if you're running containerized workloads on hardware that belongs to three different vendors, good luck setting up a VPN to protect that traffic.

Increasingly, our customers use Google Cloud Endpoints to authenticate and authorize calls to APIs rather than (or even in addition to) trying to secure them through networking. In fact, providing more security for calls across a hybrid environment was one of the original use cases for Cloud Endpoints adopters.
"When migrating our workloads to Google Cloud Platform, we needed to more securely communicate between multiple data centers. Traditional methods like firewalls and ad hoc authentication were unsustainable, quickly leading to a jumbled mess of ACLs. Cloud Endpoints, on the other hand, gives us a standardized authentication system." 
 Laurie Clark-Michalek, Infrastructure Engineer, Qubit 
Cloud Endpoints uses the Extensible Service Proxy (ESP), based on NGINX, which can validate a variety of authentication schemes, from JWTs to API keys. We deploy this open source proxy automatically if you use Cloud Endpoints on the App Engine flexible environment, but it's also available via Google Container Registry for deployment anywhere: on Google Container Engine, on-premises, or even in another cloud.

Protecting APIs with JSON Web Tokens 


One of the most common and more secure ways to protect your APIs is to require a JSON Web Token (JWT). Typically, you use a service account to represent each of your services, and each service account has a private key that can be used to sign a JSON Web Token.

If your (calling) service runs on GCP, we manage the key for you automatically; simply invoke the IAM signJwt method to sign your JSON Web Token and put the resulting signed JWT in the OAuth Authorization: Bearer header on your call.

If your service runs on-premises, install ESP as a sidecar that proxies all traffic to your service. Your API configuration tells ESP which service account will be placing the calls. ESP uses the public key for that service account to validate that the incoming JWT was signed properly, and validates several fields in the JWT as well.

If the service is on-premises and calling to the cloud, you still need to sign your JWT, but it’s your responsibility to manage the private key. In that case, download the private key from Cloud Console (following best practices to help securely store it) and sign your JWTs.
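
To make the calling side concrete, here’s a minimal sketch of signing a JWT through the IAM API’s signJwt method from Python; the service account email and audience are placeholders, and newer code may prefer the IAM Credentials API for the same job:

```python
import json
import time

import googleapiclient.discovery
import requests

SERVICE_ACCOUNT = "caller@my-project.iam.gserviceaccount.com"  # placeholder

def get_signed_jwt(audience: str) -> str:
    """Ask the IAM API to sign a JWT with the service account's managed key."""
    iam = googleapiclient.discovery.build("iam", "v1")
    now = int(time.time())
    claims = {
        "iss": SERVICE_ACCOUNT,
        "sub": SERVICE_ACCOUNT,
        "aud": audience,  # the Endpoints service being called
        "iat": now,
        "exp": now + 3600,
    }
    name = f"projects/-/serviceAccounts/{SERVICE_ACCOUNT}"
    response = iam.projects().serviceAccounts().signJwt(
        name=name, body={"payload": json.dumps(claims)}
    ).execute()
    return response["signedJwt"]

def call_api(url: str, audience: str) -> requests.Response:
    """Attach the signed JWT as a bearer token on the outgoing call."""
    jwt = get_signed_jwt(audience)
    return requests.get(url, headers={"Authorization": f"Bearer {jwt}"})
```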

For more details, check out the sample code and documentation on service-to-service authentication (or this, if you're using gRPC).


Securing APIs with API keys 


Strictly speaking, API keys are not authentication tokens; they're longer-lived and more dangerous if stolen. However, they provide a quick and easy way to protect an API: callers simply add them to a request, either in a header or as a query parameter.

API keys also allow an API’s consumers to generate their own credentials. If you’ve ever called a Google API that doesn’t involve personal data, for example the Google Maps Javascript API, you’ve used an API key.

To restrict access to an API with an API key, follow these directions. After that, you’ll need to generate a key. You can generate the key in that same project (following these directions), or you can share your project with another developer, who can then create an API key and enable the API in the project that will call yours. Add the key to API calls as a query parameter (just append ?key=${ENDPOINTS_KEY} to your request) or in the x-api-key header (see the documentation for details).
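
For illustration, both ways of attaching the key to a request look like this (the URL is a placeholder and the key comes from the calling project):

```python
import os

import requests

ENDPOINTS_KEY = os.environ["ENDPOINTS_KEY"]  # API key from the calling project
URL = "https://my-api.endpoints.my-project.cloud.goog/v1/items"  # placeholder

# Option 1: pass the key as a query parameter.
resp = requests.get(URL, params={"key": ENDPOINTS_KEY})

# Option 2: pass the key in the x-api-key header instead.
resp = requests.get(URL, headers={"x-api-key": ENDPOINTS_KEY})
```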

Wrapping up 


Securing APIs is good practice no matter where they run. At Google, we use authentication for inter-service communication even when both services run entirely on our production network. But if you live in a hybrid cloud world, authenticating each and every call is even more important.

To get started with Cloud Endpoints, take a look at our tutorials. It’s a great way to build scalable and more secure applications that can span a variety of cloud and on-premises environments.

With Forseti, Spotify and Google release GCP security tools to open source community


Being able to secure your cloud resources at scale is important for all Google Cloud Platform users. To help ensure the security of GCP resources, you need to have the right tools and processes in place. Spotify and Google Cloud worked together to develop innovative security tools that help organizations protect GCP projects, and have made them available in an open source community called Forseti Security. Forseti is now open to all GCP users!

For this blog post, we talked with Spotify about their experience working with the Google team to develop tools for the GCP security community. The Spotify team will also be presenting about their experience with Forseti today at the SEC-T information security conference in Stockholm.

Q: How did Forseti get started? 

When we moved our back-end data infrastructure from in-house data centers to the cloud, we began by evaluating the tools that GCP offers to help us develop securely in the cloud. Once we had a handle on that, we wanted to build some specific tools that would help us automate security processes so that our engineering team could develop freely, but securely.

In parallel to our efforts, Google had developed their own GCP security tools and was interested in bringing them to the open source community. Both of our security teams wanted to contribute our ideas to the bigger picture, and it made sense to collaborate rather than each company writing their own tools. This is how the Forseti open source idea was born.

Q. What is Forseti? 

Forseti is an open source toolkit designed to help give security teams the confidence and peace of mind that they have the appropriate security controls in place across GCP. Today, Forseti features a number of useful security tools:

  • Inventory: provides visibility into existing GCP resources 
  • Scanner: validates access control policies across GCP resources 
  • Enforcer: removes unwanted access to GCP resources 
  • Explain: analyzes who has what access to GCP resources 


Q: How does Forseti help keep your GCP environment more secure? 

Forseti gives us visibility into the GCP infrastructure that we didn’t have before, and we use it to help make sure we have the right controls in place and stay ahead of the game. It keeps us informed about what’s going on in our environment so that we can quickly find risky misconfigurations and fix them right away. These tools allow us to create a workflow that puts the security team in a proactive stance rather than a reactive one. We can inform everyone involved in time, rather than waiting for an incident to happen.

With the Inventory tool, we get ongoing snapshots of our GCP resources, which provides an audit trail of any changes. This visibility allows us to give our developers a lot of freedom, and enables us to investigate any potential incidents.

Scanner helps us detect misconfigurations and security issues. It greatly reduces risk and saves us a ton of time. As soon as we see a violation from Scanner, we ping the team in charge of the affected resource so they can make the necessary fix. This way, security only needs to get involved if the dev team needs help.

Q: How have you put Forseti into practice so far at Spotify? 

We want our security culture to promote operational ownership by the dev team. Our team strives to be a business enabler, rather than a blocker to getting things done. This approach has allowed us to educate engineering and raise their security awareness. We believe it’s been influential in helping the dev teams become more security-conscious.

Using Forseti, we’ve been able to create a notification pipeline that proactively informs us about risky misconfigurations in GCP. This process is a major time saver for us.

Here’s how it works:
  • We run scans on our resources, and if a violation is found, it triggers our notification pipeline. 
  • Once the violation is parsed, we retrieve ownership information about the affected resource. This is like a phonebook that tells us which team is responsible, and then pings them automatically.
  • Engineering acknowledges the notification and then books a fix. 
  • We run inventory the next day to make sure the fix was completed. The security team gets involved only if the dev team is unable to resolve the issue on their own. 
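
A purely illustrative sketch of such a pipeline is below; the scanner output format, ownership “phonebook” and notification hook are all hypothetical stand-ins, not Forseti or Spotify code:

```python
import json

def load_violations(path: str) -> list:
    """Parse a scanner output file into violation records."""
    with open(path) as f:
        return json.load(f)

def owner_of(resource: str, phonebook: dict) -> str:
    """Look up the team responsible for a resource (the 'phonebook' step)."""
    return phonebook.get(resource, "security-team")

def notify(team: str, violation: dict):
    """Ping the owning team; a real pipeline might post to chat or file a ticket."""
    print(f"[notify] {team}: {violation['rule']} on {violation['resource']}")

def run_pipeline(scan_path: str, phonebook: dict):
    """Route each violation from a scan to the team that owns the resource."""
    for violation in load_violations(scan_path):
        notify(owner_of(violation["resource"], phonebook), violation)
```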


Q: Why take an open source approach? 

The Forseti community is all about teamwork. It allows us to work with big and small companies who, at the end of the day, need to accomplish the same things. With this combined community expertise, we’ve identified areas where companies can make the most risky mistakes in configuring GCP, and executed on those areas first. We determined what should be in Forseti as a team, rather than as individual companies.

Different organizations often share the same risks, but have unique perspectives. When we collaborate with other organizations, the possibilities are multiplied and everyone operates more securely. It also allows us to put security processes in place faster than any of us could individually. Forseti is all about sharing ideas and collaborating, which are the ideals of open source. The benefit is not reinventing the wheel; with Forseti we can divide and conquer. The more we are, the more we can do.

Interested in joining the Forseti security community? Get started here.

Read more about Forseti on the Spotify Labs blog.

Introducing managed SSL for Google App Engine



We’re excited to announce the beta release of managed SSL certificates at no charge for applications built on Google App Engine. This service automatically encrypts server-to-client communication, an essential part of safeguarding sensitive information over the web. Manually managing SSL certificates to ensure a secure connection is a time-consuming process, and GCP makes it easy by provisioning SSL certificates automatically at no additional charge. Managed SSL certificates are offered in addition to the HTTPS connections provided on appspot.com.

Here at Google, we believe encrypted communications should be used everywhere. For example, in 2014, the Search team announced that the use of HTTPS would positively impact page rankings. Fast forward to 2017 and Google is a Certificate Authority, establishing HTTPS as the default behavior for App Engine, even across custom domains.

Now, when you build apps on App Engine, SSL is on by default; you no longer need to worry about it or spend time managing it. We’ve made using HTTPS simple: map a domain to your app, prove ownership, and App Engine automatically provisions an SSL certificate and renews it whenever necessary, at no additional cost. Purchasing and generating certificates, dealing with and securing keys, managing your SSL cipher suites and worrying about renewal dates: those are all a thing of the past.
 "Anyone who has ever had to replace an expiring SSL certificate for a production resource knows how stressful and error-prone it can be. That's why we're so excited about managed SSL certificates in App Engine. Not only is it simple to add encryption to our custom domains programmatically, the renewal process is fully automated as well. For our engineers that means less operational risk." 
 James Baldassari, Engineer, mabl

Get started with managed SSL/TLS certificates 


To get started with App Engine managed SSL certificates, simply head to the Cloud Console and add a new domain. Once the domain is mapped and your DNS records are up to date, you’ll see the SSL certificate appear in the domains list. And that’s it: managed certificates are now the default behavior, with no further steps required!
To switch from using your own SSL certificate on an existing domain, select the desired domain, then click on the "Enable managed security" button. In just minutes, a certificate will be in place and serving client requests.

You can also use the gcloud CLI to make this change:

$ gcloud beta app domain-mappings update DOMAIN --certificate-management 'AUTOMATIC'

Rest assured that your existing certificate will remain in place and communication will continue as securely as before until the new certificate is ready and swapped in.

For more details on the full set of commands, head to the full documentation here.

Domains and SSL Certificates Admin API GA 

We’re also excited to announce the general availability of the App Engine Admin API to manage your custom domains and SSL certificates. The addition of this API enables more automation so that you can easily scale and configure your app according to the needs of your business. Check out the full documentation and API definition.
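
As a rough sketch of that automation, the snippet below lists an app’s domain mappings and their SSL settings through the Admin API discovery client; the response fields shown are assumptions based on the v1 surface, so check the API definition linked above:

```python
import googleapiclient.discovery

def list_domain_mappings(project_id: str):
    """Print each custom domain mapped to the app, with its SSL settings."""
    appengine = googleapiclient.discovery.build("appengine", "v1")
    result = appengine.apps().domainMappings().list(appsId=project_id).execute()
    for mapping in result.get("domainMappings", []):
        print(mapping["id"], mapping.get("sslSettings", {}))

list_domain_mappings("my-project")  # placeholder project ID
```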

If you have any questions or concerns, or if something is not working as you’d expect, you can post in the Google App Engine forum, log a public issue, or get in touch on the App Engine Slack channel (#app-engine).

4 steps for hardening your Cloud Storage buckets: taking charge of your security



This post is the second in a new “taking charge of your security” series, providing advice and best practices for ensuring security in the cloud. Check out the first post in the series, “Help keep your Google Cloud service account keys safe.” 

Cloud storage is well-suited to many use cases, from serving data to data analytics to archiving. Here at Google Cloud, we work hard to make Google Cloud Storage the best and safest repository for your sensitive data: for example, we run on a hardened backend infrastructure, monitor our infrastructure for threats and automatically encrypt customer data at rest.

Nevertheless, as more organizations use various public cloud storage platforms, we hear increasingly frequent reports of sensitive data being inadvertently exposed. It’s important to note that these “breaches” are often the result of misconfigurations that inadvertently grant access to more users than was intended. The good news is that with the right tools and processes in place, you can help protect your data from unintended exposure.

Security in the cloud is a shared responsibility. As a Cloud Storage user, you can do your part, and we’re here to help with tips on how to set up appropriate access controls, locate sensitive data and keep your data more secure with tools included in Google Cloud Platform (GCP).

  1. Check for appropriate permissions

    The first step to securing a Cloud Storage bucket is to make sure that only the right individuals or groups have access. By default, access to Cloud Storage buckets is restricted, but owners and admins often make the buckets or objects public. While there are legitimate reasons to do this, making buckets public can open avenues for unintended exposure of data, and should be approached with caution.

    The preferred method for controlling access to buckets and objects is to use Identity and Access Management (IAM) permissions. IAM allows you to implement fine-grained access control to your storage buckets right out of the gate. Learn how to manage access to Cloud Storage buckets with this how-to guide, and see the sketch after this list for a programmatic check. Just be sure that you understand what permissions you are granting to which users or groups. For example, granting access to a group that contains a large number of users can create significant unintended exposure. You can also use Cloud Resource Manager to centrally manage and control your projects and resources.


  2. Check for sensitive data

    Even if you’ve set the appropriate permissions, it’s important to know whether there’s sensitive data stored in a Cloud Storage bucket. Enter the Cloud Data Loss Prevention (DLP) API. The DLP API uses more than 40 predefined detectors to quickly and scalably classify sensitive data elements such as payment card numbers, names, personal identification numbers, telephone numbers and more. Here’s a how-to guide on inspecting your GCS buckets using the DLP API, and the sketch after this list shows a minimal inspection call.


  3. Take action

    If you find sensitive data in buckets that are shared too broadly, you should take appropriate steps to resolve this quickly. You can:
    • Make the public buckets or objects private again
    • Restrict access to the bucket (see Using IAM)
    • Remove the sensitive file or object from the bucket
    • Use the Cloud DLP API to redact sensitive content

    You should also avoid naming storage buckets which may contain sensitive data in a way that reveals their contents.


  4. Stay vigilant!

    Protecting sensitive data is not a one-time exercise. Permissions change, new data is added and new buckets can crop up without the right permissions in place. As a best practice, set up a regular schedule to check for inappropriate permissions, scan for sensitive data and take the appropriate follow-up actions.
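
To make steps 1 and 2 concrete, here’s a minimal sketch that flags publicly readable buckets and inspects a piece of content for sensitive data, using the google-cloud-storage and google-cloud-dlp Python clients (the project ID and infoTypes are placeholders, and the client surfaces shown are today’s rather than those available when this post was written):

```python
from google.cloud import dlp_v2, storage

PROJECT = "my-project"  # placeholder project ID

def find_public_buckets(project: str):
    """Step 1: flag buckets whose IAM policy grants access to all users."""
    client = storage.Client(project=project)
    for bucket in client.list_buckets():
        policy = bucket.get_iam_policy(requested_policy_version=3)
        for binding in policy.bindings:
            members = binding["members"]
            if "allUsers" in members or "allAuthenticatedUsers" in members:
                print(f"PUBLIC: gs://{bucket.name} grants {binding['role']}")

def inspect_text(project: str, text: str):
    """Step 2: scan content against a couple of DLP's predefined detectors."""
    dlp = dlp_v2.DlpServiceClient()
    response = dlp.inspect_content(
        request={
            "parent": f"projects/{project}",
            "inspect_config": {
                "info_types": [
                    {"name": "CREDIT_CARD_NUMBER"},
                    {"name": "PHONE_NUMBER"},
                ],
            },
            "item": {"value": text},
        }
    )
    for finding in response.result.findings:
        print(finding.info_type.name, finding.likelihood)
```
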
Tools like IAM and DLP help make it easy to secure your data in the cloud. Watch this space for more ways to prevent unintended access, automate data protection and protect other GCP datastores and assets.

Celebrating Programmer’s Day: pride, perseverance and patient parents



From mobile phones to the internet, technology has revolutionized our lives — and we have programmers to thank for it. And, fortunately, now we can. Falling on the 256th day of the year, Programmer’s Day recognizes the work of programmers everywhere, from code hobbyists to technological innovators.

It should come as no surprise that Programmer’s Day has a special place in our hearts. To celebrate, this year we asked our engineers to tell us about the first program they wrote that they were proud of. We learned a lot. And, as it turns out, we discovered that many of our engineers have very patient parents. Read on.

Not surprisingly, many Googlers started out by writing their own games. Jen Tong wrote a text adventure game in the mid-’90s. “It was 20,000 lines of QBASIC spaghetti code. Recognizing that IF and GOTO were enough to write any program, I disregarded my father's advice to use loops and subroutines.” But there was an unexpected upside to her efforts. “Terrible as it was, it inspired several of my friends to sling their own code.”

Todd Kerpelman, a Firebase Developer Advocate, created his first game in BASIC for the Apple IIe. “It had everything,” he says. “A blocky-looking dragon, a blocky-looking sword, and a booby-trapped room where you had to avoid blocky-looking blocks that fell from the ceiling. I'm pretty sure 30% of my code was GOTO statements.”
Artist's rendition of the fearsome dragon! Or possibly an angry frog. We're not really sure.
Jonathan Rochelle, Director, Product Management, also used BASIC to create his own game. “[It was] an interactive graphic which had a space ship shooting downward at growing sticks. It was on an Atari 800 using the joystick and was not at all fun to play after the first 30 seconds.” But although technical sophistication wasn’t always in reach, the possibilities excited our engineers.

The “first person driving game” Product Manager Eric Anderson created was essentially a square car and a horizon line, but he could see the potential. “It took me forever to create obstacles (more squares) that came towards my car. Once that was working, I was filled with excitement. The possibilities! I got my car moving left to right. Then I adjusted speed (the rate at which obstacles moved down from horizon). Then I added colors, wheels on my car, and the shape of the obstacles. I couldn't stop!”

We also learned programmers are a pretty dedicated bunch when it comes to seeing their inventions come to life. Global Field Marketing Program Manager Sowmya Ramakrishnan wrote her own C++ code at 15 to sort numbers. “Back then laptops didn't exist,” she tells us, “and we got one hour in a week to access computers in our school lab. Three of us shared one computer. So we would write the code in our notebooks — yes, the paper ones! — before the lab and use the time in the lab to type out the code on borland editor. It was super exciting to see a black and white monitor display the string of sorted numbers!”

Eric Ness, Technical Program Manager, Site Reliability Engineering, cobbled together an 8080 based S-100 microcomputer by buying up used computer boards and repairing them. “Back in those days, most students needed to go to the computer center to use a terminal but there were also dial-up lines available. I wanted to do my computer assignments from the comfort of my own apartment so I wrote a terminal emulator program in 8080 assembly language. My program did a reasonable job of emulating a VT-100 and could also echo the data to a printer. This program was a big time saver during my last year of school.”

Some young programmers used their budding skills to assist their parents. Douglas Dollars, Product Roadmap Program Manager, wrote an application launcher for his mom using AppleScript on an LC III. “Apple’s usability was no match for six large icons offering direct access to jigsaw puzzles and photos.”

Of course, sometimes our dedicated programmers were perhaps a bit too dedicated for their parents’ likes. “In third grade my uncle gave us his used Macintosh SE, showed me how to use QuickBASIC to make my own games and gave me a copy of ‘Creating Adventure Games On Your Computer,’” explains Brad Svee, Head of Solutions, Americas West, GCP. “I followed the guide, made my own adventure game, but changed a lot of how the game flow worked, and had to translate some of the example code from regular BASIC to QB. The last game I made was huge and amazing, and I was super proud of myself — but that was short lived. My game had somehow managed to overwrite a large portion of the 20mb hard drive, corrupting components of the OS and overwriting my mom's TurboTax files. Many tears and $1000+ later, she got some of her files back and the computer worked again, but I was relegated back to the Commodore-64 for any programming I wanted to do.”

Bill Prin, Developer Programs Engineer, came up with a creative way to convince his mom he was using his computer time for constructive uses. “I was trying to learn to make games with QBasic, but my mom limited me to an hour a day on the family computer since she thought I was just playing games. So I made a program that filled the screen with ‘I Love You, Mom!’ in different colors and sizes, which made her lift the restriction. Then my computer privileges were promptly revoked again when I deleted her dissertation draft trying to install Linux."

Here are more stories from our Google Engineers. We’d like to thank everyone who contributed their memories. Happy Programmer’s Day, everyone. See you next year.

New German GCP region – open now



We’re excited to announce that the first Google Cloud Platform (GCP) region in Germany is now open for business. You can now choose the German region to build applications and store data.*

The new German region, europe-west3, joins europe-west1 in Belgium, offering another location in continental Europe and making it easier to build highly available, performant applications using resources across both regions.

Hosting applications in europe-west3 can improve latency by up to 50% for end users in Germany, Switzerland, Austria and eastern Europe, compared to hosting them in Belgium. European companies east of Frankfurt should see better app performance when using the German region. 


Services 


The new German region joins existing sales and support offices in Germany, and is launching with three zones and the following services:


Customer benefits 


Companies in Germany have already expressed excitement for this region, including our close partner SAP and customers like REWE Digital, Klöckner, Kärcher, MeisterLabs and Sovanta AG.

“Due to its outstanding performance and global scale, Google Cloud is the ideal solution for our business. We see Google Cloud as a leader in the area of analytics and machine learning, and we have already successfully built early proof of concepts with the Google Cloud Vision API.” 
– Dr. Daniel Heubach, VP Digital Transition

 “Rewe Digital benefits greatly from Google Cloud’s first-class, enterprise-ready infrastructure, paired with powerful products such as Google Container Engine and Google’s commitment to open source technologies.” 
– Dr. Robert Zores, Geschäftsführer Technologie
“Google Cloud is a strong growth partner for our startup, and using Google Compute Engine to manage our underlying infrastructure means that we can focus on the development and marketing of our products Mindmeister & MeisterTask.” 
– Till Vollmer, Founder and CEO
For more details on this new region, visit our German region page or visit our locations page for updates on other regions coming online. Give us a shout to request early access to new regions and help us prioritize them. We’re excited to see what you’ll build with the help of the new German region.

 *Please visit our Service Specific Terms to get detailed information on our data storage capabilities.

*********


New Google Cloud region in Germany now open 



We’re delighted to announce the opening of the first Google Cloud Platform region in Germany. You can now select the Germany region to build applications and store data.*

With the new region in Germany, europe-west3, there is now a second location on the European continent alongside europe-west1 in Belgium, making it easier to build highly available, performant applications using resources across both regions. Companies in Germany, Switzerland, Austria and eastern Europe can host their applications in europe-west3 and reduce network latency by up to 50% compared to the Belgium region. European companies located east of Frankfurt that use the Google Cloud Platform region in Germany should also see better performance for their applications.

Services 

In addition to the existing Google offices providing local sales and support, Google Cloud Platform now also has a region in Germany. The region is divided into three zones and offers the following services:

Customer benefits 

Companies in Germany, such as our close partner SAP and customers like REWE Digital, Klöckner, Kärcher, MeisterLabs and Sovanta AG, have long expressed strong interest in this Google Cloud Platform region.

 “Thanks to its outstanding performance and global reach, Google Cloud is the perfect solution for our company. In our view, Google Cloud is the leader in analytics and machine learning, and we have already built successful proofs of concept with the Google Cloud Vision API.”  
– Dr. Daniel Heubach, VP Digital Transition 

 “REWE Digital benefits greatly from Google Cloud’s first-class infrastructure, its powerful products such as Google Container Engine, and Google’s commitment to open source technologies.” 
– Dr. Robert Zores, Geschäftsführer Technologie

 “Google Cloud is a strong partner for our startup’s growth, and with Google Compute Engine managing our underlying infrastructure, we can focus entirely on developing and marketing our products Mindmeister and MeisterTask for our customers.” 
– Till Vollmer, Founder & CEO  

For more information about the new GCP region, visit our Germany region page, and see our locations page for news about the launch of additional regions. Contact us if you’d like early access to new regions, and help us prioritize what we build next. We’re excited to see what you’ll build with the new GCP region in Germany.

 *Please see our Service Specific Terms for detailed information about our data storage capabilities.

How to build a website on Google App Engine using a headless CMS like ButterCMS



Are you a web developer who builds websites whose content is managed by a different team, usually marketing? Traditionally, there have been two ways to do that, and neither of them is great.

On the one hand, you could build the website using any number of Content Management Systems (CMS) and their arcane plugins. The problem with this approach is that you're stuck maintaining two systems: the CMS and the web framework you used to develop the main website.

On the other hand, you could anticipate that problem and roll your own homegrown CMS. The problem there is that even if you summon your ninja skills and replicate the features of a mature CMS in short order, you're still running two systems: one for the website and one for the CMS.

A headless CMS lets you sidestep this dilemma entirely. It serves content through simple API calls, making it easy to integrate that content into your website while leveraging your existing style assets. Compare that to a traditional CMS, where you have to rely on templates and plugins to assemble webpages. Because the content is decoupled from the website, your marketing team can update it without you needing to change templates and redeploy the site every time.

Google App Engine is a great complement to this headless approach, since it makes it easy to deploy an app built with any modern web framework and scale it automatically. This lets you focus on writing code rather than managing infrastructure. When the content goes viral, you get to sit back, watch the site scale automatically and enjoy the accolades for a job well done.

ButterCMS ("Butter") is a headless CMS delivered as Software as a Service (SaaS), and it’s a great option for building your next website in this way. Butter has client libraries for most popular web frameworks along with a REST API and an easy-to-use UI for content editors. Brandon Nicoll, a software engineer at Butter, recently wrote a step-by-step guide showing how to create, manage and integrate content for an example online store using Node.js, and then deploy and scale it with App Engine.

The tutorial shows you how to manage the content for the online store in Butter, and later retrieve it using the Butter API. It illustrates a good practice of encapsulating the implementation details of communicating with Butter in a separate content service. The website communicates with this content service to retrieve and render the content.
This design makes the system extensible and maintainable. For example, you could combine additional content from other sources or swap out Butter entirely for some other system, simply by changing the content service. Both the example website and the content service are deployed with App Engine.
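
As a rough sketch of such a content service, the snippet below wraps ButterCMS’s REST API behind a single function the website can call; the /v2/posts/ endpoint and auth_token parameter follow Butter’s public API, while the service shape itself is just an illustration of the encapsulation the tutorial describes:

```python
import os

import requests

BUTTER_TOKEN = os.environ["BUTTER_API_TOKEN"]  # your ButterCMS read token
BUTTER_API = "https://api.buttercms.com/v2"

def get_posts(page: int = 1, page_size: int = 10) -> list:
    """Content-service function: fetch blog posts from Butter's REST API.

    The website calls this instead of talking to Butter directly, so the
    CMS could later be swapped out without touching page-rendering code.
    """
    resp = requests.get(
        f"{BUTTER_API}/posts/",
        params={"auth_token": BUTTER_TOKEN, "page": page, "page_size": page_size},
    )
    resp.raise_for_status()
    return resp.json()["data"]

for post in get_posts():
    print(post["title"])
```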

It’s an elegant solution that puts your existing web development skills to work: you can impress your colleagues by quickly building a complex, content-rich website designed in a service-oriented fashion. Your marketing colleagues, meanwhile, get a fully featured portal to manage their content, and see it scale automatically as the website goes viral. Happy developing!