Tag Archives: Infrastructure

Assess the security of Cloud deployments with InSpec for GCP

InSpec-GCP version 1.0 is now generally available, and two new Chef InSpec™ profiles have been released under an open source software license. The InSpec profiles contain controls for the GCP Center for Internet Security (CIS) Benchmark version 1.1.0 and the Payment Card Industry Data Security Standard (PCI DSS) version 3.2.1.

The Cloud Security Challenge

Developers are embracing automated continuous integration and continuous delivery (CI/CD), frequently committing application and infrastructure changes. But centralized security teams can't review every application and infrastructure change. Those teams might have to block deployments (which decreases velocity and undermines continuous delivery) or review changes in production, where misconfigurations are more harmful and changes are more expensive to fix.

Security reviews need to “shift left,” moving earlier in the software development lifecycle. Security teams likewise need to shift their own efforts toward defining policies and providing tools that automate compliance verification. When developers adopt these tools, security and compliance checks become part of CI/CD, much like unit, functional, and integration tests, and thus a normal part of the development workflow. Empowering developers to participate in this process means organizations can achieve continuous compliance. It also reinforces the mindset that security is everyone's responsibility.

What is InSpec?

InSpec is a popular DevSecOps framework that checks the configuration state of resources in virtual machines and containers, as well as on cloud providers such as GCP, AWS, and Azure. InSpec's lightweight nature, approachable domain-specific language, and extensibility make it a valuable tool for:
  • Expressing compliance policies as code
  • Enabling development teams to add tests that assess their applications' compliance with security policies before pushing changes to build and release pipelines
  • Automating compliance verification in CI/CD pipelines and as part of the release process
  • Unifying compliance assessments across multiple cloud providers and on-premises environments
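To give a feel for that domain-specific language, here is a minimal sketch of an InSpec control written against the InSpec GCP resource pack. The bucket name is a placeholder, and the exact matchers available depend on your resource pack version:

    # A minimal InSpec control; the bucket name is a placeholder.
    control 'logging-bucket-exists' do
      impact 1.0
      title 'Ensure the logging bucket exists and uses standard storage'
      describe google_storage_bucket(name: 'my-logging-bucket') do
        it { should exist }
        its('storage_class') { should eq 'STANDARD' }
      end
    end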

InSpec for GCP and compliance profiles

The InSpec GCP resource pack 1.0 provides a consistent way to audit GCP resources. This release unifies the user experience by making behavior consistent across resources and documenting the available fields. The resource pack also adds support for GCP endpoints that let you audit fields still in beta (for example, GKE cluster pod security policy configuration).

You can use the GCP CIS Benchmark and PCI DSS InSpec profiles to assess compliance with CIS and PCI DSS policies. CIS Benchmarks are configuration guides used by governments, businesses, industry, and academia; we strongly recommend configuring your workloads to meet or exceed these standards. PCI DSS is required for all organizations that accept or process credit card payments. The Terraform PCI Starter, coupled with the PCI InSpec profile, lets you deploy PCI-compliant environments and verify their ongoing compliance.
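As a sketch, either profile can be run straight from its Git repository with the InSpec CLI. The repository URL below is for the CIS benchmark profile; flag and input names can vary between InSpec and profile versions:

    # Scan a project against the GCP CIS Benchmark profile.
    inspec exec https://github.com/GoogleCloudPlatform/inspec-gcp-cis-benchmark.git \
        -t gcp:// --input gcp_project_id=my-project-id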

This work is released under an open source license and we look forward to your feedback and contributions.

Validating PCI DSS and CIS compliance in infrastructure build pipelines

You can use InSpec to validate infrastructure deployments for compliance with standards such as PCI DSS and CIS. Automatically validating new builds helps detect insecure and non-compliant configurations as early as possible while minimizing the impact on developer agility.

With Cloud Build you can create CI pipelines for infrastructure-as-code deployments. You can run InSpec as an additional build step against resources in the GCP project to detect compliance violations in the target infrastructure. While this method doesn't prevent non-compliant build configurations, it does detect compliance issues, fail the build execution, and log the error in Cloud Logging. Cloud Build publishes build messages to a Cloud Pub/Sub topic, which can trigger a Cloud Function to integrate with appropriate alerting systems in case of a failed build. To prevent non-compliant infrastructure in a production environment, run the pipeline in a staging environment before promoting the content to production.
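To sketch the alerting hook, Cloud Build publishes its status messages to the cloud-builds Pub/Sub topic, so a function subscribed to that topic can forward failed-build notifications. The function name and runtime below are placeholders:

    # Deploy a function that fires on Cloud Build status messages.
    gcloud functions deploy notify-on-build-failure \
        --runtime python37 \
        --trigger-topic cloud-builds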

Here is an example pipeline definition for Cloud Build that uses InSpec to validate a project against the PCI DSS guidelines. To run the PCI profile from a container inside a Cloud Build pipeline, clone the PCI DSS version 3.2.1 profile repository, build the Docker container from the root directory of the repository using the included Dockerfile, and push the image to Google Container Registry. The Cloud Build pipeline stores the InSpec reports in a predefined bucket in JSON and HTML formats.
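Assuming the profile's public repository and the image tag used in the pipeline step below (both placeholders to adjust for your environment), the preparation looks roughly like this:

    # Clone the PCI profile, build its container, and push it to GCR.
    git clone https://github.com/GoogleCloudPlatform/inspec-gcp-pci-profile.git
    cd inspec-gcp-pci-profile
    docker build -t gcr.io/my-project/inspec-gcp-pci-profile:v3.2.1-3 .
    docker push gcr.io/my-project/inspec-gcp-pci-profile:v3.2.1-3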

Here's an example of executing the PCI DSS InSpec profile as a step in a Cloud Build pipeline:

#...Previous execution steps
- id: 'Run PCI Profile on in-scope project'
  waitFor: ['Write InSpec input file']
  name: gcr.io/${_GCR_PROJECT_ID}/inspec-gcp-pci-profile:v3.2.1-3
  entrypoint: '/bin/sh'
  args:
    - '-c'
    - |
      inspec exec /share/. -t gcp:// \
        --input-file /workspace/inputs.yml \
        --reporter cli json:/workspace/pci_report.json \
        html:/workspace/pci_report.html | tee out.json


Note that in this example a previous execution step writes all required input parameters into the file /workspace/inputs.yml to make them available to the InSpec run. For a complete working example, see the PCI-GKE-Blueprint, which implements a CI/CD pipeline like this using Cloud Build.
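For illustration only, such an inputs file might look like the following; the actual input names are defined by the profile version you run:

    # Hypothetical /workspace/inputs.yml consumed by the InSpec step above.
    gcp_project_id: 'my-pci-project'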

Try it yourself

Ready to try InSpec? Use this Cloud Shell Walkthrough to quickly install InSpec in your Cloud Shell instance and scan infrastructure in your GCP projects against the CIS Benchmark.


Chances are the InSpec scan in the walkthrough detected some misconfigurations in your project.

As a developer of the project, you now know how to quickly scan your deployments, and you can begin to learn more about configuring your resources securely. Our Cloud Foundation Toolkit provides Terraform and Deployment Manager templates for best-practice configurations of your projects and underlying resources.

Most large organizations have platform teams that can adopt our Cloud Foundation Toolkit templates, which automate well-configured resource provisioning, and make those available to their developers. These organizations can also include InSpec testing steps in their CI/CD pipelines to provide early feedback to developers and to prevent misconfigured resources from getting released to Production.

By Bakh Inamov – Security and Compliance Specialist Engineer, Sam Levenick – Software Engineer, and Konrad Schieban – Infrastructure Cloud Consultant

Our Los Angeles cloud region is open for business



Hey, LA — the day has arrived! The Los Angeles Google Cloud Platform region is officially open for business. You can now store data and build highly available, performant applications in Southern California.

Los Angeles Mayor Eric Garcetti said it best: “Los Angeles is a global hub for fashion, music, entertainment, aerospace, and more—and technology is essential to strengthening our status as a center of invention and creativity. We are excited that Google Cloud has chosen Los Angeles to provide infrastructure and technology solutions to our businesses and entrepreneurs.”

The LA cloud region, us-west2, is our seventeenth overall and our fifth in the United States.

Hosting applications in the new region can significantly improve latency for end users in Southern California, and improve it by up to 80% for users across Northern California and the Southwest, compared to hosting them in the previously closest region, Oregon. Visit www.gcping.com to see how fast the LA region is for you.

Services


The LA region has everything you need to build the next great application.

Of note, the LA region debuted with one of our newest products: Cloud Filestore (beta), our managed file storage service for applications that require a filesystem interface and a shared filesystem for data.

The region also has three zones, allowing you to distribute apps and storage across multiple zones to protect against service disruptions. You can also access our multi-regional services (such as BigQuery) in the United States, and all other GCP services, via Google's global network, and combine any of the services you deploy in LA with other GCP services around the world. Please visit our Service Specific Terms for detailed information on our data storage capabilities.
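For example, here's a quick sketch of spreading identical VMs across the region's three zones with gcloud; the instance names and machine type are placeholders:

    # Create one instance in each LA zone for zone-level redundancy.
    for ZONE in us-west2-a us-west2-b us-west2-c; do
      gcloud compute instances create "app-${ZONE}" \
          --zone "${ZONE}" --machine-type n1-standard-1
    done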

Google Cloud Network

Google Cloud’s global networking infrastructure is the largest cloud network as measured by number of points of presence. This private network provides a high-bandwidth, highly reliable, low-latency link to each region across the world. With it, you can reach the LA region as easily as any region. In addition, the global Google Cloud Load Balancing makes it easy to deploy truly global applications.

Also, if you’d like to connect to the Los Angeles region privately, we offer Dedicated Interconnect at two locations: Equinix LA1 and CoreSite LA1.

LA region celebration

We celebrated the launch of the LA cloud region the best way we know how: with our customers. At the celebration, we announced new services to help content creators take advantage of the cloud: Filestore, Transfer Appliance and, of course, the new region itself, in the heart of media and entertainment country. The region's proximity to content creators is critical for cloud-based visual effects and animation workloads. With proximity comes low latency, which lets you treat the cloud as if it were part of your on-premises infrastructure—or even migrate your entire studio to the cloud.
Paul-Henri Ferrand, President of Global Customer Operations, officially announces the opening of our Los Angeles cloud region.


What customers are saying


“Google Cloud makes the City of Los Angeles run more smoothly and efficiently to better serve Angelenos city-wide. We are very excited to have a cloud region of our own that enables businesses, big or small, to leverage the latest cloud technology and foster innovation.”
- Ted Ross, General Manager and Chief Information Officer, Information Technology Agency, City of LA

“Using Google Cloud for visual effects rendering enables our team to be fast, flexible and to work on multiple large projects simultaneously without fear of resource starvation. Cloud is at the heart of our IT strategy and Google provides us with the rendering power to create Oscar-winning graphics in post-production work.”
- Steve MacPherson, Chief Technology Officer, Framestore

“A lot of our short form projects pop up unexpectedly, so having extra capacity in region can help us quickly capitalize on these opportunities. The extra speed the LA region gives us will help us free up our artists to do more creative work. We’re also expanding internationally, and hiring more artists abroad, and we’ve found that Google Cloud has the best combination of global reach, high performance and cost to help us achieve our ambitions.”
- Tom Taylor, Head of Engineering, The Mill

What SoCal partners are saying


Our partners are available to help design and support your deployment, migration and maintenance needs.

“Cloud and data are the new equalizers, transforming the way organizations are built, work and create value. Our premier partnership with Google Cloud Platform enables us to help our clients digitally transform through efforts like app modernization, data analytics, ML and AI. Google’s new LA cloud region will enhance the deliverability of these solutions and help us better service the LA and Orange County markets - a destination where Neudesic has chosen to place its corporate home.”
- Tim Marshall, CTO and Co-Founder, Neudesic

“Enterprises everywhere are on a journey to harness the power of cloud to accelerate business objectives, implement disruptive features, and drive down costs. The Taos and Google Cloud partnership helps companies innovate and scale, and we are excited for the new Google Cloud LA region. The data center will bring a whole new level of uptime and service to our Southern California team and clients.”
- Hamilton Yu, President and COO, Taos

“As a launch partner for Google Cloud and multi-year recipient of Google’s Partner of the Year award, we are thrilled to have Google’s new cloud region in Los Angeles, our home base and where we have a strong customer footprint. SADA Systems has a track record of delivering industry expertise and innovative technical services to customers nationwide. We are excited to leverage the scale and power of Google Cloud along with SADA’s expertise for our clients in the Los Angeles area to continue their cloud transformation journey.”
- Tony Safoian, CEO & President, SADA Systems

Getting started


For additional details on the LA region, please visit our LA region page where you’ll get access to free resources, whitepapers, the "Cloud On-Air" on-demand video series and more. Our locations page provides updates on the availability of additional services and regions. Contact us to request early access to new regions and help us prioritize where we build next.

Now, you can deploy your Node.js app to App Engine standard environment



Developers love Google App Engine’s zero-config deployments, zero server management and auto-scaling capabilities. At Google Cloud, our goal is to help you be more productive by supporting the most popular programming languages. Starting today, you can deploy your Node.js 8 applications to the App Engine standard environment. App Engine is a fully-managed application platform that lets you deploy web and mobile applications without worrying about the underlying infrastructure.

Support for Node.js in App Engine standard environment brings a number of benefits:
  • Fast deployments and automatic scaling - With App Engine standard environment, you can expect short deployment times. For example, it takes under a minute to deploy a basic Express.js application. Further, your Node.js apps will scale instantaneously depending on web traffic; App Engine automatically scales to zero when there are no incoming requests and quickly scales up the number of instances when traffic increases.
  • Idiomatic developer experience - When designing the new runtime, we focused on providing a delightful and idiomatic developer experience. For example, the new Node.js runtime has no language or API restrictions. You can use your favorite Node.js modules, including native ones, by simply declaring your npm dependencies in your package.json, and App Engine installs them in the cloud after deploying your app. Out of the box, you will find your application logs and key performance indicators in Stackdriver. Finally, the base image contains the OS packages you need to run headless Chrome, which you can easily control using the Puppeteer module. Read more in the documentation.
  • Strong security - With our automated one-click certificate generation, you can serve your application under a secure HTTPS URL with your own custom domain. In addition, we take care of security updates so that you don't have to, automatically updating the operating system and Node.js minor and patch versions.
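To see how little configuration is involved, here's a minimal sketch. A one-line app.yaml is all the standard environment needs for Node.js 8, and deploying is a single command; App Engine serves traffic by running the start script from your package.json:

    # app.yaml: the only configuration required for the Node.js 8 runtime
    runtime: nodejs8

    # Deploy from the directory containing app.yaml and package.json.
    gcloud app deploy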

Node.js and Google Cloud Platform

We’ve also crafted our Node.js client libraries so you can easily use Google Cloud Platform (GCP) products from your Node.js application. For example, Cloud Datastore is a great match for App Engine, and you can easily set up live production debugging or tracing by importing the modules. These client libraries are made possible by direct code contributions that our engineers make to Node.js.
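As a brief sketch of what that looks like, here's a minimal Cloud Datastore call from Node.js. The entity kind, key name, and data are placeholders, and the import shape can vary with the library version:

    // Save a single entity with the @google-cloud/datastore client library.
    const {Datastore} = require('@google-cloud/datastore');
    const datastore = new Datastore();

    async function saveTask() {
      const key = datastore.key(['Task', 'sample-task']);
      await datastore.save({key: key, data: {done: false}});
      console.log('Saved entity:', key.name);
    }

    saveTask().catch(console.error);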

Of course, Google's relationship with Node.js goes beyond GCP: Node.js is based on V8, Google's open source high-performance JavaScript engine. And as of last year, Google is a Platinum Sponsor of the Node.js Foundation.

Try it now

Node.js has become a very popular runtime environment, and App Engine customers are excited by its availability on the platform.

"Once we deploy to node.js standard, we never have to manage that deployment again. It is exactly the kind of minimal configuration, zero maintenance experience we love about the rest of App Engine."
- Ben Kraft, Senior Engineer, Khan Academy
“Node.js has offered Monash University a very flexible framework to build and develop rapid prototypes and minimal viable products that provide our stakeholders and users with scalable solutions for their needs. The launch of Node.js on App Engine standard has added the bonus of being a fully managed platform, ensuring our teams can focus their efforts on developing products.”
- Eric Jiang, Monash University

App Engine is ready and waiting to host your Node.js apps with minimal changes. You can even try it out using the App Engine free tier—just follow our Quickstart to learn how to deploy your app, or check out this short video.



Opening a third zone in Singapore



When we opened the Google Cloud Platform (GCP) Singapore region last year, it launched with two zones. Today, we’re happy to announce a third zone (asia-southeast1-c) and a few new services. This expansion will make it easier for customers, especially in Southeast Asia, to build highly available services that meet the needs of their business.



This is the 46th GCP zone globally, and now all 15 GCP regions have three or more zones. We build every region with the intention of providing at least three zones because we understand the importance of high availability. Customers can distribute their apps and storage across multiple zones to protect against service disruptions.

New services
At launch, the Singapore region had a core set of services, and we’ve continued to add services like Cloud KMS and Cloud Bigtable. Now, we’ve added three new services to the region: Cloud Spanner, Cloud SQL, and Managed Instance Groups.



What customers are saying

“It’s super exciting to see the third availability zone go up in Singapore, as more GCP services will be provisioned closer to ASEAN. This will help ensure our customers have the best experience and reliability on our web or mobile products.”
— Nikunj Shanti Patel, Chief Data and Digital Officer

“A year ago we selected Google Cloud as our provider for BBM. A year later, we've migrated BBM to Google's Cloud platform and will leverage the third zone in Singapore to bring Google's innovation closer to our user base in Indonesia."
— Mohan Krishnan, CTO of Creative Media Works, the company that runs BBM Messenger Consumer globally

"With services such as Cloud SQL being made available, the third zone in Singapore will enable us to deliver the best viewing experience to our massive user base in this region. Since our engineering team is also located here, we can leverage the new services and bring further innovation to our platform at a faster pace."
— Alex Chan, SVP of Product and Engineering, Viki

Resources

For the latest on availability of services from this region as well as additional regions and services, visit our locations page. For guidance on how to build and create highly available applications, take a look at our zones and regions page. Watch this webinar to learn more about how we bring GCP closer to you. Give us a shout to request early access to new regions and help us prioritize what we build next.

We’re excited to see what you’ll build next on GCP!

GCP is building a region in Zürich



Click here for the German version. Danke!


Switzerland is a country famous for pharmaceuticals, manufacturing and banking, and its central location in Europe makes it an attractive home for a cloud region. Today, we’re announcing a Google Cloud Platform (GCP) region in Zürich to make it easier for businesses to build highly available, performant applications. I am originally from Switzerland, so this cloud infrastructure investment is personally exciting for me.

Zürich will be our sixth region in Europe, joining our future region in Finland, and existing regions in the Netherlands, Belgium, Germany, and the United Kingdom. Overall, the Swiss region brings the total number of existing and announced GCP regions around the world to 20—with more to come!

The Swiss region will open in the first half of 2019. Customers in Switzerland will benefit from lower latency for their cloud-based workloads and data, and the region is also designed for high availability, launching with three zones to protect against service disruptions.

We look forward to welcoming you to the GCP Swiss region, and we’re excited to see what you build with our platform. Our locations page provides updates on the availability of additional services and regions. Contact us to request early access to new regions and help us prioritize what we build next.

Measuring our impact in data center communities

Over 10 years ago, we built our first data center in Oregon. And a few weeks ago we broke ground on what will be our eighth data center in the U.S., helping to run Google’s products across the country and the world.

These data centers contribute significantly to job growth and income gains at both the national and state level. Even more important are the economic contributions that Google data centers make to the communities they call home.

Today, we’re releasing a report, prepared by Oxford Economics, which details the economic impact our data centers have had on their local communities. The report concludes that, as of 2016, Google data centers had generated $1.3 billion in economic activity and over 11,000 jobs across the U.S.

Those 11,000 jobs create a ripple effect: people with greater financial flexibility support the local economy, which has led to an additional 4,700 jobs. In fact, when direct, indirect and induced jobs are considered, the report finds that each Google data center job supports an additional 4.9 jobs throughout the U.S.

Last year, we became the first company of our size to purchase enough energy from sources like wind and solar to exceed the amount of electricity used by our operations around the world, including offices and data centers. This commitment to renewables has economic and environmental benefits. Oxford’s report shows that eight U.S. renewable energy generation projects—most of which are located in states where we have data centers—resulted in over $2 billion of investments, created 2,800 direct jobs, and supported 520 ongoing jobs in maintenance and operations.

What we’re most proud of, however, are the ways we invest in our local communities through workforce development and education. Our community grants program supports important local initiatives, like installing Wi-Fi on school buses for kids with long commutes, and partnering with school districts to develop student STEM programs.

We are proud of our economic impact in communities across the country, but here at Google, it’s about more than just the numbers. It’s about the people we hire and the communities where we live and work.

A new partnership to drive renewable energy growth in the U.S.

In our global search to find renewable energy for our data centers, we’ve long wanted to work with the state of Georgia. Solar is abundant and cost-competitive in the region, but until now the market rules did not allow companies like ours to purchase renewable energy. We’re pleased to announce that, in partnership with Walmart, Target, and Johnson & Johnson, we’ve worked with the state of Georgia to approve a new program that allows companies to buy renewable energy directly through the state’s largest utility, Georgia Power.

Through this program, Google will procure 78.8 megawatts (MW) of solar energy for our Douglas County, Georgia data center, as part of our effort to utilize renewable energy in every market where we operate. As we build and expand data centers and offices to meet growing demand for Google’s products, we constantly add renewable energy to our portfolio to match 100 percent of our energy use.

This program, the first of its kind in Georgia, greenlights the construction of two solar energy projects with a total capacity of 177MW. When these new projects become operational in 2019 and 2020, participating customers like us will be able to replace a portion of our electricity bill with a fixed price matched to the renewable energy the projects generate. This shows that providing a cost-competitive, fixed-price clean power option is not only good for the environment, it also makes business sense.

What we’ve accomplished in partnership with Georgia Power and other major corporate energy buyers in the region is a testament to the important role that businesses can play in unlocking access to renewable energy. We collaborated for over two years to help build this program, which passes the costs directly to corporate buyers, while adding more low-cost, renewable electricity to the state’s energy mix. This arrangement, and others like it throughout the country, help companies and utilities meet their renewable energy goals.

The program is a promising step forward as utilities begin to meet the growing demand for renewables by businesses everywhere. Today’s announcement shows how companies and utilities can work together to make that option available to all customers, regardless of varying energy needs.

And this is happening in other parts of the U.S. as well. We just broke ground on our new data center in Alabama and through a partnership with the Tennessee Valley Authority, we’ll be able to scout new wind and solar projects locally and work with TVA to bring new renewable energy onto their electrical grid.

As we expand our data centers across the U.S. and globally, we will keep working with new partners to help make this a cost-effective choice available to everyone.

Coming home to Alabama

Editor’s Note: Google is starting construction on our newest data center in Jackson County, Alabama. The new site marks a $600 million investment for our company and will bring as many as 100 high-skilled jobs to the community. This is part of Google’s expansion to 14 new data centers and offices across the country. Today, our head of global technology partnerships for Google Cloud, Dr. Nan Boden, spoke at the groundbreaking at Widows Creek, the site of a former coal-fired power plant where her father once worked.

Data centers are the engine of the internet. They help make technological advances around the world not only possible, but accessible to billions of people who use cloud services. Every day, more people are coming online, asking and answering big questions, and identifying new opportunities and solutions to bring about change.


At the groundbreaking in Jackson County 

I help build global partnerships for Google Cloud, and we depend on our data centers to ensure that large companies, small businesses, students, educators, nonprofit organizations and individuals can access key services and tools in a fast and reliable way. 

Today, I participated in the groundbreaking of our newest data center in my home state of Alabama. I was born in Sheffield, raised in Athens and am a proud University of Alabama alum. My family roots run deep with the Tennessee Valley Authority (TVA)—both my late father and grandfather were career TVA electricians. My father’s job at TVA gave me and my family a better life, and his personal focus on education created an even greater path to opportunity for me. 

That’s why I’m so proud that Google can help bring that same opportunity—for education and employment opportunities—to families here in Jackson County. As part of our commitment to this community, Google will donate $100,000 to the Jackson County School District for the growth and development of the region's student STEM programs.

With the new data center, Jackson County will help deliver information to people all over the world. From an infrastructure perspective, this means focusing on how to best route data securely, reliably, and quickly. And that takes energy.

Since the 1960s, Widows Creek has generated energy for this region, and now we will use the plant’s many electric transmission lines to power our new data center. Thanks to our partnership with the TVA, we’ll be able to scout new wind and solar projects locally and work with TVA to bring new renewable energy onto their electrical grid. Ultimately, this helps Google to continue to purchase 100% renewable energy for our growing operations around the world.

Being a part of this groundbreaking, not far from where my father worked at a coal plant years ago, humbles and inspires me. My work at Google brought me home to Alabama, and now Google can call Alabama home, too.

Meeting our match: Buying 100 percent renewable energy

A little over a year ago, we announced that we were on track to purchase enough renewable energy to match all the electricity we consumed over the next year. We just completed the accounting for Google’s 2017 energy use and it’s official—we met our goal. Google’s total purchase of energy from sources like wind and solar exceeded the amount of electricity used by our operations around the world, including offices and data centers.


What do we mean by “matching” renewable energy? Over the course of 2017, across the globe, for every kilowatt-hour of electricity we consumed, we purchased a kilowatt-hour of renewable energy from a wind or solar farm that was built specifically for Google. This makes us the first public cloud provider, and the first company of our size, to achieve this feat.


Today, we have contracts to purchase three gigawatts (3GW) of output from renewable energy projects; no corporate purchaser buys more renewable energy than we do. To date, our renewable energy contracts have led to over $3 billion in new capital investment around the world.

The road to 100 percent

We've been working toward this goal for a long time. At the outset of last year, we felt confident that 2017 was the year we'd meet it. Every year, we sign contracts for new renewable energy generation projects in markets where we have operations. From the time we sign a contract, it takes one to two years to build the wind farm or solar field before it begins producing energy. In 2016, our operational projects produced enough renewables to cover 57 percent of the energy we used from global utilities. That same year, we signed a record number of new contracts for wind and solar developments that were still under construction. Those projects began operating in 2017—and that additional output of renewable energy was enough to cover more than 100 percent of what we used during the whole year.


We say that we “matched” our energy usage because it’s not yet possible to “power” a company of our scale by 100 percent renewable energy. It’s true that for every kilowatt-hour of energy we consume, we add a matching kilowatt-hour of renewable energy to a power grid somewhere. But that renewable energy may be produced in a different place, or at a different time, from where we’re running our data centers and offices. What’s important to us is that we are adding new clean energy sources to the electrical system, and that we’re buying that renewable energy in the same amount as what we’re consuming, globally and on an annual basis.

Google's data center in Eemshaven, The Netherlands.

Looking ahead

We’re building new data centers and offices, and as demand for Google products grows, so does our electricity load. We need to be constantly adding renewables to our portfolio to keep up. So we’ll keep signing contracts to buy more renewable energy. And in those regions where we can’t yet buy renewables, we’ll keep working on ways to help open the market. We also think every energy buyer—individuals and businesses alike—should be able to choose clean energy. We’re working with groups like the Renewable Energy Buyers Alliance and Re-Source Platform to facilitate greater access to renewably-sourced energy.


Meeting this goal is an important milestone in our race to a carbon-free future, but for us it has always been just a first step. Ultimately, we want to get to a point where renewables and other carbon-free energy sources actually power our operations every hour of every day. It will take a combination of technology, policy and new deal structures to get there, but we're excited for the challenge. We can’t wait to get back to work.



GCP grows in the Netherlands region



When we launched the Netherlands region earlier this year, we said the third zone would be along shortly. We opened up the region with two zones as soon as we could to fulfill the growing demand from our customers in Benelux. Now, we’re happy to announce the launch of a third zone (europe-west4-a) in the region. This is the 45th GCP zone globally and now, like other GCP regions, this third zone enables developers to build highly available services that meet the needs of their business.

Services


The third zone includes all standard GCP services, and we’re announcing the availability of the following new services in the region: Cloud Spanner, Cloud Bigtable, Managed Instance Groups, and Cloud SQL.

Partners in Benelux


We’ve got partners in Benelux ready to assist customers with design, deployment, migration and maintenance needs. Partners include: Rackspace, Xebia, ML6, PWC, Accenture, incentro, qlouder, fourcast, godatadriven and g-company.

As an official training partner, g-company is dedicated to helping companies organize their processes more intelligently, offering highly interactive Google Cloud Platform (GCP) training and supporting companies as they build tailor-made applications on GCP to transform their businesses. g-company also led Netherlands-based online travel company Travix through a company-wide migration to Google Cloud.

Resources


For the latest on availability of services from this region as well as additional regions and services, visit our locations page. For guidance on how to build and create highly available applications, take a look at our zones and regions page. Watch this webinar to learn more about how we bring GCP closer to you. Give us a shout to request early access to new regions and help us prioritize what we build next.

We’re excited to see what you’ll build next on GCP!