
5 must-see network sessions at Google Cloud NEXT 2018

Whether you’re moving data to or from Google Cloud, or are knee-deep in your cloud network architecture, there’s a lot to learn at Google Cloud Next 2018 next week in San Francisco. Here’s our shortlist of the five must-see networking breakout sessions at the show, in chronological order from Wednesday to Thursday.
Operations engineer Rebekah Roediger delivers cloud network capacity one link at a time in our Netherlands cloud region (europe-west4).

GCP Network and Security Telemetry
Speakers: Ines Envid, Senior Product Manager; Yuri Solodkin, Staff Software Engineer; and Vineet Bhan, Head of Security Partnerships
Network and security telemetry is fundamental to operating your deployments in public clouds with confidence, providing the visibility you need into the behavior of your network and access-control firewalls.
When: July 24th, 2018 12:35pm

A Year in GCP Networking
Speakers: Srinath Padmanabhan, Networking Product Marketing Manager, Google Cloud; and Nick Jacques, Lead Cloud Engineer, Target
In this session, we will talk about the valuable advancements that have been made in GCP Networking over the last year. We will introduce you to the GCP Network team and tell you what you can do to extract the most value from your GCP deployment.
When: July 24th, 2018 1:55pm

Cloud Load Balancing Deep Dive and Best Practices
Speakers: Prajakta Joshi, Sr. Product Manager and Mike Columbus, Networking Specialist Team Manager
Google Cloud Load Balancing lets enterprises and cloud-native companies deliver highly available, scalable, low-latency cloud services with a global footprint. You will see demos and learn how enterprise customers deploy Cloud Load Balancing and the best practices they use to deliver smart, secure, modern services across the globe.
When: July 25th, 2018 12:35pm

VPC Deep Dive and Best Practices
Speakers: Emanuele Mazza, Networking Product Specialist, Google; Neha Pattan, Software Engineer, Google; and Kamal Congevaram Muralidharan, Senior Member Technical Staff, PayPal
This session will walk you through the unique operational advantages of GCP VPC for your enterprise cloud deployments. We’ll go through detailed use cases, how to seal and audit your VPC, how to extend your VPC to on-prem in hybrid scenarios, and how to deploy highly available services.
When: July 26th, 2018 9:00am

Hybrid Connectivity - Reliably Extending Your Enterprise Network to GCP
Speaker: John Veizades, Product Manager, Google Cloud
In this session, you will learn how to connect to GCP with highly reliable and secure networking to support extending your data center networks into the cloud. We will cover details of resilient routing techniques, access to Google APIs from on-premises networks, connection locations, and partners that support connectivity to GCP -- all designed to support mission-critical network connectivity to GCP.
When: July 26th, 2018 11:40am

Be sure to reserve your spot in these sessions today—space is filling up!

Kubernetes wins OSCON Most Impact Award

Today at the Open Source Awards at OSCON 2018, Kubernetes won the inaugural Most Impact Award, which recognizes a project that has had a ‘significant impact on how software is being written and applications are built’ in the past year. Thank you O’Reilly OSCON for the recognition, and more importantly, thank you to the vast Kubernetes community that has driven the project to where it is today.

When we released Kubernetes just four years ago, we never quite imagined how successful the project would be. We designed Kubernetes based on a decade of experience running production workloads at Google, but we didn’t know whether the outside world would adopt it. However, we believed that if we remained open to new ideas and new voices, the community would provide feedback and contributions to move the project forward to meet the needs of users everywhere.

This openness led to Kubernetes’ rapid adoption—and it’s also one of the core pillars of Google Cloud: our belief in an open cloud, so that you can pick up and move your app wherever you want. Whether it’s TensorFlow, an open source library for machine learning, Asylo, a framework for confidential computing, or Istio, an open platform to connect microservices, openness remains a core value here at Google Cloud.

To everyone who has helped make Kubernetes the success it is today, many thanks again.

If you haven’t tried Kubernetes, it’s easy to get started using Google Kubernetes Engine. If you’re interested in learning more about Kubernetes and the ecosystem it spawned, subscribe to the Kubernetes Podcast from Google to hear weekly insights from leaders in the community.
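If you’ve never seen Kubernetes in action, a minimal Deployment manifest gives a feel for the model: you declare the desired state and the cluster converges to it. This is an illustrative sketch, not from the post itself; the names are placeholders, and the image is a public Google sample app commonly used in Kubernetes Engine tutorials.

```yaml
# Runs three replicas of a small stateless web server and keeps them
# running; `kubectl apply -f` this to any Kubernetes cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: hello-web
  template:
    metadata:
      labels:
        app: hello-web
    spec:
      containers:
      - name: hello-web
        image: gcr.io/google-samples/hello-app:1.0
        ports:
        - containerPort: 8080
```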

VMware and Google Cloud: building the hybrid cloud together with vRealize Orchestrator

Many of our customers with hybrid cloud environments rely on VMware software on-premises. They want to simplify provisioning and enable end-user self service. At the same time, they also want to make sure they’re complying with IT policies and following IT best practices. As a result, many use VMware vRealize Automation, a platform for automated self-service provisioning and lifecycle management of IT infrastructure, and are looking for ways to leverage it in the cloud.

Today, we’re announcing the preview of our plug-in for VMware vRealize Orchestrator and support for Google Cloud Platform (GCP) resources in vRealize Automation. With this plug-in, you can now deploy and manage GCP resources from within your vRealize Automation environment.

The GCP plug-in for VMware vRealize Orchestrator provides a consistent management and governance experience across on-premises and GCP-based IT environments. For example, you can use Google-provided blueprints or build your own blueprints for Google Compute Engine resources and publish to the vRealize service catalog. This means you can select and launch resources in a predictable manner that is similar to how you launch VMs in your on-premises VMware environment, using a tool you’re already familiar with.

This preview release allows you to:
  • Create vRealize Automation “blueprints” for Compute Engine VM Instances
  • Request and self-provision resources in GCP using vRA’s catalog feature
  • Gain visibility and reclaim resources in GCP to reduce operational costs
  • Enforce access and resource quota policies for resources in GCP
  • Initiate Day 2 operations (start, stop, delete, etc.) on Compute Engine VM Instances, Instance Groups and Disks
The GCP plug-in for vRealize makes it easy for you to unlock new hybrid scenarios. For example:

  1. Reach new regions to address global business needs. (Hello Finland, Mumbai and Singapore.)
  2. Define large-scale applications using vRA and deploy to Compute Engine to leverage GCP’s worldwide load balancing and automatic scaling.
  3. Save money by deploying VMs as Compute Engine Preemptible VM Instances and using Custom Machine Types to tailor the VM configuration to application needs.
  4. Accelerate the time it takes to train a machine learning model by using Compute Engine with NVIDIA® Tesla® P100 GPUs.
  5. Replicate your on-premises applications to the cloud and scale up or down as your business dictates.
While this preview offers support for Compute Engine Virtual Machines in vRealize Automation, we’re working together with VMware to add support for additional GCP products such as Cloud TPUs—we’ll share more on that in the coming months. You can also find more information about this announcement by reading VMware’s blog.

In the meantime, to join the preview program, please submit a request using the preview intake form.

Bringing GPU-accelerated analytics to GCP Marketplace with MapD

Editor’s note: Today, we hear from our partner MapD, whose data analytics platform uses GPUs to accelerate queries and visualizations. Read on to learn how MapD and Google Cloud are working together.

MapD and public cloud are a great fit. Combining cloud-based GPU infrastructure with MapD’s performance, interactivity and operational ease of use is a big win for our customers, allowing data scientists and analysts to visually explore billion-row datasets with fluidity and minimal hassle.

Our Community and Enterprise Edition images are available on AWS, and MapD Docker containers are available on NVIDIA GPU Cloud (NGC) as well as on our own MapD Cloud. Today, we’re thrilled to announce the availability of MapD on Google Cloud Platform (GCP) Marketplace, helping us bring interactivity at scale to the widest possible audience. With services like Cloud Dataflow, Cloud Bigtable and Cloud AI, GCP has emerged as a great platform for data-intensive workloads. Combining MapD and these services lets us define scalable, high-performance visual analytics workflows for a variety of use cases.

On GCP, you’ll find both our Community and Enterprise editions for K80, Pascal and Volta GPU instances in the GCP Marketplace. Google’s flexible approach to attaching GPUs to standard CPU-based instance types means you can dial the necessary GPU capacity for your instances up or down depending on the size of your datasets and your compute needs.

We’re confident that MapD’s availability on GCP marketplace will further accelerate the adoption of GPUs as a key part of enterprise analytics workloads, in addition to their obvious applicability to AI, graphics and general purpose computing. Click here to try out MapD on GCP.

Improving our account management policies to better support customers

Recently, a Google Cloud Platform (GCP) customer blogged about an incident in June, in which a project they were running on Google Cloud Platform was suspended. We really appreciated the candid feedback, in which our customer noted several aspects of our account management process which needed to be improved. We also appreciated our customer’s followup and recognition of the Google Cloud support team, “who have reached out and assured us these incidents will not repeat.”

Here’s what we are doing to be as good as our word, and provide a more careful, accurate, thoughtful and empathetic account management experience for our GCP customers. These changes are intended to provide peace of mind and a predictable, positive experience for our GCP customers, while continuing to permit appropriate suspension and removal actions for the inevitable bad actors and fraud which are a part of operating a public cloud service.

No Automatic Fraud-Based Account Suspensions for Established Customers with Offline Payment. Established GCP customers in compliance with our Acceptable Use Policy (AUP), Terms of Service (TOS) and local laws, with an offline billing contract, invoice billing, or an active relationship with our sales team, are not subject to fraud-based automatic account suspension.

Delayed Suspension for Established Online Customers. Online customers with established payment history, operating in compliance with our TOS, AUP and local laws, will receive advance notification and a 5-day cure period in the event that fraud or account-compromise activity is detected in their projects.

Other Customers. For all other customers, we will institute a second human review for flagged fraud accounts prior to suspending an account. We’re also modifying who has authority to suspend an account, as well as refreshing our training for the teams that review flagged accounts and determine response actions; re-evaluating the signals, sources, and the tools we use to assess potential fraudulent activity; and increasing the number of options we can use to help new customers quickly and safely grow their usage while building an account history with Google.

In addition to the above, for all customers we are making the following improvements:

24X7 Chat Support. We are rolling out 24X7 chat support for customers that receive account notices, so that customers can always reach us easily. We expect this to be fully rolled out for all customers by September.

Correcting Notices About our 30 Day Policy. Our customer noted, with appropriate concern, that their suspension email stated “we will delete your account in 3 days.” This language was simply incorrect -- our fraud suspension policy provides 30 days before removal. We have corrected the communication language, and we are conducting a full review of our communication verbiage and systems and ensuring that our messages are accurate and clear.

Updating Our Project Suspension Guidelines. We will review and update our project suspension guidelines to clarify our practices and describe what you should expect from Google.

Improving Customer Contact Points. We will encourage customers to provide us with a verifiable phone number, email, and other contact channels, both at sign-up and at later points in time, so that we can quickly contact you if we detect suspicious activity on your account.

Creating Customer Pre-Verification. We will provide ways for customers to pre-verify their accounts with us if they desire, either at sign-up or at a later point in time.

These suspensions are our responsibility. There are also steps that customers can take to help us protect their accounts, including:
  1. Make sure to monitor emails sent to your payments and billing contacts so you don’t miss important alerts.
  2. Provide a valid phone number where we can reach you in the event of suspicious activity on your account.
  3. Provide one or more billing admins to your account.
  4. Provide a secondary payment method in case there are problems charging your primary method.
  5. Contact our sales team to see if you qualify for invoice billing instead of relying on credit cards.
We’re making immediate changes to ensure our policies improve our customers’ experience. Our work here is never done, and we will continue to update and optimize based on your feedback.

We sincerely apologize to all our customers who’ve been concerned or had to go through a service reinstatement. Please keep the feedback coming; we’ll work to continue to earn your trust every day.

Introducing commercial Kubernetes applications in GCP Marketplace

Building, deploying and managing applications with Kubernetes comes with its own set of unique challenges. Today, we are excited to be the first major cloud provider to offer production-ready commercial Kubernetes apps right from our marketplace, bringing you simplified deployment, billing, and third-party licensing.

Now you can find the solution you need in Google Cloud Platform Marketplace (formerly Cloud Launcher) and deploy it quickly on Kubernetes clusters running on Google Cloud Platform (GCP) with Kubernetes Engine, on-premises, or even on other public clouds.

Enterprise-ready containerized applications - We are on a mission to make containers accessible to everyone, especially the enterprise. When we released Kubernetes as open source, one of the first challenges that the industry tackled was management. Our hosted Kubernetes Engine takes care of cluster orchestration and management, but getting apps running on a Kubernetes cluster can still be a manual, time-consuming process. With GCP Marketplace, you can now easily find prepackaged apps and deploy them onto the cluster of your choice.

Simplified deployments - Kubernetes apps are configured to get up and running fast. Enjoy click-to-deploy to Kubernetes Engine, or deploy them to other Kubernetes clusters off-GCP. Now, deploying from Kubernetes Engine is even easier, with a Marketplace window directly in the Kubernetes Engine console.

Production-ready security and reliability - All Kubernetes apps listed on GCP Marketplace are tested and vetted by Google, including vulnerability scanning and partner agreements for maintenance and support. Additionally, we work with open-source Special Interest Groups (SIGs) to create standards for Kubernetes apps, bringing the knowledge of the open-source community to your enterprise.

Supporting hybrid environments - One of the great things about containers is their portability across environments. While Kubernetes Engine makes it easy to click-to-deploy these apps, you can also deploy them in your other Kubernetes clusters—even if they’re on-premises. This lets you use the cloud for development and then move your workloads to your production environment, wherever it may be.

Commercial Kubernetes applications available now

Our commercial Kubernetes apps, developed by third-party partners, support usage-based billing on many parameters (API calls, number of hosts, storage per month), simplifying license usage and giving you more consumption options. Further, the usage charges for your apps are consolidated and billed through GCP, no matter where they are deployed (not including any non-GCP resources they need to run on).

“Cloud deployment and manageability are core to Aerospike's strategy. GCP Marketplace makes it simpler for our customers to buy, deploy and manage Aerospike through Kubernetes Engine with one-click deployment. This provides a seamless experience for customers by allowing them to procure both Aerospike solutions and Kubernetes Engine on a single, unified Google bill and providing them with the flexibility to pay as they go.”
- Bharath Yadla, VP-Product Strategy, EcoSystems, Aerospike

"As an organization focused on supporting enterprises with security for their container-based applications, we are delighted that we can now offer our solutions as commercial Kubernetes application more simply to customers through the GCP Marketplace commercial Kubernetes application option. GCP Marketplace helps us reach GCP customers, and the one-click deployment of our applications to Google Kubernetes Engine makes it easier for enterprises to use our solution. We are also excited about GCP’s commitment to enterprise agility by allowing our solution to be deployed on-premises, letting us reach enterprises where they are today."
- Upesh Patel, VP Business Development, Aqua Security

“Couchbase is excited to see GCP Marketplace continue the legacy of GCP by bringing new technologies to market. We've seen GCP Marketplace as a key part of our strategy in reaching customers, and the new commercial Kubernetes application option differentiates us as innovators for both prospects and customers."
-Matt McDonough, VP of Business Development, Couchbase

"With the support for commercial Kubernetes applications, GCP Marketplace allows us to reach a wider range of customers looking to deploy our graph database both to Google Kubernetes Engine and hybrid environments. We're excited to announce our new offering on GCP Marketplace as a testament to both Neo4j and Google's innovation in integrations to Kubernetes."
- David Allen, Partner Solution Architect, Neo4j

Popular open-source Kubernetes apps available now

In addition to our new commercial offerings, GCP Marketplace already features popular open-source projects that are ready to deploy into Kubernetes. These apps are packaged and maintained by Google Cloud and implement best practices for running on Kubernetes Engine and GCP. Each app includes clustered images and documented upgrade steps, so it’s ready to run in production.

One-stop shopping on GCP Marketplace

As you may have noticed, Google Cloud Launcher has been renamed to GCP Marketplace, a more intuitive name for the place to discover the latest partner and open source solutions. Like Kubernetes apps, we test and vet all solutions available through the GCP Marketplace, which include virtual machines, managed services, data sets, APIs, SaaS, and more. In most instances, we also recommend Marketplace solutions for your projects.
With GCP Marketplace, you can verify that a solution will work for your environment with free trials from select partners. You can also combine those free trials with our $300 sign-up credit. Once you’re up and running, GCP Marketplace supports existing relationships between you and your partners with private pricing. Private pricing is currently available for managed services, and support for more solution types will be rolling out in the coming months.

Get started today

We’re excited to bring support for Kubernetes apps to you and our partners, featuring the extensibility of Kubernetes, commercial solutions, usage-based pricing, and discoverability on the newly revamped GCP Marketplace.
If you are a partner and want to learn more about selling your solution on GCP Marketplace, please visit our sign-up page.

Top storage and database sessions to check out at Next 2018

Whatever your particular area of cloud interest, there will be a lot to learn at Google Cloud Next ‘18 (July 24-26 in San Francisco). When it comes to cloud storage and databases, you’ll find useful sessions that can help you better understand your options as you’re building the cloud infrastructure that will work best for your organization.

Here, we’ve chosen five not-to-miss sessions, where you’ll learn tips on migrating data to the cloud, understand types of cloud storage workloads and get a closer look at which database is best for storing and analyzing your company’s data. Wherever you are in your cloud journey, there’s likely a session you can use.

Top cloud storage sessions

First up, our top picks for those of you delving into cloud storage.

From Blobs to Tables, Where to Store Your Data
Speakers: Dave Nettleton, Robert Saxby

What’s the best way to store all the data you’re creating and moving to the cloud? The answer depends on the industry, apps and users you’re supporting. Google Cloud Platform (GCP) offers many options for storing your data. The choices range from Cloud Storage (multi-regional, regional, nearline, coldline) through Persistent Disk to various database services (Cloud Datastore, Cloud SQL, Cloud Bigtable, Cloud Spanner) and data warehousing (BigQuery). In this session, you’ll learn about the products along with common application patterns that use data storage.

Why attend: With much to consider and many options available, this session is a great opportunity to examine which storage option fits your workloads.

Caching Made Easy, with Cloud Memorystore and Redis
Speaker: Gopal Ashok

In-memory database Redis has plenty of developer fans: It’s high-performance and highly available, making it an excellent choice for caching operations. Cloud Memorystore now includes a managed Redis service. In this session, you’ll hear about its new features. You’ll also learn how you can easily migrate applications using Redis to Cloud Memorystore with minimal changes.
Why attend: Are you building an application that needs sub-millisecond response times? GCP provides a fully managed service for the popular Redis in-memory datastore.
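Because Cloud Memorystore speaks the Redis wire protocol, "minimal changes" usually means pointing your existing client at a new host. As a hedged illustration of the cache-aside pattern such apps rely on, here is a sketch in which `FakeCache` stands in for the small slice of a Redis client interface (`get`/`set`) the pattern needs; all names here are invented for the example, and against a real Redis instance you would construct a redis-py client instead.

```python
class FakeCache:
    """In-memory stand-in for a Redis client; real code would use
    redis-py pointed at the Memorystore instance's IP instead."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        return self._store.get(key)

    def set(self, key, value, ex=None):
        # `ex` mirrors Redis's expiry argument; ignored in this stub.
        self._store[key] = value


def slow_lookup(user_id):
    # Stand-in for an expensive database query.
    return f"profile-for-{user_id}"


def get_profile(cache, user_id):
    """Cache-aside read: try the cache, fall back to the source of
    truth on a miss, then populate the cache for later reads."""
    key = f"profile:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached, "hit"
    value = slow_lookup(user_id)
    cache.set(key, value, ex=300)  # cache for five minutes
    return value, "miss"


cache = FakeCache()
print(get_profile(cache, 42))  # first read misses and fills the cache
print(get_profile(cache, 42))  # second read is served from the cache
```

The application logic is identical whether the cache is local, self-managed Redis, or Memorystore, which is why migrations tend to be a configuration change rather than a rewrite.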

Google Cloud Storage - Best Practices for Storage Classes, Reliability, Performance and Scalability
Speakers: Geoff Noer, Michael Yu

Learn about common Google Cloud Storage workloads, such as content storage and serving, analytics/ML and data protection. Understand how to choose the best storage class, depending on what kind of data you have and what kind of workload you're supporting. You’ll also learn more about Multi-Regional, Regional, Nearline and Coldline storage.
Why attend: You’ll learn about ways to optimize Cloud Storage to the unique requirements of different storage use cases.
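The core trade-off the session covers can be made concrete with a toy heuristic: frequently read data belongs in Regional or Multi-Regional storage, data touched roughly monthly fits Nearline, and archival data fits Coldline. This is not an official decision tree; the thresholds below are invented purely for illustration.

```python
def pick_storage_class(accesses_per_month, serve_globally=False):
    """Toy chooser mapping expected access frequency to a Cloud
    Storage class. Thresholds are illustrative assumptions only."""
    if accesses_per_month >= 30:  # hot data, read often (e.g. serving)
        return "multi_regional" if serve_globally else "regional"
    if accesses_per_month >= 1:   # accessed about once a month
        return "nearline"
    return "coldline"             # archival, rarely touched


print(pick_storage_class(100, serve_globally=True))  # multi_regional
print(pick_storage_class(2))                         # nearline
print(pick_storage_class(0))                         # coldline
```

In practice the choice also weighs retrieval costs, minimum storage durations and latency requirements, which is exactly the nuance the session digs into.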

Top database sessions

Here are our top picks for database sessions to explore at Next ‘18.

Optimizing Applications, Schemas, and Query Design on Cloud Spanner
Speaker: Robert Kubis

Cloud Spanner was designed specifically for cloud infrastructure and scales easily to allow for efficient cloud growth. In this session, you’ll learn Cloud Spanner best practices, strategies for optimizing applications and workloads, and ways to improve performance and scalability. Through live demos, you’ll see real-time speed-ups of transactions, queries and overall performance. Additionally, this talk explores techniques for monitoring Cloud Spanner to identify performance bottlenecks. Come learn how to cut costs and maximize performance with Cloud Spanner.
Why attend: Cloud Spanner is a powerful product, but many users do not maximize its benefits. In this session, you’ll get an inside look at getting the best performance and efficiency out of this type of cloud database.

Optimizing performance on Cloud SQL for MySQL
Speakers: Stanley Feng, Theodore Tso, Brett Hesterberg

Database performance tuning can be challenging and time-consuming. In this session, you’ll get a look at the performance tuning our team has conducted in the last year to considerably improve Cloud SQL for MySQL. We’ll also highlight useful changes to the Linux kernel, EXT4 filesystem and Google's Persistent Disk storage layer to improve write performance. You'll come away knowing more about MySQL performance tuning, an underused EXT4 feature called “bigalloc” and how to let Cloud SQL handle mundane, yet necessary, tasks so you can focus on developing your next great app.
Why attend: GCP’s fully managed database services include many innovations under the hood so that your database runs optimally. Come learn about the secret sauce that lets you get the most performance out of Cloud SQL.

Check out the full list of Next sessions, and join your peers at the show by registering here.

Cloud Spanner adds import/export functionality to ease data movement

We launched Cloud Spanner to general availability last year, and many of you shared in our excitement: You explored it, started proof-of-concept trials, and deployed apps. Perhaps most importantly, you gave us feedback along the way. We heard you, and we got to work. Today, we’re happy to announce we’ve launched one of your most commonly requested features: importing and exporting data.

Import/export using Avro

You asked for easier ways to move data. You’ve got it. You can now import and export data easily in the Cloud Spanner Console:
  • Export any Cloud Spanner database into a Google Cloud Storage (GCS) bucket.
  • Import files from a GCS bucket into a new Cloud Spanner database.
These database exports and imports use Apache Avro files, transferred with our recently released Apache Beam-based Cloud Dataflow connector.

Adding imports and exports opens up even more possibilities for your Cloud Spanner data, including:
  • Disaster recovery: Export your database at any time and store it in a GCS location of your choice as a backup, which can be imported into a new Cloud Spanner database to restore data.
  • Testing: Export a database and then import it into Cloud Spanner as a dev/test database to use for integration tests or other experiments.
  • Moving databases: Export a database and import it back into Cloud Spanner in a new/different instance with the console’s simple, push-button functionality.
  • Ingest for analytics: Use database exports to ingest your operational data to other services such as BigQuery, for analytics. BigQuery can automatically ingest data in Avro format from a GCS bucket, which means it will become easier for you to run analytics on your operational data.
Ready to try it out? See our documentation on how to import and export data. Learn more about Cloud Spanner here, and get started with a free trial. For technical support and sales, please contact us.

We're excited to see the ways that Cloud Spanner—making application development more efficient, simplifying database administration and management, and providing the benefits of both relational and scale-out, non-relational databases—will continue to help you ship better apps, faster.

Our Los Angeles cloud region is open for business

Hey, LA — the day has arrived! The Los Angeles Google Cloud Platform region is officially open for business. You can now store data and build highly available, performant applications in Southern California.

Los Angeles Mayor Eric Garcetti said it best: “Los Angeles is a global hub for fashion, music, entertainment, aerospace, and more—and technology is essential to strengthening our status as a center of invention and creativity. We are excited that Google Cloud has chosen Los Angeles to provide infrastructure and technology solutions to our businesses and entrepreneurs.”

The LA cloud region, us-west2, is our seventeenth overall and our fifth in the United States.

Hosting applications in the new region can significantly improve latency for end users in Southern California, and by up to 80% across Northern California and the Southwest, compared to hosting them in the previously closest region, Oregon. You can visit www.gcping.com to see how fast the LA region is for you.


The LA region has everything you need to build the next great application.

Of note, the LA region debuted with one of our newest products: Cloud Filestore (beta), our managed file storage service for applications that require a filesystem interface and a shared filesystem for data.

The region also has three zones, allowing you to distribute apps and storage across multiple zones to protect against service disruptions. You can also access our multi-regional services (such as BigQuery) in the United States and all the other GCP services via our Google Network, and combine any of the services you deploy in LA with other GCP services around the world. Please visit our Service Specific Terms for detailed information on our data storage capabilities.

Google Cloud Network

Google Cloud’s global networking infrastructure is the largest cloud network as measured by number of points of presence. This private network provides a high-bandwidth, highly reliable, low-latency link to each region across the world. With it, you can reach the LA region as easily as any region. In addition, the global Google Cloud Load Balancing makes it easy to deploy truly global applications.

Also, if you’d like to connect to the Los Angeles region privately, we offer Dedicated Interconnect at two locations: Equinix LA1 and CoreSite LA1.

LA region celebration

We celebrated the launch of the LA cloud region the best way we know how: with our customers. At the celebration, we announced new services to help content creators take advantage of the cloud: Filestore, Transfer Appliance and, of course, the new region itself, in the heart of media and entertainment country. The region’s proximity to content creators is critical for cloud-based visual effects and animation workloads. With proximity comes low latency, which lets you treat the cloud as if it were part of your on-premises infrastructure—or even migrate your entire studio to the cloud.
Paul-Henri Ferrand, President of Global Customer Operations, officially announces the opening of our Los Angeles cloud region.

What customers are saying

“Google Cloud makes the City of Los Angeles run more smoothly and efficiently to better serve Angelenos city-wide. We are very excited to have a cloud region of our own that enables businesses, big or small, to leverage the latest cloud technology and foster innovation.”
- Ted Ross, General Manager and Chief Information Officer for City of LA Information Technology Agency, City of LA

“Using Google Cloud for visual effects rendering enables our team to be fast, flexible and to work on multiple large projects simultaneously without fear of resource starvation. Cloud is at the heart of our IT strategy and Google provides us with the rendering power to create Oscar-winning graphics in post-production work.”
- Steve MacPherson, Chief Technology Officer, Framestore

“A lot of our short form projects pop up unexpectedly, so having extra capacity in region can help us quickly capitalize on these opportunities. The extra speed the LA region gives us will help us free up our artists to do more creative work. We’re also expanding internationally, and hiring more artists abroad, and we’ve found that Google Cloud has the best combination of global reach, high performance and cost to help us achieve our ambitions.”
- Tom Taylor, Head of Engineering, The Mill

What SoCal partners are saying

Our partners are available to help design and support your deployment, migration and maintenance needs.

“Cloud and data are the new equalizers, transforming the way organizations are built, work and create value. Our premier partnership with Google Cloud Platform enables us to help our clients digitally transform through efforts like app modernization, data analytics, ML and AI. Google’s new LA cloud region will enhance the deliverability of these solutions and help us better service the LA and Orange County markets - a destination where Neudesic has chosen to place its corporate home.”
- Tim Marshall, CTO and Co-Founder, Neudesic

“Enterprises everywhere are on a journey to harness the power of cloud to accelerate business objectives, implement disruptive features, and drive down costs. The Taos and Google Cloud partnership helps companies innovate and scale, and we are excited for the new Google Cloud LA region. The data center will bring a whole new level of uptime and service to our Southern California team and clients.”
- Hamilton Yu, President and COO, Taos

“As a launch partner for Google Cloud and multi-year recipient of Google’s Partner of the Year award, we are thrilled to have Google’s new cloud region in Los Angeles, our home base and where we have a strong customer footprint. SADA Systems has a track record of delivering industry expertise and innovative technical services to customers nationwide. We are excited to leverage the scale and power of Google Cloud along with SADA’s expertise for our clients in the Los Angeles area to continue their cloud transformation journey.”
- Tony Safoian, CEO & President, SADA Systems

Getting started

For additional details on the LA region, please visit our LA region page where you’ll get access to free resources, whitepapers, the "Cloud On-Air" on-demand video series and more. Our locations page provides updates on the availability of additional services and regions. Contact us to request early access to new regions and help us prioritize where we build next.

Introducing new Apigee capabilities to deliver business impact with APIs

Whether it's delivering new experiences through mobile apps, building a platform to power a partner ecosystem, or modernizing IT systems, virtually every modern business uses APIs (application programming interfaces).

Google Cloud’s Apigee API platform helps enterprises adapt by giving them control and visibility into the APIs that connect applications and data across the enterprise and across clouds. It enables organizations to deliver connected experiences, create operational efficiencies, and unlock the power of their data.

As enterprise API programs gain traction, organizations are looking to ensure that they can seamlessly connect data and applications, across multi-cloud and hybrid environments, with secure, manageable and monetizable APIs. They also need to empower developers to quickly build and deliver API products and applications that give customers, partners, and employees secure, seamless experiences.

We are making several announcements today to help enterprises do just that. Thanks to a new partnership with Informatica, a leading integration-platform-as-a-service (iPaaS) provider, we’re making it easier to connect and orchestrate data services and applications, across cloud and on-premises environments, using Informatica Integration Cloud for Apigee. We’ve also made it easier for API developers to access Google Cloud services via the Apigee Edge platform.

Discover and invoke business integration processes with Apigee

We believe that for an enterprise to accelerate digital transformation, it needs API developers to focus on business-impacting programs rather than low-level tasks such as coding, rebuilding point-to-point integrations, and managing secrets and keys.

From the Apigee Edge user interface, developers can now use policies to discover and invoke business integration processes that are defined in Informatica’s Integration Cloud.

Using this feature, an API developer can add a callout policy inside an API proxy that invokes the required Informatica business integration process. This is especially useful when the business integration process needs to be invoked before the request gets routed to the configured backend target.

To use this feature, API developers:
  • Log in to the Apigee Edge user interface with their credentials
  • Create a new API proxy, configure the backend target, and add policies
  • Add a callout policy to select the appropriate business integration process
  • Save and deploy the API proxy
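As a rough sketch of what such a callout looks like in a proxy configuration, the snippet below shows an Apigee-style ServiceCallout policy invoking an integration process before the request reaches the backend target. The policy name, endpoint URL, and payload are hypothetical placeholders, not taken from Apigee or Informatica documentation:

```xml
<!-- Hypothetical callout policy: invokes an Informatica business
     integration process before the request is routed to the configured
     backend target. Name, URL, and payload are illustrative only. -->
<ServiceCallout name="Invoke-Informatica-Process">
  <Request variable="informaticaRequest">
    <Set>
      <Verb>POST</Verb>
      <Payload contentType="application/json">{"process": "order-enrichment"}</Payload>
    </Set>
  </Request>
  <Response>informaticaResponse</Response>
  <HTTPTargetConnection>
    <URL>https://example.informaticacloud.test/process/invoke</URL>
  </HTTPTargetConnection>
</ServiceCallout>
```

The response is captured in a flow variable (here, informaticaResponse) so later policies in the proxy can use the enriched data before the request continues to the target.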

Access Google Cloud services from the Apigee Edge user interface

API developers want to easily access and connect with Google Cloud services like Cloud Firestore, Cloud Pub/Sub, Cloud Storage, and Cloud Spanner. In each case, there are a few steps to perform to deal with security, data formats, request/response transformation, and even wire protocols for those systems.

Apigee Edge includes a new feature that simplifies interacting with these services and enables connectivity to them through a first-class policy interface that an API developer can simply pick from the policy palette and use. Once configured, these can be reused across all API proxies.
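To illustrate the kind of request-shaping boilerplate such a policy removes, here is a minimal Python sketch of what a developer would otherwise write to hand a message to Cloud Pub/Sub over its REST API. The project and topic names are placeholders, and authentication is omitted:

```python
import base64
import json


def build_pubsub_publish_request(project: str, topic: str, payload: bytes):
    """Construct the URL and JSON body for a Cloud Pub/Sub REST publish call.

    Pub/Sub expects message data to be base64-encoded and wrapped in a
    'messages' array -- exactly the data-format and transformation work
    a first-class policy handles for you.
    """
    url = (f"https://pubsub.googleapis.com/v1/"
           f"projects/{project}/topics/{topic}:publish")
    body = json.dumps({
        "messages": [{"data": base64.b64encode(payload).decode("ascii")}]
    })
    return url, body


# Example with placeholder names (an OAuth bearer token would still be
# needed in the Authorization header to actually send this request):
url, body = build_pubsub_publish_request("my-project", "orders", b"order#42")
```

A configured policy in the palette encapsulates this encoding, the security handshake, and the wire protocol, and can then be reused across all API proxies.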

We’re working to expand this feature to cover more Google Cloud services. Simultaneously, we’re working with Informatica to include connections to other software-as-a-service (SaaS) applications and legacy services like hosted databases.

Publish business integration processes as managed APIs

Integration architects, working to connect data and applications across the enterprise, play an important role in packaging and publishing business integration processes as great API products. Working with Informatica, we’ve made this possible within Informatica’s Integration Cloud.

Integration architects who use Informatica's Integration Cloud for Apigee can now author composite services using business integration processes to orchestrate data services and applications, and directly publish them as managed APIs to Apigee Edge. This pattern is useful when the final destination of the API call is an Informatica business integration process.

To use this feature, integration architects follow these steps:
  • Log in to their Informatica Integration Cloud user interface
  • Create a new business integration process or modify an existing one
  • Create a new service of type “Apigee,” select the policy options presented in the wizard, and publish the process as an API proxy
  • Apply additional policies to the generated API proxy by logging in to the Apigee Edge user interface

API documentation can be generated and published on a developer portal, and the API endpoint can be shared with app developers and partners.

APIs are an increasingly central part of organizations’ digital strategy. By working with Informatica, we hope to make APIs even more powerful and pervasive. Click here for more on our partnership with Informatica.