
Powering ads and analytics innovations with machine learning

This post originally appeared on the Inside AdWords blog.

Good morning, San Francisco! As the city starts to wake up, my team and I are gearing up to welcome over a thousand marketers from around the world to Google Marketing Next, our annual event where we unveil the latest innovations for ads, analytics and DoubleClick.

A big theme you’ll hear about today is machine learning. This technology is critical to helping marketers analyze countless signals in real time and reach consumers with more useful ads at the right moments. Machine learning is also key to measuring the consumer journeys that now span multiple devices and channels across both the digital and physical worlds.

It's a growing and important trend for marketers today, and will continue to shape how you build for success in the future.

Below is a sneak preview of a few of the announcements I’ll be making. There are many more that I can’t wait to share with you. Be sure to tune in at 9:00 a.m. PT/12:00 p.m. ET.


Hello Google Attribution, goodbye last-click

Today, we're announcing Google Attribution, a new product to answer the question that has challenged marketers for ages, “Is my marketing working?” For the first time, Google Attribution makes it possible for every marketer to measure the impact of their marketing across devices and across channels -- all in one place, and at no additional cost.

With today’s complex customer journey, your business might have a dozen interactions with a single person -- across display, video, search, social, and on your site or app. And all these moments take place on multiple devices, making them even harder to measure.

Marketers have been trying to make attribution work for years, but existing solutions just don't cut it. Most attribution tools:

  • Are hard to set up
  • Lose track of the customer journey when people move between devices
  • Aren’t integrated with ad tools, making it difficult to take action
As a result, many marketers are stuck using last-click attribution, which misses the impact of most marketing touchpoints. With Google Attribution, we’ll help you understand how all of your marketing efforts work together and deliver the insights you need to make them work better.

Here’s how it works:
Integrations with AdWords, Google Analytics and DoubleClick Search make it easy to bring together data from all your marketing channels. The end result is a complete view of your performance.
Google Attribution also makes it easy to switch to data-driven attribution. Data-driven attribution uses machine learning to determine how much credit to assign to each step in the consumer journey -- from the first time they engage with your brand for early research down to the final click before purchase. It analyzes your account's unique conversion patterns, comparing the paths of customers who convert to those who don’t, so you get results that accurately represent your business.
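To make the comparison at the heart of data-driven attribution concrete, here's a toy sketch in Python (a deliberately crude stand-in, not Google's actual model): channels whose presence in a path is most strongly tied to conversions earn more credit, based on comparing converting and non-converting paths.

```python
def removal_effect_attribution(paths):
    """paths: list of (channel_sequence, converted) tuples.
    Credits each channel by how many conversions would be 'lost'
    if paths containing that channel were ignored -- a simple
    removal-effect heuristic, not Google's actual model."""
    total_conversions = sum(1 for _, conv in paths if conv)
    channels = {c for seq, _ in paths for c in seq}
    effects = {}
    for ch in channels:
        # Conversions that survive if this channel were removed
        remaining = sum(1 for seq, conv in paths if conv and ch not in seq)
        effects[ch] = total_conversions - remaining
    norm = sum(effects.values()) or 1
    # Distribute total conversions proportionally to each channel's effect
    return {ch: total_conversions * e / norm for ch, e in effects.items()}

# Hypothetical customer paths across channels
paths = [
    (["search", "display"], True),
    (["display"], False),
    (["search"], True),
    (["video", "search"], True),
    (["video"], False),
]
credit = removal_effect_attribution(paths)
```

Here "search" appears in every converting path, so it earns the largest share of credit, unlike last-click, which would ignore every touchpoint but the final one.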

Finally, you can take fast action to optimize your ads with Google Attribution because it integrates with ads tools like AdWords and DoubleClick Search. The results are immediately available for reporting, updating bids or moving budget between channels.
“Given today’s multi-device landscape, cross-channel measurement and attribution is indispensable for HelloFresh to have a 360º panorama of our customer journey and gives us the best data to make the best decisions.” - Karl Villanueva, Head of Paid Search & Display, HelloFresh
Google Attribution is now in beta and will roll out to more advertisers over the coming months.

Mobile-local innovations drive more consumers to stores

Mobile has blurred the line between the digital and physical worlds. While most purchases still happen in-store, people are increasingly turning to their smartphones to do research beforehand -- especially on Google.com and Google Maps.
To help consumers decide where to go, marketers are using innovations like Promoted Places and local inventory ads to showcase special offers and what’s in-stock at nearby stores. Now, you can also make it easy for them to find a store from your YouTube video ads using location extensions.

We introduced store visits measurement back in 2014 to help marketers gain more insight about consumer journeys that start online and end in a store. In under three years, advertisers globally have measured over 5 billion store visits using AdWords.

Only Google has the advanced machine learning and mapping technology to help you accurately measure store visits at scale and use these insights to deliver better local ad experiences. Our recent upgrade to deep learning models enables us to train on larger data sets and measure more store visits in challenging scenarios with greater confidence. This includes visits that happen in multi-story malls or in dense cities like Tokyo, Japan and São Paulo, Brazil, where many business locations are situated close together. Store visits measurement is already available for Search, Shopping and Display campaigns. And soon this technology will be available for YouTube TrueView campaigns to help you measure the impact of video ads on foot traffic to your stores.

Still, measuring store visits is just one part of the equation. You also need insights into how your online ads drive sales for your business. You need to know: are my online ads ringing my cash register? In the coming months, we’ll be rolling out store sales measurement at the device and campaign levels. This will allow you to measure in-store revenue in addition to the store visits delivered by your Search and Shopping ads.

If you collect email information at the point of sale for your loyalty program, you can import store transactions directly into AdWords yourself or through a third-party data partner. And even if your business doesn’t have a large loyalty program, you can still measure store sales by taking advantage of Google’s third-party partnerships, which capture approximately 70% of credit and debit card transactions in the United States. There is no time-consuming setup or costly integrations required on your end. You also don’t need to share any customer information. After you opt in, we can automatically report on your store sales in AdWords.

Both solutions match transactions back to Google ads in a secure and privacy-safe way, and only report on aggregated and anonymized store sales to protect your customer data.
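The post doesn't spell out the matching mechanics, but a common privacy-safe approach to this kind of transaction matching is to compare hashed, normalized email addresses rather than raw ones, so neither side ever shares customer identities in the clear. A hypothetical sketch (the emails and amounts are made up):

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    # Lowercase and trim before hashing so "User@X.com " and
    # "user@x.com" produce the same token.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Advertiser-side loyalty transactions and ad-side sign-ins are each
# hashed independently, so raw emails never need to be exchanged.
store_sales = {normalize_and_hash("Jane.Doe@example.com"): 129.99}
ad_click_ids = {normalize_and_hash("jane.doe@example.com "): "click-123"}

# Revenue is attributed only where the hashed tokens collide.
matched_revenue = sum(
    amount for h, amount in store_sales.items() if h in ad_click_ids
)
```

Only the aggregate matched revenue needs to be reported back, which is consistent with the "aggregated and anonymized" guarantee above.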

Virgin Holidays discovered that when it factors in store sales, its search campaigns generate double the profit compared to looking at online KPIs alone. A customer purchasing in-store after clicking on a search ad is also three times more profitable than an online conversion. Says James Libor, Performance Marketing and Technology Manager, “Store sales measurement gives us a more accurate view of the impact our digital investment has on in-store results, especially through mobile. This has empowered us to invest more budget in Search to better support this critical part of the consumer journey.”


Machine learning delivers more powerful audience insights to search ads

People are often searching with the intent to buy. That’s why we’re bringing in-market audiences to Search to help you reach users who are ready to purchase the products and services you offer. For example, if you’re a car dealership, you can increase your reach among users who have already searched for “SUVs with best gas mileage” and “spacious SUVs”. In-market audiences use the power of machine learning to better understand purchase intent, analyzing trillions of search queries and activity across millions of websites to help figure out when people are close to buying and surface ads that will be more relevant and interesting to them.

This is an important moment for marketers. The convergence of mobile, data and machine learning will unlock new opportunities for marketers -- and I’m excited to be on this journey with all of you.
Please join us at 9:00 a.m. PT/12:00 p.m. ET to see the entire keynote at Google Marketing Next, and all the other innovations we’re planning to announce for ads, analytics and DoubleClick.

Firebase Analytics Gets New Features and a Familiar New Name

Can it be just a year since we announced the expansion of Firebase to become Google's integrated app developer platform at I/O 2016? That Firebase launch came complete with brand new app analytics reporting and features, developed in conjunction with the Google Analytics team.

Now, at I/O 2017, we're delighted to announce some exciting new features and integrations that will help take our app analytics to the next level. But first, we’d like to highlight a bit of housekeeping. As of today, we are retiring the name Firebase Analytics. Going forward, all app analytics reports will fall under the Google Analytics brand.

This latest generation of app analytics has always, and will continue to be, available in both the Firebase console and in Google Analytics. We think that unifying app analytics under the Google Analytics banner will better communicate that our users are getting the same great app data in both places. In Firebase and related documentation, you'll see app analytics referred to as Google Analytics for Firebase. Read on to the end of this post for more details about this change.

One other note: The launches highlighted below apply to our latest generation of app analytics – you need to be using the Firebase SDK to get these new features.

Now let’s take a look at what’s new.

Integration with AdMob
App analytics is now fully integrated with AdMob. Revenue, impression and click data from AdMob can now be connected with the rest of your event data collected by the Firebase SDK, all of it available in the latest Google Analytics app reports and/or in the Firebase console.

For app companies, this means that ad revenue can be factored into analytics data, so Analytics reports can capture each app’s performance. The integration combines AdMob data with Analytics data at the event level to produce brand new metrics, and to facilitate deep dives into existing metrics. You can answer questions like:
  • What is the true lifetime value for a given segment, factoring in both ad revenue and purchase revenue?
  • How do rewarded ads impact user engagement and LTV?
  • On which screens are users being exposed to advertising the most or the least?
With this change, you can now have a complete picture of the most important metrics for your business ― all in one place.

Custom parameter reporting
"What's the average amount of time users spend in my game before they make their first purchase?" Many of you have asked us for the ability to report on specific data points like these that are important to your business.

Custom parameter reporting is here to make that possible. You can now register up to 50 custom event parameters and see their details in your Analytics reports.
  • If you supply numeric parameters you’ll see a graph of the average and the sum of that parameter.
  • If you supply textual parameters you’ll see a breakdown of the most popular values.
As with the rest of your Analytics reports, you can also apply Audience and User Property filters to your custom parameter reports to identify trends among different segments of your userbase.

To start using custom parameter reporting for one of your events, look for it in the detail report for that event. You'll see instructions for setting things up there.
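As a rough illustration of the two report types described above, here's a hypothetical sketch of how numeric and textual parameter values could be summarized (the event and parameter names are made up, and this is not the SDK itself):

```python
from collections import Counter

def summarize_parameter(values):
    """Summarize one registered event parameter the way the report
    describes: sum and average for numeric values, a popularity
    breakdown for text values."""
    if all(isinstance(v, (int, float)) for v in values):
        return {"sum": sum(values), "avg": sum(values) / len(values)}
    return {"top_values": Counter(values).most_common(3)}

# Hypothetical "level_complete" events carrying a numeric parameter
# (seconds to first purchase) and a textual one (difficulty chosen)
times = [42.0, 58.0, 50.0]
difficulties = ["easy", "hard", "easy"]

numeric_report = summarize_parameter(times)         # {"sum": 150.0, "avg": 50.0}
text_report = summarize_parameter(difficulties)     # "easy" is most popular
```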

Integration with DoubleClick and third-parties – Now in Beta
We're also pleased to announce a new integration with both DoubleClick Campaign Manager and DoubleClick Bid Manager. Firebase-tracked install (first open) and post-install events can now easily be imported back into DoubleClick as conversions.

This is a boost for app marketers who want a clearer view of the effect their display and video marketing has on customer app behavior. Advertisers can make better decisions (for all kinds of ads, programmatic included) as they integrate app analytics seamlessly with their buying, targeting and optimization choices in DoubleClick.

We also know that some of you use advertising platforms beyond AdWords and DoubleClick, so we continue to invest in integrating more third-party networks into our system. (We're now at 50 networks and growing). The goal: to allow app data from all your networks to come together in Google Analytics, so you can make even better advertising choices using all the data you collect. Learn more.

Real-time analytics for everyone
Google Analytics pioneered real-time reporting, so we know how important it is for our customers to have access to data as it happens. That’s why we’re so excited by the real-time capabilities we’ve introduced into our latest app reports. To refresh an announcement we made in March: StreamView and DebugView are now available to the general public. These features allow you to see how real-world users are interacting with your app, and how it’s performing, right now.

StreamView visualizes events as they flow into our app reporting to give you a sense of how people around the world are using your app, right down to the city level. Then Snapshot lets you zoom in to a randomly selected individual user’s stream of events. And DebugView uses real-time reporting to help you improve your implementation – making it easy for you to make sure you’re measuring what you want how you want. DebugView is a terrific tool for app builders that shows you events, parameters and user properties for any individual development device. It can also highlight any events that contain invalid parameters.

Same product, familiar new name
As mentioned above, we're rebranding Firebase Analytics to make it plain that it's our recommended app analytics solution, and is fully a part of the Google Analytics family.

Our latest reports represent a new approach to app analytics, which we believe better reflects the way that users interact with apps. This means that these reports have different concepts and functionality when compared to the original app analytics reports in Google Analytics.

If you're used to using the original app analytics reports in Google Analytics, don’t worry: they're not going anywhere. But we recommend implementing the Firebase SDK with your next app update so you can start getting the latest features for app analytics.

Good data is one thing everyone can agree on: developers and marketers, global firms and fresh new start-ups. We've always been committed to app-centric reports, because analytics and data are the essential beginning to any long-term app strategy. We hope that these new features will give you more of what you need to build a successful future for your own apps.

Google Analytics is Enhancing Support for AMP

Over the past year, developers have adopted the Accelerated Mobile Pages (AMP) technology to build faster-loading pages for all types of sites, ranging from news to recipes to e-commerce. Billions of AMP pages have been published to date and Google Analytics continues its commitment to supporting our customers who have adopted AMP.

However, we have heard feedback from Google Analytics customers around challenges in understanding the full customer journey due to site visitors being identified inconsistently across AMP and non-AMP pages. So we're announcing today that we are rolling out an enhancement that will give you an even more accurate understanding of how people are engaging with your business across AMP and non-AMP pages of your website.

How will this work?
This change brings consistency to users across AMP and non-AMP pages served from your domain. It will have the effect of improving user analysis going forward by unifying your users across the two page formats. It does not affect AMP pages served from the Google AMP Cache or any other AMP cache.

When will this happen?
We expect these improvements to be complete, across all Google Analytics accounts, over the next few weeks.

Are there any other implications of this change?
As we unify your AMP and non-AMP users when they visit your site in the future, you may see changes in your user and session counts, including changes to related metrics. User and session counts will go down over time as we recognize that two formerly distinct IDs are in fact the same user; however, at the time this change commences, the metric New Users may rise temporarily as IDs are reset.

In addition, metrics like time on site, page views per session, and bounce rate will rise consistent with sessions with AMP and non-AMP pageviews no longer being treated as multiple sessions. This is a one-time effect that will continue until all your users who have viewed AMP pages in the past are unified (this can take a short or long period of time depending on how quickly your users return to your site/app).
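A toy example of the arithmetic behind these shifts, assuming a single visitor who reads an AMP article and then continues to non-AMP pages on the same site:

```python
# Before unification: one visit that crosses from an AMP page to
# non-AMP pages is split into two sessions under two client IDs.
before_sessions = [["amp_article"], ["product", "checkout"]]
# After unification the same visit is one session for one user.
after_sessions = [["amp_article", "product", "checkout"]]

def pages_per_session(sessions):
    return sum(len(s) for s in sessions) / len(sessions)

def bounce_rate(sessions):
    # A bounce is a single-page session.
    return sum(1 for s in sessions if len(s) == 1) / len(sessions)
```

For this visitor, pages per session rises from 1.5 to 3.0 and the bounce rate falls from 50% to 0%, which is the direction of change described above.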

Is there anything I need to do to get this update?
There is no action required on your part, these changes will be automatically rolled out.

Will there be changes to unify users who view my pages both on my domain and in other contexts?
Some AMP pages are not visited directly on the domain where the content is originally hosted, but instead via AMP caches or in platform experiences. However, we decided to focus on fixing the publisher domain case first, as this was the fastest way we could add value for our clients.

We are committed to ensuring the best quality data for user journey analysis across AMP and non-AMP pages alike and this change makes that easy for AMP pages served on your domain. We hope you enjoy these improvements - and as always, happy analyzing!

Introducing Google Cloud IoT Core: for securely connecting and managing IoT devices at scale



Today we're announcing a new fully-managed Google Cloud Platform (GCP) service called Google Cloud IoT Core. Cloud IoT Core makes it easy for you to securely connect your globally distributed devices to GCP, centrally manage them and build rich applications by integrating with our data analytics services. Furthermore, all data ingestion, scalability, availability and performance needs are automatically managed for you in GCP style.

When used as part of a broader Google Cloud IoT solution, Cloud IoT Core gives you access to new operational insights that can help your business react to, and optimize for, change in real time. This advantage has value across multiple industries; for example:
  • Utilities can monitor, analyze and predict consumer energy usage in real time
  • Transportation and logistics firms can proactively stage the right vehicles/vessels/aircraft in the right places at the right times
  • Oil and gas and manufacturing companies can enable intelligent scheduling of equipment maintenance to maximize production and minimize downtime

So, why is this the right time for Cloud IoT Core?


About all the things


Many enterprises that rely on industrial devices such as sensors, conveyor belts, farming equipment, medical equipment and pumps -- particularly globally distributed ones -- are struggling to monitor and manage those devices for several reasons:
  • Operational cost and complexity: The overhead of managing the deployment, maintenance and upgrades for exponentially more devices is stifling. And even with a custom solution in place, the resource investments required for necessary IT infrastructure are significant.
  • Patchwork security: Ensuring world-class, end-to-end security for globally distributed devices is out of reach or at least not a core competency for most organizations.
  • Data fragmentation: Despite the fact that machine-generated data is now an important data source for making good business decisions, the massive amount of data generated by these devices is often stored in silos with a short expiration date, and hence never reaches downstream analytic systems (nor decision makers).
Cloud IoT Core is designed to help resolve these problems by removing risk, complexity and data silos from the device monitoring and management process. Instead, it offers you the ability to more securely connect and manage all your devices as a single global system. Through a single pane of glass you can ingest data generated by all those devices into a responsive data pipeline and, when combined with other Cloud IoT services, analyze and react to that data in real time.

Key features and benefits


Several key Cloud IoT Core features help you meet these goals, including:

  • Fast and easy setup and management: Cloud IoT Core lets you connect up to millions of globally dispersed devices into a single system with smooth and even data ingestion ensured under any condition. Devices are registered to your service quickly and easily via the industry-standard MQTT protocol. For Android Things-based devices, firmware updates can be automatic.
  • Security out-of-the-box: Secure all device data via industry-standard security protocols. (Combine Cloud IoT Core with Android Things for device operating-system security, as well.) Apply Google Cloud IAM roles to devices to control user access in a fine-grained way.
  • Native integration with analytic services: Ingest all your IoT data so you can manage it as a single system and then easily connect it to our native analytic services (including Google Cloud Dataflow, Google BigQuery and Google Cloud Machine Learning Engine) and partner BI solutions (such as Looker, Qlik, Tableau and Zoomdata). Pinpoint potential problems and uncover solutions using interactive data visualizations, or build rich machine-learning models that reflect how your business works.
  • Auto-managed infrastructure: All this in the form of a fully-managed, pay-as-you-go GCP service, with no infrastructure for you to deploy, scale or manage.
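For the registration flow above, Cloud IoT Core's MQTT bridge identifies a connecting device by its full resource path, used as the MQTT client ID. A minimal sketch of building that ID (the project, registry and device names here are hypothetical):

```python
def mqtt_client_id(project: str, region: str, registry: str, device: str) -> str:
    # Cloud IoT Core expects this resource path as the MQTT client ID
    # when a device connects to the bridge.
    return (f"projects/{project}/locations/{region}"
            f"/registries/{registry}/devices/{device}")

client_id = mqtt_client_id("my-project", "us-central1",
                           "bike-fleet", "bike-0042")
# A real device would then connect to the MQTT bridge over TLS with
# this ID, authenticating with a JWT signed by its private key.
```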
"With Google Cloud IoT Core, we have been able to connect large fleets of bicycles to the cloud and quickly build a smart transportation fleet management tool that provides operators with a real-time view of bicycle utilization, distribution and performance metrics, and it forecasts demand for our customers."
- Jose L. Ugia, VP Engineering, Noa Technologies

Next steps

Cloud IoT Core is currently available as a private beta, and we’re launching with these hardware and software partners:

Cloud IoT Device Partners
Cloud IoT Application Partners

When generally available, Cloud IoT Core will serve as an important, foundational tool for hardware partners and customers alike, offering scalability, flexibility and efficiency for a growing set of IoT use cases. In the meantime, we look forward to your feedback!

Cloud Spanner is now production-ready; let the migrations begin!



Cloud Spanner, the world’s first horizontally-scalable and strongly-consistent relational database service, is now generally available for your mission-critical OLTP applications.

We’ve carefully designed Cloud Spanner to meet customer requirements for enterprise databases — including ANSI 2011 SQL support, ACID transactions, 99.999% availability and strong consistency — without compromising latency. As a combined software/hardware solution that includes atomic clocks and GPS receivers across Google’s global network, Cloud Spanner also offers additional accuracy, reliability and performance in the form of a fully-managed cloud database service. Thanks to this unique combination of qualities, Cloud Spanner is already delivering long-term value for our customers with mission-critical applications in the cloud, including customer authentication systems, business-transaction and inventory-management systems, and high-volume media systems that require low latency and high throughput. For example, Snap uses Cloud Spanner to power part of its search infrastructure.

Looking toward migration


In preparation for general availability, we’ve been working closely with our partners to make adoption as smooth and easy as possible. Thus today, we're also announcing our initial data integration partners: Alooma, Informatica and Xplenty.

Now that these partners are in the early stages of Cloud Spanner “lift-and-shift” migration projects for customers, we asked a couple of them to pass along some of their insights about the customer value of Cloud Spanner, as well as any advice about planning for a successful migration:

From Alooma:

“Cloud Spanner is a game-changer because it offers horizontally scalable, strongly consistent, highly available OLTP infrastructure in the cloud for the first time. To accelerate migrations, we recommend that customers replicate their data continuously between the source OLTP database and Cloud Spanner, thereby maintaining both infrastructures in the same state — this allows them to migrate their workloads gradually in a predictable manner.”

From Informatica:
“Informatica customers are stretching the limits of latency and data volumes, and need innovative enterprise-scale capabilities to help them outperform their competition. We are excited about Cloud Spanner because it provides a completely new way for our mutual customers to disrupt their markets. For integration, migration and other use cases, we are partnering with Google to help them ingest data into Cloud Spanner and integrate a variety of heterogeneous batch, real-time, and streaming data in a highly scalable, performant and secure way.”

From Xplenty:
"Cloud Spanner is one of those cloud-based technologies for which businesses have been waiting: With its horizontal scalability and ACID compliance, it’s ideal for those who seek the lower TCO of a fully managed cloud-based service without sacrificing the features of a legacy, on-premises database. In our experience with customers migrating to Cloud Spanner, important considerations include accounting for data types, embedded code and schema definitions, as well as understanding Cloud Spanner’s security model to efficiently migrate your current security and access-control implementation."

Next steps


We encourage you to dive into a no-cost trial to experience first-hand the value of a relational database service that offers strong consistency, mission-critical availability and global scale (contact us about multi-regional instances) with no workarounds — and with no infrastructure for you to deploy, scale or manage. (Read more about Spanner’s evolution inside Google in this new paper presented at the SIGMOD ‘17 conference today.) If you like what you see, a growing partner ecosystem is standing by for migration help, and to add further value to Cloud Spanner use cases via data analytics and visualization tooling.

Students, Start Your Engineerings!


It’s that time again! Our 201 mentoring organizations have selected the 1,318 students they look forward to working with during the 13th Google Summer of Code (GSoC). Congratulations to our 2017 students and a big thank you to everyone who applied!

The next step for participating students is the Community Bonding period which runs from May 4th through May 30th. During this time, students will get up to speed on the culture and toolset of their new community. They’ll also get acquainted with their mentor and learn more about the languages or tools they will need to complete their projects. Coding commences May 30th.

To the more than 4,200 students who were not chosen this year - don’t be discouraged! Many students apply more than once to GSoC before being accepted. You can improve your odds for next time by contributing directly to the open source project of your choice; organizations are always eager for new contributors! Look around GitHub and elsewhere on the internet for a project that interests you and get started.

Happy coding, everyone!

By Cat Allman, Google Open Source

Google Cloud Natural Language API launches new features and Cloud Spanner graduating to GA



Today at Google Cloud Next London we're excited to announce product news that will help customers innovate and transform their businesses faster via the cloud: first, that Google Cloud Natural Language API is adding support for new languages and entity sentiment analysis, and second, that Google Cloud Spanner is graduating to general availability (GA).

Cloud Natural Language API beta


Since we launched Cloud Natural Language API, a fully managed service for extracting meaning from text via machine learning, we’ve seen customers such as Evernote and Ocado enhance their businesses in fascinating ways. For example, they use Cloud Natural Language API to analyze customer feedback and sentiment, extract key entities and metadata from unstructured text such as emails or web articles, and enable novel features (such as deriving action items from meeting notes).

These use cases, among many others, highlighted the need to expand language support and add improvements in the quality of our base NLU technology. We've incorporated this feedback into the product and are pleased to announce the following new capabilities under beta:

  • Expanded language support for entity, document sentiment and syntax analysis for the following languages: Chinese (Simplified and Traditional), French, German, Italian, Korean and Portuguese. This is in addition to existing support for English, Spanish and Japanese.
  • Understand sentiment for specific entities, not just the whole document or sentence: We're introducing a new method that identifies entities in a block of text and also determines sentiment for those entities. Entity sentiment analysis is currently only available for the English language. For more information, see Analyzing Entity Sentiment.
  • Improved quality for sentiment and entity analysis: As part of the continuous effort to improve quality of our base models, we're also launching improved models for sentiment and entity analysis as part of this release.

Early access users of this new functionality such as Wootric are already using the expanded language support and new entity sentiment analysis feature to better understand customer sentiment around brands and products. For example, for customer feedback such as “the phone is expensive but has great battery life,” users can now see that the sentiment for phone is negative while the sentiment for battery life is positive.
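To make that phone example concrete, here's a hypothetical sketch of consuming entity-level sentiment results; the dictionary shape and field names are simplified stand-ins for illustration, not the exact API response schema:

```python
# Illustrative, simplified entity-sentiment result for the feedback
# "the phone is expensive but has great battery life".
entities = [
    {"name": "phone", "sentiment_score": -0.4},
    {"name": "battery life", "sentiment_score": 0.8},
]

def entity_sentiments(entities):
    """Label each entity positive/negative/neutral from its score,
    using an arbitrary +/-0.25 threshold for illustration."""
    def label(score):
        if score > 0.25:
            return "positive"
        if score < -0.25:
            return "negative"
        return "neutral"
    return {e["name"]: label(e["sentiment_score"]) for e in entities}

labels = entity_sentiments(entities)
```

The point is that a single piece of feedback can now yield opposite sentiments for different entities, which whole-document sentiment would average away.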

As the API becomes more widely adopted, we're looking forward to seeing more interesting and useful applications of it.

Cloud Spanner enters GA

Announced in March at Google Cloud Next ‘17, Cloud Spanner is the world’s first fully managed, horizontally scalable relational database service for mission-critical online transaction processing (OLTP) applications. Cloud Spanner is specifically designed to meet customer requirements in this area for strong consistency, high availability and global scale -- qualities that make it unique as a service.

During the beta period, we were thrilled to see customers unlock new use cases in the cloud with Cloud Spanner, including:

  • Powering mission-critical applications like customer authentication and provisioning for multi-national businesses
  • Building consistent systems for business transactions and inventory management in the financial services and retail industries
  • Supporting incredibly high-volume systems that need low latency and high throughput in the advertising and media industries

As with all our other services, GCP handles all the performance, scalability and availability needs automatically in a pay-as-you-go way.

On May 16, Cloud Spanner will reach a further milestone by becoming generally available for the first time. Currently we're offering regional instances, with multi-regional instances coming later this year. We've been Spanner users ourselves for more than five years to support a variety of mission-critical global apps, and we can’t wait to see what new workloads you bring to the cloud, and which new ones you build next!

Join the first POSSE Workshop in Europe

We are excited to announce that the Professors’ Open Source Software Experience (POSSE) is expanding to Europe! POSSE is an event that brings together educators interested in providing students with experience in real-world projects through participation in humanitarian free and open source software (HFOSS) projects.

Over 100 faculty members have attended past workshops and there is a growing community of instructors teaching students through contributions to HFOSS. This three-stage faculty workshop will prepare you to support student participation in open source projects. During the workshop, you will:

  • Learn how to support student learning within real-world project environments
  • Motivate students and cultivate their appreciation of computing for social good
  • Collaborate with instructors who have similar interests and goals
  • Join a community of educators passionate about HFOSS

Workshop Format

Stage 1: Starts May 8, 2017 with online activities. Activities will take 2-3 hours per week and include interaction with workshop instructors and participants.
Stage 2: The face-to-face workshop will be held in Bologna, Italy, July 1-2, 2017 and is a pre-event for the ACM ITiCSE conference. Workshop participants include the workshop organizers, POSSE alumni, and members of the open source community.
Stage 3: Online activities and interactions in small groups immediately following the face-to-face workshop. Participants will have support while involving students in an HFOSS project in the classroom.

How to Apply

If you’re a full-time instructor at an academic institution outside of the United States, you can join the workshop being held in Bologna, Italy, July 1-2, 2017. Please complete and submit the application by May 1, 2017. Prior work with FOSS projects is not required. English is the official language of the workshop. The POSSE workshop committee will send an email notifying you of the status of your application by May 5, 2017.

Participant Support

The POSSE workshop in Europe is supported by Google. Attendees will be provided with funding for two nights’ lodging ($225 USD per night) and meals during the workshop. Travel costs will also be covered up to $450 USD. Participants are responsible for any charges above these limits. At this time, we can only support instructors at institutions of higher education outside of the U.S. For faculty at U.S. institutions, the next POSSE will be in fall 2017 on the east coast of the U.S.

We look forward to seeing you at the POSSE workshop in Italy!

By Helen Hu, Open Source Programs Office

Introducing Marketing Mix Model Partners: Helping brands better understand the impact of their marketing

The following was originally posted on the Google Agency Blog.

CMOs and marketing executives use marketing mix models to understand how their marketing investments are driving sales and how to optimize their spend across multiple brands, channels, and regions. With rising investment in digital and mobile advertising, marketers want to be sure the models they use correctly value the impact of these channels.

Today we’re excited to announce a program to help marketing mix model providers better incorporate Google media data into their services. The Marketing Mix Model Partners program is designed to ensure advertisers can accurately measure the ROI of their digital investments and confidently understand the digital drivers of ROI to improve returns year-over-year.

The Marketing Mix Model Partners program offers:
  • Data Access: Partners get access to accurate, granular campaign data across all relevant Google video, display, and search media in a standardized format. We’re also making the data easier to access by providing data from multiple properties, like Search and YouTube, in one centralized location. 
  • Expertise: Partners also get dedicated training, resources, and specialists to better understand Google advertising products and practices and incorporate digital data into their model methodologies. 
  • Actionability: We provide Google account and technical teams to help advise on results and strategies designed to understand the drivers of ROI and improve returns over time. 
Our partners

We’re excited to be working with the initial participants in the program, Marketing Management Analytics, Neustar MarketShare, and Nielsen. Google customers can talk with their Google representatives about working with one of these partners on using Google data in their marketing mix model engagements.

Here’s what our partners have to say about the program:

“The ability to collect and analyze digital data at extremely granular levels enables both marketers and their advertising partners to more successfully measure, predict and action the most effective and profitable means of optimizing each digital channel to achieve their business objectives. We are excited that Google has taken such a proactive approach in working with MMA and analytic companies within the marketplace in providing such a high level of objectivity and transparency."
— Patrick Cummings, CEO of Marketing Management Analytics 
“Today’s measurement solutions need to be connected, always on and incorporate the myriad of channels, as well as critical econometric externalities in order for marketers to truly get an accurate view of marketing’s impact. We are thrilled to be a Google launch partner as this signals our commitment to helping brands understand how their marketing investments are driving business results. Through this partnership our advanced analytics models will incorporate more accurate, granular data, giving marketers a more complete understanding of the effectiveness of their marketing and how best to optimize their spend to improve future outcomes.”
— Julie Fleischer, Vice President, Product Marketing, Marketing Solutions, Neustar 
"As the marketing landscape rapidly evolves, it is critical to use the most robust data-streams in our Marketing Mix models to ensure the highest standard of insight quality. Working with Google, we will have better input and better consultative output so that our advertiser clients can best understand what is driving their performance today and make informed decisions for tomorrow.”
 ‒ Jason Tate, VP of Global Analytics at Nielsen 

As part of our commitment to providing the industry with trusted, transparent, and independent third-party metrics, we’ll be expanding the program over the coming months. If your company provides marketing mix model services and you’re interested in learning more about the partner program, please sign up here.

Google Container Engine fires up Kubernetes 1.6

Today we started to make Kubernetes 1.6 available to Google Container Engine customers. This release emphasizes significant scale improvements and additional scheduling and security options, making running a Kubernetes cluster on Container Engine easier than ever before.

There were over 5,000 commits in Kubernetes 1.6 with dozens of major updates that are now available to Container Engine customers. Here are just a few highlights from this release:
  • Increase in number of supported nodes by 2.5 times: We’ve worked hard to support your workloads no matter how large your needs. Container Engine now supports cluster sizes of up to 5,000 nodes, up from 2,000, while still maintaining our strict SLO for cluster performance. Some of the world's most popular apps (such as Pokémon GO) already run on Container Engine, and the increased scale can handle even more of the largest workloads.
  • Fully Managed Nodes: Container Engine has always helped keep your Kubernetes master in a healthy state; we're now adding the option to fully manage your Kubernetes nodes as well. With Node Auto-Upgrade and Node Auto-Repair, you can optionally have Google automatically update your cluster to the latest version, and ensure your cluster’s nodes are always operating correctly. You can read more about both features here.
  • General Availability of Container-Optimized OS: Container Engine was designed to be a secure and reliable way to run Kubernetes. By using Container-Optimized OS, a locked down operating system specifically designed for running containers on Google Cloud, we provide a default experience that's more secure, highly performant and reliable, helping ensure your containerized workloads can run great. Read more details about Container-Optimized OS in this in-depth post here.
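Beyond the features above, Kubernetes 1.6 also moved advanced scheduling options like node affinity into the pod spec. As a hedged sketch of what that looks like in practice (the node label key and image name below are illustrative assumptions, not taken from this post), a pod can require scheduling onto nodes carrying a particular label:

```yaml
# Hypothetical pod spec using the Kubernetes 1.6 node-affinity syntax.
# The label key and image are placeholders for illustration.
apiVersion: v1
kind: Pod
metadata:
  name: ssd-pod
spec:
  affinity:
    nodeAffinity:
      # Hard requirement: only schedule onto nodes with the matching label.
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: example.com/disk-type
            operator: In
            values: ["ssd"]
  containers:
  - name: app
    image: gcr.io/my-project/app:1.0
```

The same mechanism also supports soft preferences (`preferredDuringSchedulingIgnoredDuringExecution`) when co-location is desirable but not mandatory.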
Over the past year, Kubernetes adoption has accelerated, and we could not be more proud to host so many mission-critical applications on the platform for our customers. Some recent highlights include:

Customers

  • eBay uses Google Cloud technologies including Container Engine, Cloud Machine Learning and AI for its ShopBot, a personal shopping bot on Facebook Messenger.
  • Smyte participated in the Google Cloud startup program and protects millions of actions a day on websites and mobile applications. Smyte recently moved from self-hosted Kubernetes to Container Engine.
  • Poki, a game publisher startup, moved to Google Cloud Platform (GCP) for greater flexibility, empowered by the openness of Kubernetes, a theme we covered at our Google Cloud Next conference: open source technology gives customers the freedom to come and go as they choose. Read more about their decision to switch here.
“While Kubernetes did nudge us in the direction of GCP, we’re more cloud agnostic than ever because Kubernetes can live anywhere.”  — Bas Moeys, Co-founder and Head of Technology at Poki

To help shape the future of Kubernetes — the core technology Container Engine is built on — join the open Kubernetes community and participate via the kubernetes-users mailing list or chat with us on the kubernetes-users Slack channel.

We’re the first cloud to offer users the newest Kubernetes release, and with our generous 12-month free trial of $300 in credits, it’s never been simpler to get started. Try the latest release today.