Tag Archives: Announcements

Google Cloud Platform expands to Mars



Google Cloud Platform (GCP) is committed to meeting our customers' needs—no matter where they are. Amidst our growing list of new regions, today we're pleased to announce our expansion to Mars. In addition to supporting some of the most demanding disaster recovery and data sovereignty needs of our Earth-based customers, we're looking to the future cloud infrastructure needed for the exploration and ultimate colonization of the Red Planet.
Visit Mars with Google Street View
Mars has long captured the imagination as the most hospitable planet for future colonization, and expanding to Mars has been a top priority for Google. By opening a dedicated extraterrestrial cloud region, we're bringing the power of Google’s compute, network, and storage to the rest of the solar system, unlocking a plethora of possibilities for astronomy research, exploration of Martian natural resources and interplanetary life sciences. This region will also serve as an important node in an extensive network throughout the solar system.

Our first interplanetary data center—affectionately nicknamed “Ziggy Stardust”—will open in 2018. Our Mars exploration started as a 20% project with the Google Planets team, which mapped Mars and other bodies in space and found a suitable location in Gale Crater, near the landing site of NASA’s Curiosity rover.
Explore more of Mars in Google Maps
In order to ease the transition for our Earthling customers, Google Cloud Storage (GCS) is launching a new Earth-Mars Multi-Regional location. Users can store planet-redundant data across Earth and Mars, which means even if Earth experiences another asteroid strike like the one that wiped out the dinosaurs, your cat videos, selfies and other data will still be safe. Of course, we'll also store all public domain scientific data, history and arts free of charge so that the next global catastrophe doesn't send humanity back into the dark ages.

Customers can choose to store data exclusively in the new Mars region, outside of any controlled jurisdictions on Earth, ensuring that they're both compliant with and benefit from the terms of the Outer Space Treaty. The ability to store and process data on Mars enables low-latency data analysis pipelines and consumer apps to serve the expected influx of Mars explorers and colonists. How exciting would it be to stream movies of potatoes growing right from the craters and dunes of our new frontier?

One of our early access customers says, “This will be a game changer for us. With GCS, we can store all the data collected from our rovers right on Mars and run big data analytics to query exabyte-scale datasets all in a matter of seconds. Our dream of colonizing Mars by 2020 can now become a reality.”
Walk inside our new data center in Google Street View
The Martian data center will become Google’s greenest facility yet by taking full advantage of its new location. The cold weather enables natural, unpowered cooling throughout the year, while the thin atmosphere and high winds allow the entire facility to be redundantly powered by entirely renewable sources.

But why stop at Mars? We're taking a moonshot at N+42 redundancy with galaxy-scale computing. While GCP is optimized for faster-than-light data coordination for databases, the Google Planets team is already hard at work mapping the rest of our solar system for future data center locations. Stay tuned and join our journey! We can’t wait to see the problems you solve and the breakthroughs you achieve.

P.S. Check out Curiosity’s journey across the Red Planet on Mars Street View.


Google hosts the Apache HBase community at HBaseCon West 2017

We’re excited to announce that Google will host and organize HBaseCon West 2017, the official conference for the Apache HBase community, on June 12. Registration for the event in Mountain View, CA is free and the call for papers (CFP) is open through April 24. Seats are limited and the CFP closes soon, so act fast.


Apache HBase is the original open source implementation of the design concepts behind Bigtable, a critical piece of internal Google data infrastructure which was first described in a 2006 research paper and earned a SIGOPS Hall of Fame award last year. Since the founding of HBase, its community has made impressive advances supporting massive scale with enterprise users including Alibaba, Apple, Facebook, and Visa. The community is fostering a rich and still-growing ecosystem including Apache Phoenix, OpenTSDB, Apache Trafodion, Apache Kylin and many others.

Now that Bigtable is available to Google Cloud users through Google Cloud Bigtable, developers have the benefit of platform choices for apps that rely on high-volume and low-latency reads and writes. Without the ability to build portable applications on open APIs, however, even that freedom of choice can lead to a dead end, something Google addresses through its investment in open standards like Apache Beam, Kubernetes and TensorFlow.

To that end, Google’s Bigtable team has been actively participating in the HBase community. We’ve helped co-author the HBase 1.0 API and have standardized on that API in Cloud Bigtable. This design choice means developers with HBase experience don’t need to learn a new API for building cloud-native applications, ensures Cloud Bigtable users have access to the large Apache Hadoop ecosystem and alleviates concerns about long-term lock-in.
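The practical upshot is that HBase-style client code carries over: a row is a key plus a mapping of `family:qualifier` columns to byte values. As a hedged sketch (the helper below is hypothetical; `happybase` is one common Python HBase client, and Cloud Bigtable offers a HappyBase-compatible layer with the same call shape):

```python
# Hypothetical helper illustrating the HBase-style mutation shape that both
# happybase and Cloud Bigtable's HappyBase-compatible layer accept.
def build_put(family: str, qualifiers: dict) -> dict:
    """Map {qualifier: value} into the b'family:qualifier' keys HBase puts use."""
    return {f"{family}:{q}".encode(): v for q, v in qualifiers.items()}

row = build_put("stats", {"clicks": b"42", "views": b"1000"})
# A real client call against either backend would then look like:
#   connection.table("events").put(b"row-1", row)
print(sorted(row))
```

Because the column-mapping convention is identical, switching backends is an import change rather than a rewrite, which is the point of standardizing on the HBase 1.0 API.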

We hope you’ll join us and the HBase community at HBaseCon West 2017. We recommend registering early as there is no registration available on site. As usual, sessions are selected by the HBase community from a pool reflecting some of the world’s largest and most advanced production deployments.

Register soon or submit a paper for HBaseCon. Remember, the CFP closes on April 24! We look forward to seeing you at the conference.

By Carter Page and Michael Stack, Apache HBase Project Management Committee members

This is not a test: Google Optimize now free — for everyone

Businesses often have one big question for us: How can they better understand their website visitors and deliver more relevant, engaging experiences?

To help businesses test and take action, last spring we launched our enterprise-class A/B testing and personalization product, Google Optimize 360. We saw great demand, so we made it more accessible with a free beta version last fall — and that response also exceeded our expectations, with over 250,000 users requesting an Optimize account.

Today we're very excited to announce that both Optimize and Optimize 360 are now out of beta. And Optimize is now immediately available to everyone — for free. This is not a test: You can start using it today.

Easy to implement 

A recent survey showed 45% of small and medium businesses don’t optimize their websites through A/B testing.1 The two most common reasons given were a "lack of employee resources" and "lack of knowledge to get started."

If you're part of that 45%, Optimize is a great choice for you. Optimize has many of the same features as Optimize 360, and it's just right for small and medium-sized businesses that need powerful testing but don't have the budget or team resources for an enterprise-level solution. Early users have been happy with how easy Optimize is to set up and use. In fact, it's built right on top of Analytics, so if you're already an Analytics user you'll add just a single line of code to get Optimize up and running. With just a few more clicks, you can start using your Analytics data to design experiments and improve the online experience for your users.

Easy to use

Worried about having to hire someone to run A/B tests on your site, or frustrated about not knowing how to do it yourself? Don't be. The Optimize visual editor allows for WYSIWYG (what-you-see-is-what-you-get) editing so you can change just about anything on your site with a drag and a drop. And more advanced users will enjoy the ability to edit raw HTML or add JavaScript or CSS rules directly in the editor.


Powerful targeting capabilities within Optimize allow you to serve the right experiences to just the right set of users. And you have flexible URL targeting capabilities to create simple or complex rules for the pages where you want your experiment to run. To find out if a targeting rule you've set will apply to a specific URL on your site, use the new Optimize URL tester. Just enter a URL and the tester will immediately tell you if that page is a match for your targeting rule.
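Conceptually, a URL tester just evaluates a targeting rule against a candidate page. As a rough illustration (a hypothetical, simplified matcher; not Optimize's actual rule engine or rule names):

```python
# Hypothetical, simplified URL-targeting matcher for illustration only.
def url_matches(rule_type: str, rule_value: str, url: str) -> bool:
    """Evaluate one targeting rule against a URL."""
    if rule_type == "equals":
        return url == rule_value
    if rule_type == "starts_with":
        return url.startswith(rule_value)
    if rule_type == "contains":
        return rule_value in url
    raise ValueError(f"unknown rule type: {rule_type}")

print(url_matches("starts_with",
                  "https://example.com/checkout",
                  "https://example.com/checkout/step-1"))  # → True
```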

Easy to understand

Optimize calculates results based on your existing Analytics metrics and objectives using advanced Bayesian methods, so the reporting shows you exactly what you need to know to make better and faster decisions.
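To make the Bayesian idea concrete, here is a minimal sketch of the kind of comparison behind A/B reporting: a generic Beta-Binomial Monte Carlo estimate of the chance that variant B's conversion rate beats A's. This is not Optimize's actual model, and the numbers are made up.

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=20000, seed=0):
    """Monte Carlo estimate of P(rate_B > rate_A) under uniform Beta(1,1) priors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Posterior for a binomial rate with a Beta(1,1) prior is
        # Beta(1 + successes, 1 + failures).
        a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += b > a
    return wins / draws

# Illustrative data: 120/1000 conversions for A vs. 160/1000 for B.
p = prob_b_beats_a(120, 1000, 160, 1000)
print(f"P(B beats A) ≈ {p:.3f}")
```

Reporting a direct probability like this, rather than a p-value, is what lets the UI answer "how likely is this variant to be better?" at a glance.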


We’ve also upgraded the improvement overview (see image above) to help you quickly see how an experiment affects the metrics you care about most, whether that means purchases, pageviews, session lengths, or whatever else you’re tracking in Analytics.

Easy to try 

Leading businesses are building a culture of growth that embraces the use of data and testing to improve the customer experience every day. We’re delighted to offer Optimize to everyone to help deliver better user experiences across the board.

As of today, Optimize is available in over 180 countries. (A special note for our European users: We’ve added a new data processing amendment to the Google Optimize Terms of Service that you may review in the UI and accept if you wish.) And we're not done yet: Keep an eye out for more improvements and announcements in the future.

What are you waiting for? Try it right now!

Happy Optimizing!

1Google Surveys, "Website Optimization Challenges for SMBs," Base: 506 Small/Medium Business Owners and Managers, Google Surveys Audience Panel, U.S., March 2017

A New Home for Google Open Source

Google Open Source logo
Free and open source software has been part of our technical and organizational foundation since Google’s early beginnings. From servers running the Linux kernel to an internal culture of being able to patch any other team's code, open source is part of everything we do. In return, we've released millions of lines of open source code, run programs like Google Summer of Code and Google Code-in, and sponsor open source projects and communities through organizations like Software Freedom Conservancy, the Apache Software Foundation, and many others.
Today, we’re launching opensource.google.com, a new website for Google Open Source that ties together all of our initiatives with information on how we use, release, and support open source.

This new site showcases the breadth and depth of our love for open source. It will contain the expected things: our programs, organizations we support, and a comprehensive list of open source projects we've released. But it also contains something unexpected: a look under the hood at how we "do" open source.

Helping you find interesting open source

One of the tenets of our philosophy towards releasing open source code is that "more is better." We don't know which projects will find an audience, so we help teams release code whenever possible. As a result, we have released thousands of projects under open source licenses ranging from larger products like TensorFlow, Go, and Kubernetes to smaller projects such as Light My Piano, Neuroglancer and Periph.io. Some are fully supported while others are experimental or just for fun. With so many projects spread across 100 GitHub organizations and our self-hosted Git service, it can be difficult to see the scope and scale of our open source footprint.

To provide a more complete picture, we are launching a directory of our open source projects which we will expand over time. For many of these projects we are also adding information about how they are used inside Google. In the future, we hope to add more information about project lifecycle and maturity.

How we do open source

Open source is about more than just code; it's also about community and process. Participating in open source projects and communities as a large corporation comes with its own unique set of challenges. In 2014, we helped form the TODO Group, which provides a forum to collaborate and share best practices among companies that are deeply committed to open source. Inspired by many discussions we've had over the years, today we are publishing our internal documentation for how we do open source at Google.

These docs explain the process we follow for releasing new open source projects, submitting patches to others' projects, and how we manage the open source code that we bring into the company and use ourselves. But in addition to the how, it outlines why we do things the way we do, such as why we only use code under certain licenses or why we require contributor license agreements for all patches we receive.

Our policies and procedures are informed by many years of experience and lessons we've learned along the way. We know that our particular approach to open source might not be right for everyone—there's more than one way to do open source—and so these docs should not be read as a "how-to" guide. Similar to how it can be valuable to read another engineer's source code to see how they solved a problem, we hope that others find value in seeing how we approach and think about open source at Google.

To hear a little more about the backstory of the new Google Open Source site, we invite you to listen to the latest episode from our friends at The Changelog. We hope you enjoy exploring the new site!

By Will Norris, Open Source Programs Office

The latest round of Google Open Source Peer Bonus winners

Google relies on open source software throughout our systems, much of it written by non-Googlers. We’re always looking for ways to say “thank you!” so 5 years ago we started asking Googlers to nominate open source contributors outside of the company who have made significant contributions to codebases we use or think are important. We’ve recognized more than 500 developers from 30+ countries who have contributed their time and talent to over 400 open source projects since the program’s inception in 2011.

Today we are pleased to announce the latest round of awardees, 52 individuals we’d like to recognize for their dedication to open source communities. The following is a list of everyone who gave us permission to thank them publicly:

  • Philipp Hancke - Adapter.js
  • Geoff Greer - Ag
  • Dzmitry Shylovich - Angular
  • David Kalnischkies - Apt
  • Peter Mounce - Bazel
  • Yuki Yugui Sonoda - Bazel
  • Eric Fiselier - benchmark
  • Rob Stradling - Certificate Transparency
  • Ke He - Chromium
  • Daniel Micay - CopperheadOS
  • Nico Huber - coreboot
  • Kyösti Mälkki - coreboot
  • Jana Moudrá - Dart
  • John Wiegley - Emacs
  • Alex Saveau - FirebaseUI-Android
  • Toke Hoiland-Jorgensen - Flent
  • Hanno Böck - Fuzzing Project
  • Luca Milanesio - Gerrit
  • Daniel Theophanes - Go programming language
  • Josh Snyder - Go programming language
  • Brendan Tracey - Go programming language
  • Elias Naur - Go on Mobile
  • Anthonios Partheniou - Google Cloud Datalab
  • Marcus Meissner - gPhoto2
  • Matt Butcher - Helm
  • Fernando Perez - Jupyter & IPython
  • Michelle Noorali - Kubernetes & Helm
  • Prosper Otemuyiwa - Laravel Hackathon Starter
  • Keith Busch - Linux kernel
  • Thomas Caswell - matplotlib
  • Tatsuhiro Tsujikawa - nghttp2
  • Anna Henningsen - Node.js
  • Charles Harris - NumPy
  • Jeff Reback - pandas
  • Ludovic Rousseau - PCSC-Lite, CCID
  • Matti Picus - PyPy
  • Salvatore Sanfilippo - Redis
  • Ralf Gommers - SciPy
  • Kevin O'Connor - SeaBIOS
  • Sam Aaron - Sonic Pi
  • Michael Tyson - The Amazing Audio Engine
  • Rob Landley - Toybox
  • Bin Meng - U-Boot
  • Ben Noordhuis - V8
  • Fatih Arslan - vim-go
  • Adam Treat - WebKit
  • Chris Dumez - WebKit
  • Sean Larkin - Webpack
  • Tobias Koppers - Webpack
  • Alexis La Goutte - Wireshark dissector for QUIC

Congratulations to all of the awardees, past and present! Thank you for your contributions.

By Helen Hu, Open Source Programs Office

Google Summer of Code 2017 student applications are open!

Are you a university student looking to learn more about open source software development? Consider applying to Google Summer of Code (GSoC) for a chance to spend your break coding on an open source project.



For the 13th straight year, GSoC will give students from around the world the opportunity to learn the ins and outs of open source software development while working from home. Students will receive a stipend for their successful contributions, allowing them to focus on their coding during the program.

Mentors are paired with the students to help address technical questions and to monitor their progress throughout the program. Former GSoC participants have told us that the real-world experience they’ve gained during the program has not only sharpened their technical skills, but has also boosted their confidence, broadened their professional network and enhanced their resumes.

Interested students can submit proposals on the program site now through Monday, April 3 at 16:00 UTC. The first step is to search through the 201 open source organizations and review the “Project ideas” for the organizations that appeal to you. Next, reach out to the organizations to introduce yourself and determine if your skills and interests are a good match with their organization.




Since spots are limited, we recommend writing a strong project proposal and submitting a draft early to receive feedback from the organization, which will help increase your chances of selection. Our Student Manual, written by former students and mentors, provides excellent advice on choosing an organization and crafting a great proposal.

For information throughout the application period and beyond, visit the Google Open Source Blog, join our Google Summer of Code discussion lists or join us on Internet Relay Chat (IRC) at #gsoc on Freenode. Be sure to read the Program Rules, Timeline and FAQ, all available on the program site, for more information about Google Summer of Code.

Good luck to all the open source coders who apply, and remember to submit your proposals early — you only have until Monday, April 3 at 16:00 UTC!

By Stephanie Taylor, Google Summer of Code Program Manager

Join us live on May 23, 2017 as we announce the latest Analytics, DoubleClick and Ads innovations

What: Google Marketing Next keynote live stream
When: Tuesday, May 23, 9:00 a.m. PT/12:00 p.m. ET.
Duration: 1 hour
Where: Here on the Google Analytics Blog

Be the first to hear about Google’s latest marketing innovations, the moment they’re announced. Watch live as my team and I share new Ads, Analytics and DoubleClick innovations designed to improve your ability to reach consumers, simplify campaign measurement and increase your productivity. We’ll also give you a sneak peek at how brands are starting to use the Google Assistant to delight customers.

Register for the live stream here.

Until then, follow us on Twitter, Google+, Facebook and LinkedIn for previews of what’s to come.

Getting ready for Google Summer of Code 2017

Spring is just around the corner here in the Northern Hemisphere and Google Summer of Code is fast approaching. If you are a student interested in participating this year, now is the time to prepare -- read on for tips on how to get ready.

This year we’ve accepted 201 open source organizations into the program, nearly 40 of which are new to the program. The organizations cover a wide range of topics including (but certainly not limited to!):

  • Operating systems
  • Web application frameworks
  • Healthcare and bioinformatics
  • Music and graphic design
  • Machine learning
  • Robotics
  • Security




How should you prepare for Google Summer of Code?

While student applications don’t open until March 20th at 16:00 UTC, you need to decide which projects you’re interested in and what you’ll propose. You should also communicate with those projects to learn more before you apply.

Start by looking at the list of participating projects and organizations. You can explore by searching for specific names or technologies, or filtering by topics you are interested in. Follow the “Learn More” link through to each organization’s page for additional information.

Once you’ve identified the organizations that you’re interested in, take a look at their ideas list to get a sense of the specific projects you could work on. Typically, you will choose a project from that list and write a proposal based on that idea, but you could also propose something that’s not on that list.

You should reach out to the organizations after you’ve decided what you want to work on. Doing this can make the difference between a good application and a great application.

Whatever you do, don’t wait until March 20th to begin preparing for Google Summer of Code! History has shown that students who reach out to organizations before the start of the application period have a higher chance of being accepted into the program, as they have had more time to talk to the organizations and understand what they are looking for with the project.

If you have any questions along the way, take a look at the Student Manual, FAQ and Timeline. If you can’t find the answer to your question, try taking your question to the mailing list.

By Josh Simmons, Open Source Programs Office

Google Cloud Platform: your Next home in the cloud



San Francisco: Today at Google Cloud Next ‘17, we’re thrilled to announce new Google Cloud Platform (GCP) products, technologies and services that will help you imagine, build and run the next generation of cloud applications on our platform.

Bring your code to App Engine, we’ll handle the rest

In 2008, we launched Google App Engine, a pioneering serverless runtime environment that lets developers build web apps, APIs and mobile backends at Google-scale and speed. For nearly 10 years, some of the most innovative companies have built applications that serve their users all over the world on top of App Engine. Today, we’re excited to announce the general availability of a major expansion of App Engine centered around openness and developer choice, one that keeps App Engine’s original promise to developers: bring your code, we’ll handle the rest.

App Engine now supports Node.js, Ruby, Java 8, Python 2.7 or 3.5, Go 1.8, plus PHP 7.1 and .NET Core, both in beta, all backed by App Engine’s 99.95% SLA. Our managed runtimes make it easy to start with your favorite languages and use the open source libraries and packages of your choice. Need something different than what’s out of the box? Break the glass and go beyond our managed runtimes by supplying your own Docker container, which makes it simple to run any language, library or framework on App Engine.
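As a sketch of what "break the glass" looks like in practice, a container-based deployment is declared in the app's configuration file alongside a Dockerfile (illustrative only; consult the App Engine docs for the exact fields your app needs):

```yaml
# Illustrative app.yaml for App Engine with a custom (Docker-based) runtime.
runtime: custom   # build and run the Dockerfile next to this file
env: flex         # custom runtimes run on the flexible environment
```

With `runtime: custom`, App Engine builds the container you supply, so any language, library or framework that runs in Docker can run on the platform.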

The future of cloud is open: take your app to-go by having App Engine generate a Docker container containing your app and deploy it to any container-based environment, on or off GCP. App Engine gives developers an open platform while still providing a fully managed environment where developers focus only on code and on their users.


Cloud Functions public beta at your service

Up one level from fully managed applications, we’re launching Google Cloud Functions into public beta. Cloud Functions is a completely serverless environment to build and connect cloud services without having to manage infrastructure. It’s the smallest unit of compute offered by GCP and is able to spin up a single function and spin it back down instantly. Because of this, billing occurs only while the function is executing, metered to the nearest one hundred milliseconds.

Cloud Functions is a great way to build lightweight backends, and to extend the functionality of existing services. For example, Cloud Functions can respond to file changes in Google Cloud Storage or incoming Google Cloud Pub/Sub messages, perform lightweight data processing/ETL jobs or provide a layer of logic to respond to webhooks emitted by any event on the internet. Developers can securely invoke Cloud Functions directly over HTTP right out of the box without the need for any add-on services.
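The programming model is just an event handler. As a hedged sketch of that shape (Cloud Functions launched with a Node.js runtime; this Python version, with a made-up event dict, only illustrates the pattern, not the actual Cloud Functions API):

```python
# Illustration of the event-driven shape: a handler reacting to a
# hypothetical storage-change event. Not the real Cloud Functions signature.
def on_upload(event: dict) -> str:
    """Pretend trigger: act only on image uploads, ignore everything else."""
    name = event.get("name", "")
    if name.endswith((".png", ".jpg")):
        return f"processing {name}"
    return "ignored"

print(on_upload({"name": "cat.png"}))  # → processing cat.png
```

The platform handles wiring the event source (a bucket change, a Pub/Sub message, an HTTP request) to the function and scaling instances up and down around it.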

Cloud Functions is also a great option for mobile developers using Firebase, allowing them to build backends integrated with the Firebase platform. Cloud Functions for Firebase handles events emitted from the Firebase Realtime Database, Firebase Authentication and Firebase Analytics.

Growing the Google BigQuery universe: introducing BigQuery Data Transfer Service

Since our earliest days, our customers have turned to Google to promote their advertising messages around the world, at a scale that was previously unimaginable. Today, those same customers want to use BigQuery, our powerful data analytics service, to better understand how users interact with those campaigns. To that end, we’ve developed deeper integration between broader Google and GCP with the public beta of the BigQuery Data Transfer Service, which automates data movement from select Google applications directly into BigQuery. With the BigQuery Data Transfer Service, marketing and business analysts can easily export data from AdWords, DoubleClick and YouTube directly into BigQuery, making it available for immediate analysis and visualization using the extensive set of tools in the BigQuery ecosystem.

Slashing data preparation time with Google Cloud Dataprep

Our goal is to make it easy to import data into BigQuery while keeping it secure. Google Cloud Dataprep is a new serverless, browser-based service that can dramatically cut the time it takes to prepare data for analysis, which represents about 80% of the work that data scientists do. It intelligently connects to your data source, identifies data types, flags anomalies and suggests data transformations. Data scientists can then visualize their data schemas until they're happy with the proposed transformations. Dataprep then creates a data pipeline in Google Cloud Dataflow, cleans the data and exports it to BigQuery or other destinations. In other words, you can now prepare structured and unstructured data for analysis with clicks, not code. For more information on Dataprep, apply to be part of the private beta. You'll also find more news about our latest database and data analytics capabilities here and here.

Hello, (more) world

Not only are we working hard on bringing you new products and capabilities, but we also want your users to access them quickly and securely, wherever they may be. That’s why we’re announcing three new Google Cloud Platform regions: California, Montreal and the Netherlands. These will bring the total number of Google Cloud regions from six today to more than 17 locations in the future. The new regions will deliver lower latency for customers in adjacent geographic areas, increased scalability and more disaster recovery options. Like other Google Cloud regions, the new regions will feature a minimum of three zones, benefit from Google’s global, private fibre network and offer a complement of GCP services.

Supercharging our infrastructure . . .

Customers run demanding workloads on GCP, and we're constantly striving to improve the performance of our VMs. For instance, we were honored to be the first public cloud provider to run Intel Skylake, a custom Xeon chip that delivers significant enhancements for compute-heavy workloads and a larger range of VM memory and CPU options.

We’re also doubling the number of vCPUs you can run in an instance from 32 to 64 and now offering up to 416GB of memory, which customers have asked us for as they move large enterprise applications to Google Cloud. Meanwhile, we recently began offering GPUs, which provide substantial performance improvements to parallel workloads like training machine learning models.

To continually unlock new energy sources, Schlumberger collects large quantities of data to build detailed subsurface earth models based on acoustic measurements, and GCP compute infrastructure has the unique characteristics that match Schlumberger's needs to turn this data into insights. High performance scientific computing is integral to its business, so GCP's flexibility is critical.

Schlumberger can mix and match GPUs and CPUs and dynamically create different shapes and types of virtual machines, choosing memory and storage options on demand.

"We are now leveraging the strengths offered by cloud computation stacks to bring our data processing to the next level," said Ashok Belani, Executive Vice President Technology, Schlumberger.

. . . without supercharging our prices

We aim to keep costs low. Today we announced Committed Use Discounts that provide up to 57% off the list price on Google Compute Engine, in exchange for a one or three year purchase commitment. Committed Use Discounts are based on the total amount of CPU and RAM you purchase, and give you the flexibility to use different instance and machine types; they apply automatically, even if you change instance types (or size). There are no upfront costs with Committed Use Discounts, and they are billed monthly. What’s more, we automatically apply Sustained Use Discounts to any additional usage above a commitment.
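As back-of-the-envelope arithmetic (with a made-up $100/month list price; the 57% figure is the maximum discount quoted above), the effective committed rate works out like this:

```python
# Illustrative arithmetic for a committed use discount (hypothetical prices).
def committed_price(list_price: float, discount: float = 0.57) -> float:
    """Monthly price after a committed use discount, rounded to cents."""
    return round(list_price * (1 - discount), 2)

print(committed_price(100.0))  # a $100/month list price at 57% off → 43.0
```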

We're also dropping prices for Compute Engine. The specific cuts vary by region. Customers in the United States will see a 5% price drop; customers in Europe will see a 4.9% drop and customers using our Tokyo region an 8% drop.

Then there’s our improved Free Tier. First, we’ve extended the free trial from 60 days to 12 months, allowing you to use your $300 credit across all GCP services and APIs, at your own pace and on your own schedule. Second, we’re introducing new Always Free products: non-expiring usage limits that you can use to test and develop applications at no cost. New additions include Compute Engine, Cloud Pub/Sub, Google Cloud Storage and Cloud Functions, bringing the number of Always Free products up to 15 and broadening the horizons for developers getting started on GCP. Visit the Google Cloud Platform Free Tier page today for further details, terms, eligibility and to sign up.

We'll be diving into all of these product announcements in much more detail in the coming days, so stay tuned!

Real-time just got real: Google Analytics 360 offers fresher insight

You’ve just launched a website or feature. Your toe is already tapping. Wait, wait, wait — you can hardly wait one hour to see exactly how it’s performing. Sound familiar? If you’ve been there, we have exciting news for you.

Google Analytics 360 can now provide updated insights as quickly as every 10 minutes. We’re proud to give our customers the fastest access to the freshest first-party data Google Analytics has ever offered.

What did you just say?!
If you need to know how your sites, microsites, or digital engagements are doing right now, we’ve got you covered. Most first-party data in Analytics 360 can now be collected, processed, and available — via our UI, API, and BigQuery integration (coming soon) — in as fast as 10 minutes. This means you can move faster to:
  • Fix things when they’re broken
  • Detect trends and react when things are popular
  • Understand and take action on the impact of cultural events or social memes
To see how fresh the data is in your report at any time, just look for this icon in the upper right:
When you see this icon, it means you’re looking at today’s data and the report is supported and super fresh. Hover over the icon to see how fresh the data is!

This new level of freshness is only available to Analytics 360 users. To learn more about which reports, views, and properties support fresher data, and the factors affecting data freshness, check out our help center.

Some site owners just can’t wait
Brands and sites in the business of capitalizing on momentary consumer attention are excited about fresher insights. Take the case of publishers and retailers as an example.

Publishers strive to put the richest, most interesting content in front of users at any given point in time. The trick is understanding what’s rich and interesting right now — and that’s a constantly moving target.

Publishers have long referenced our real-time Google Analytics reports to make decisions, but sometimes they’re looking for deeper insight than what is provided in those reports. Fresher insights across additional Google Analytics reports help our publishers make even more informed content decisions, paving the way to better user acquisition, user engagement, and a stronger relationship between content consumer and publisher brand.

Online retailers are in the same boat. When celebrities wear a product or mention a brand on social media, product interest may spike. Retailers may have just minutes to capitalize on purchase intent before it wanes.

When a product’s popularity is on the rise, retailers can react by upping its prominence to capture interest, running focused promotions or recommending related products to expand consideration. With fresh insights available as soon as every 10 minutes, retailers move faster and turn trending interest into sales.

Speed is good, but safety comes first
As you know, Google Analytics can pull in data from other sources like AdWords and DoubleClick. We refer to these as “integration sources,” and they operate with additional requirements, like fraud detection, which means the data in these reports is exempt from our enhanced freshness capabilities.

For example, any report with Ads data, including a dimension widened by an Ads integration, will continue to be made available within hours. For further details on which reports are supported or not supported, please read the help center article here.