How AI is making information more useful

Today, there’s more information accessible at people’s fingertips than at any point in human history. And advances in artificial intelligence will radically transform the way we use that information, uncovering new insights that can help us both in our daily lives and in tackling complex global challenges.


At our Search On livestream event today, we shared how we’re bringing the latest in AI to Google’s products, giving people new ways to search and explore information in more natural and intuitive ways.


Making multimodal search possible with MUM
Earlier this year at Google I/O, we announced that we’d reached a critical milestone in understanding information with the Multitask Unified Model, or MUM for short.


We’ve been experimenting with using MUM’s capabilities to make our products more helpful and enable entirely new ways to search. Today, we’re sharing an early look at what will be possible with MUM.


In the coming months, we’ll introduce a new way to search visually, with the ability to ask questions about what you see. Here are a couple of examples of what will be possible with MUM.
With this new capability, you can tap on the Lens icon when you’re looking at a picture of a shirt, and ask Google to find you the same pattern — but on another article of clothing, like socks. This helps when you’re looking for something that might be difficult to describe accurately with words alone. You could type “white floral Victorian socks,” but you might not find the exact pattern you’re looking for. By combining images and text into a single query, we’re making it easier to search visually and express your questions in more natural ways.
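To make the idea concrete, here’s a minimal sketch of what “combining images and text into a single query” can mean, assuming hypothetical image and text encoders that map both inputs into one shared embedding space. The encoders, catalog and scoring below are illustrative stand-ins, not Google’s implementation:

```python
import numpy as np

# Hypothetical encoders that map images and text into one shared vector space.
# In a real multimodal system these would be learned models; here they're stubs.
def encode_image(pixels: np.ndarray) -> np.ndarray:
    return np.random.default_rng(0).standard_normal(512)

def encode_text(text: str) -> np.ndarray:
    return np.random.default_rng(hash(text) % 2**32).standard_normal(512)

def multimodal_query(pixels: np.ndarray, text: str) -> np.ndarray:
    """Fuse the image and the text into a single query vector."""
    q = encode_image(pixels) + encode_text(text)
    return q / np.linalg.norm(q)

# Toy catalog, embedded with the text encoder for simplicity.
catalog = {name: encode_text(name) for name in
           ["white floral socks", "striped socks", "floral shirt", "plain tote"]}

def search(query: np.ndarray, k: int = 2) -> list[str]:
    """Rank catalog items by cosine similarity to the fused query."""
    ranked = sorted(catalog, key=lambda n: -float(
        np.dot(query, catalog[n] / np.linalg.norm(catalog[n]))))
    return ranked[:k]

shirt_photo = np.zeros((224, 224, 3))  # stand-in for the picture of the shirt
print(search(multimodal_query(shirt_photo, "the same pattern, but on socks")))
```

The point is the shape of the query: one vector built from both inputs, so results can match the picture’s pattern and the words’ intent at the same time.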
Some questions are even trickier: Your bike has a broken thingamajig, and you need some guidance on how to fix it. Instead of poring over catalogs of parts and then looking for a tutorial, the point-and-ask mode of searching will make it easier to find the exact moment in a video that can help.


Helping you explore with a redesigned Search page
We’re also announcing how we’re applying AI advances like MUM to redesign Google Search. These new features are the latest steps we’re taking to make searching more natural and intuitive.


First, we’re making it easier to explore and understand new topics with “Things to know.” Let’s say you want to decorate your apartment, and you’re interested in learning more about creating acrylic paintings.
If you search for “acrylic painting,” Google understands how people typically explore this topic, and shows the aspects people are likely to look at first. For example, we can identify more than 350 topics related to acrylic painting, and help you find the right path to take.


We’ll be launching this feature in the coming months. In the future, MUM will unlock deeper insights you might not have known to search for — like “how to make acrylic paintings with household items” — and connect you with content on the web that you wouldn’t have otherwise found.
Second, to help you further explore ideas, we’re making it easy to zoom in and out of a topic with new features to refine and broaden searches.


In this case, you can learn more about specific techniques, like puddle pouring, or art classes you can take. You can also broaden your search to see other related topics, like other painting methods and famous painters. These features will launch in the coming months.
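The refine-and-broaden interaction can be pictured as walking a topic graph, where children make a topic more specific and the parent makes it more general. Here is a minimal sketch with a hand-built graph; the topics and structure are invented for illustration:

```python
# Tiny hand-built topic graph: children refine a topic, the parent broadens it.
TOPIC_GRAPH = {
    "painting": ["acrylic painting", "oil painting", "watercolor"],
    "acrylic painting": ["puddle pouring", "acrylic painting classes",
                         "acrylic painting step by step"],
}
PARENT = {child: parent for parent, kids in TOPIC_GRAPH.items() for child in kids}

def refine(topic: str) -> list[str]:
    """Zoom in: more specific subtopics, if any are known."""
    return TOPIC_GRAPH.get(topic, [])

def broaden(topic: str) -> str | None:
    """Zoom out: the broader topic this one belongs to, if any."""
    return PARENT.get(topic)

print(refine("acrylic painting"))   # ['puddle pouring', ...]
print(broaden("acrylic painting"))  # 'painting'
```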
Third, we’re making it easier to find visual inspiration with a newly designed, browsable results page. If puddle pouring caught your eye, just search for “pour painting ideas” to see a visually rich page full of ideas from across the web, with articles, images, videos and more that you can easily scroll through.

This new visual results page is designed for searches that are looking for inspiration, like “Halloween decorating ideas” or “indoor vertical garden ideas,” and you can try it today.

Get more from videos
We already use advanced AI systems to identify key moments in videos, like the winning shot in a basketball game, or steps in a recipe. Today, we’re taking this a step further, introducing a new experience that identifies related topics in a video, with links to easily dig deeper and learn more.
Using MUM, we can even show related topics that aren’t explicitly mentioned in the video, based on our advanced understanding of its content. In this example, while the video doesn’t say the words “macaroni penguin’s life story,” our systems understand that the topics it covers relate to that story, like how macaroni penguins find their family members and evade predators. The first version of this feature will roll out in the coming weeks, and we’ll add more visual enhancements in the coming months.
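One way to picture how a system can surface topics a video never says out loud: summarize the video as an embedding, compare it against a bank of topic embeddings, and keep the closest matches. The sketch below uses random vectors as stand-ins and is not Google’s system:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in embeddings: one vector summarizing the video, one per candidate topic.
video_vec = rng.standard_normal(64)
topic_bank = {topic: rng.standard_normal(64) for topic in
              ["macaroni penguin’s life story", "basketball highlights",
               "recipe steps", "antarctic wildlife"]}

def related_topics(video: np.ndarray, bank: dict[str, np.ndarray],
                   threshold: float = 0.0) -> list[tuple[str, float]]:
    """Return topics whose embedding is similar enough to the video's."""
    v = video / np.linalg.norm(video)
    hits = [(name, float(np.dot(v, vec / np.linalg.norm(vec))))
            for name, vec in bank.items()]
    return sorted([h for h in hits if h[1] > threshold], key=lambda h: -h[1])

print(related_topics(video_vec, topic_bank))
```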


Across all these MUM experiences, we look forward to helping people discover more web pages, videos, images and ideas that they may not have come across or otherwise searched for.

A more helpful Google
The updates we’re announcing today don’t end with MUM, though. We’re also making it easier to shop from the widest range of merchants, big and small, no matter what you’re looking for. And we’re helping people better evaluate the credibility of information they find online. Plus, for the moments that matter most, we’re finding new ways to help people get access to information and insights.


All this work not only helps people around the world, but creators, publishers and businesses as well. Every day, we send visitors to well over 100 million different websites, and every month, Google connects people with more than 120 million businesses that don't have websites, by enabling phone calls, driving directions and local foot traffic.


As we continue to build more useful products and push the boundaries of what it means to search, we look forward to helping people find the answers they’re looking for, and inspiring more questions along the way.



Posted by Prabhakar Raghavan, Senior Vice President

How 5 cities plan to use Tree Canopy to fight climate change

Planting trees in cities helps provide shade, lower temperatures and contribute to cleaner air — all of which are huge benefits when it comes to adapting to the effects of climate change. That’s why we’re expanding our Environmental Insights Explorer Tree Canopy Insights to more than 100 cities around the world next year, helping local governments fight climate change. We chatted with city officials in Los Angeles, Louisville, Chicago, Austin and Miami to learn more about how they plan to use Tree Canopy Insights to build thriving, sustainable cities in 2021 and beyond.

Los Angeles


Tree canopy coverage in Los Angeles

Los Angeles was the first city to pilot Tree Canopy Insights. Since then, it’s become an essential part of the city’s goal to increase tree canopy coverage by 50% by 2028 in the areas with the highest need. The city is working to plant 90,000 trees this year, and Tree Canopy Insights helps it prioritize which neighborhoods need tree shade the most.

Rachel Malarich, Los Angeles’ City Forest Officer, and her team use Tree Canopy Insights alongside their inventory system to look at canopy acreage projections, current canopy cover and temperatures. The land use types within the tool allow them to consider the type of outreach needed and the opportunities that exist in a given neighborhood. Most importantly, it helps Rachel and her team know which program initiatives are working and which aren’t.
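As a rough illustration of the kind of prioritization described here, combining canopy cover and temperature into a single planting-priority score might look like the following; the neighborhoods, numbers and weighting are invented for the example:

```python
# Invented sample data: neighborhood -> (canopy cover %, summer surface temp °F).
neighborhoods = {
    "Neighborhood A": (8.0, 98.5),
    "Neighborhood B": (22.0, 91.0),
    "Neighborhood C": (13.5, 95.0),
}

def planting_priority(canopy_pct: float, temp_f: float) -> float:
    """Higher score means higher need: little shade plus high heat ranks first."""
    return (100 - canopy_pct) + 2 * (temp_f - 90)

for name, (canopy, temp) in sorted(neighborhoods.items(),
                                   key=lambda kv: -planting_priority(*kv[1])):
    print(f"{name}: {canopy}% canopy, {temp}°F")
```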

“Tree Canopy Insights’ ability to give us timely feedback allows me to have data to make arguments for changes to the City’s policies and procedures, as well as potentially see the impact of different outreach activities going forward.” - Rachel Malarich, Los Angeles City Forest Officer

Louisville



Tree canopy coverage in Louisville

Similar to other cities, Louisville officials found that monitoring tree coverage on their own was hugely expensive and time intensive. Sometimes it took years to get the accurate, up-to-date data needed to make decisions. 

With Tree Canopy Insights, they’ve been able to glean actionable insights about tree cover faster. In just a few weeks, they pinpointed that the west side of town was losing tree shade at an unprecedented rate and jump-started a plan to plant more trees in the area.

“Planting trees is one of the simplest ways we can reduce the impacts and slow the progress of climate change on our city. With support from Google’s Tree Canopy Insights, Louisville can enhance its ongoing surveillance of hot spots and heat islands and understand the impact of land use and development patterns on tree canopy coverage.” – Louisville Mayor Greg Fischer

Austin


Tree canopy coverage in Austin

Austin’s summers are hot, with temperatures regularly climbing above 90 degrees. Using Tree Canopy Insights, Marc Coudert, an environmental program manager for the city, noticed a troubling trend: ambient temperatures were higher in the eastern part of the city, known as the Eastern Crescent. With these insights, Marc and the city’s forestry team developed Austin’s Community Tree Priority Map and doubled down on planting trees in Eastern Crescent neighborhoods to ensure equitable tree canopy coverage across the city.

“At the city of Austin, we’re committed to making data-backed decisions that bring equity to all of our communities. Google’s Tree Canopy Insights empowers us to do exactly that.” - Austin Mayor Steve Adler

Chicago


Tree canopy coverage in Chicago

Chicago’s Department of Public Health understands that planting trees is an essential part of promoting health and racial equity. After all, a lack of trees is associated with chronic conditions like asthma and heart disease, as well as mental health conditions. With Tree Canopy Insights, the department discovered that its hottest neighborhoods are often also the most disadvantaged — making these communities extremely vulnerable. With this tool, the City of Chicago is committed to focusing its tree-planting efforts on these high-risk areas.

"Trees not only provide our city with shade, green spaces and beauty, but they are also precious resources that produce clean air — making them key to shaping our sustainable future. Through this partnership with Google, our sustainability and public health teams will have access to real-time insights on our tree coverage that will inform how we develop and execute our equitable approach to building a better Chicago landscape. I look forward to seeing how this technology uses our city's natural resources to benefit all of our residents."  - Chicago Mayor Lori E. Lightfoot.

Miami


Tree canopy coverage in Miami

Miami gets over 60 inches of rain per year, which can lead to devastating flooding and infrastructure damage. To address this, the city recently launched its Stormwater Master Plan. The multi-year initiative has already resulted in over 4,000 trees planted, translating to an additional 400,000 gallons of water-absorption capacity per day. Moving forward, the city plans to use Tree Canopy Insights to evolve and improve this plan.

“Google’s Tree Canopy Insights is going to help us build on the progress of our Stormwater Master Plan in smarter, more effective ways. We believe that every city needs to be a ‘tech city,’ and leveraging Google’s AI capabilities to improve every Miamian’s quality of life is exactly what I mean by that.” – Miami Mayor Francis Suarez

If you’re part of a local government and think Tree Canopy Insights could help your community, please get in touch with our team by filling out this form.


Helpful Search tools for evaluating information online

Whether you’re looking for facts about the COVID vaccine or information on how to apply for a loan, having access to relevant, credible information is crucial. People turn to Google for trustworthy, high-quality results, especially when it matters most.

That’s why we design our ranking systems to prioritize the most useful, highest quality content and provide direct access to reliable information for important topics. We’re also looking into new ways to give you more context about the information you find online, and introducing more information literacy features, based on research and best practices from experts. 

More insights from About This Result

Earlier this year, we launched the About This Result feature, which provides details about a website before you visit it, including its description, when it was first indexed and whether your connection to the site is secure. In the coming weeks, we’re expanding these panels to help you learn more about the sources and topics you find on Search. 

We’re bringing new and important insights to About This Result. When you tap the three dots on any search result, you’ll be able to learn more about the page. You can: 

  • See more information about the source: In addition to seeing a source description from Wikipedia, you’ll also be able to read what a site says about itself in its own words, when that information is available.
  • Find what others on the web have said about a site: Reading what others on the web have written about a site, such as news, reviews and other helpful background context, can help you better evaluate sources.
  • Learn more about the topic: In the “About the topic” section, you can find information such as top news coverage or results about the same topic from other sources.

People don’t just come to Google looking for quick facts. They often really want to explore the information that’s out there, and learn about where it’s coming from — especially in situations where there’s a source they may not be familiar with. We want to make it easier to evaluate information with this update to About This Result, which will be rolling out in the coming weeks in English in the United States. And we’re working to bring About This Result to more countries around the world. 

Phone screen showing content advisory for rapidly changing results.

Empowering you with context

There are a range of other Google tools that help people evaluate the credibility of information online. For instance, we make it easy to spot fact checks published by independent, authoritative sources on the web. We highlight relevant fact checks on results in Search, News and Google Images. These fact check features have received billions of impressions in Search this year alone.
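Publishers surface these fact checks to Search through schema.org ClaimReview structured data embedded in their pages. Here is a minimal sketch of that markup, generated with Python for illustration; the claim, organization and URL are placeholders:

```python
import json

# Placeholder fact-check data; real values come from the publisher's article.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.com/fact-checks/example-claim",
    "claimReviewed": "Example claim being checked",
    "author": {"@type": "Organization", "name": "Example Fact Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1, "bestRating": 5, "worstRating": 1,
        "alternateName": "False",  # the textual verdict shown alongside results
    },
}

# Embedded in the page's HTML as JSON-LD so crawlers can pick it up.
print(f'<script type="application/ld+json">{json.dumps(claim_review)}</script>')
```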

One of the most important pieces of context we can provide is letting you know when helpful or relevant information isn’t available on the web just yet. This could be true in a rapidly evolving event, where interest in a topic can often travel faster than the facts. Or when relevant information simply doesn’t exist for your search. In these moments, we alert you with a notice recommending that you check back later or try another search. 

With each of these tools, our goal is to offer simple, useful ways for you to evaluate and make sense of the information you find online. We’ll continue to look for new ways to improve and add to these features and make sure they’re easy to find and use. 

New ways to find shopping inspiration on Google

Shopping online is as much about inspiration and discovery as it is about the final purchase. People are shopping across Google more than a billion times a day, and we have been working to make those experiences even more helpful by expanding your options. We’re here to help you find new ideas, discover unique products or get the best value from the widest possible range of merchants — from large retailers, marketplaces and well-known brands, to local stores and new direct-to-consumer companies. 


We’ve made a number of changes over the last couple of years to improve your shopping experience, including giving you more choice when you shop on Google. For example, we’ve made it free for merchants to list on Google and made it easy for sellers on Shopify and other digital platforms to start selling on Google, so their products and inventory are discoverable for shoppers.


And today, we’re adding new tools to make it easier for shoppers to browse for inspiration, find new products and brands and ultimately find what they’re looking for in a more visual way. 


Shop in the moment with Google Lens 

We know that inspiration can strike at any time. Whether it’s an image that you see online, a photo you saved on your phone or something in the real world that catches your eye, Google Lens makes the products you see instantly shoppable. 

Starting soon, iOS users will see a new button in the Google app to make all the images on a page searchable through Google Lens. Now, finding this lamp or that shirt (and ones like it) is just a tap away.

We’re also bringing Lens to Chrome on your desktop. Soon, you will be able to select images, video and text content on a website with Lens to quickly see search results in the same tab — without leaving the page you’re on.

Looking at ApartmentTherapy.com from the Google app for iOS, tap the “search images” button at the bottom in order to see information about the products on the screen, as well as similar products.

Shop in the moment with Lens in the Google app for iOS

Window shop right from Search 

Starting today, we’re making it easier to browse for clothing, shoes and accessories on mobile right from your Search results. For example, when you search for “cropped jackets,” we’ll show you a visual feed of jackets in various colors and styles, alongside other helpful information like local shops, style guides and videos. From there, you can easily filter your search by style, department, brand and more – and when you find something you like, you can check out ratings, reviews and even compare prices to get the best deal. 

This new experience is powered by Google’s Shopping Graph, a comprehensive, real-time dataset of products, inventory and merchants with more than 24 billion listings. This not only helps us connect shoppers with the right products for them, but also helps millions of merchants and brands get discovered on Google every day.

A search for “cropped jackets” shows a visual, scrollable results page with products and helpful information like styling guides

Browse and explore options for cropped jackets on mobile right from Search

Search in-store inventory from home

Shoppers are increasingly starting their in-person shopping experience online. Before heading out the door, you can find local stores that carry the products you want right from Search. And starting today, when you are looking for products like “kids bike helmet” or even a specific brand, you can select the “in stock” filter to see only the nearby stores that have it on their shelves.

A search for “kids bike helmet near me” using the new “in stock” filter shows retailers in San Francisco with kids bike helmets on their shelves, clicking into Mike’s Bikes of San Francisco.

You can now use the “in stock” filter to see only the nearby stores with a specific item on their shelves.
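In-store availability like this is driven by product data that merchants share, whether through inventory feeds or on-page markup. As a rough sketch, here is what a listing might look like in schema.org Product/Offer vocabulary, with an offer declaring InStock availability; the product, price and exact fields any given filter uses are placeholder assumptions:

```python
import json

# Placeholder listing; real values would come from the merchant's inventory.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Kids bike helmet",
    "offers": {
        "@type": "Offer",
        "price": "29.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",  # item is on the shelf
    },
}
print(f'<script type="application/ld+json">{json.dumps(product)}</script>')
```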

Showing in-store availability is especially valuable for small businesses, helping them attract new local customers. For example, a mother-daughter duo in Greenville, South Carolina discovered local toy store Hollipops Fine Toys and Gifts after searching for “squishmallows” near them. Check out their story (and find out what a “squishmallow” is).

Throughout the entire process — from the first spark of an idea, to the final purchase — Google is helping to breathe new life into shopping experiences.
