Category Archives: Google Webmaster Central Blog

Official news on crawling and indexing sites for the Google index

The Rich Results Test is out of beta

Today we are announcing that the Rich Results Test fully supports all Google Search rich result features - it’s out of beta 🥳. In addition, we are preparing to deprecate the Structured Data Testing Tool 👋 - it will still be available for the time being, but going forward we strongly recommend you use the Rich Results Test to test and validate your structured data.

Rich results are experiences on Google Search that go beyond the standard blue link. They’re powered by structured data and can include carousels, images, or other non-textual elements. Over the last couple of years we’ve developed the Rich Results Test to help you test your structured data and preview your rich results.

Here are some reasons the new tool will serve you better:
  • It shows which Search feature enhancements are valid for the markup you are providing
  • It handles dynamically loaded structured data markup more effectively
  • It renders both mobile and desktop versions of a result
  • It is fully aligned with Search Console reports
You can use the Rich Results Test to test a code snippet or the URL of a page. The test returns the errors and warnings we detect on your page. Errors disqualify your page from showing up as a rich result; warnings might limit its appearance, but the page remains eligible. For example, if there was a warning for a missing image property, that page could still appear as a rich result, just without an image.
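To make the error-versus-warning distinction concrete, here is a minimal sketch of how such a check could work for a JSON-LD Article snippet. This is not Google's actual validation logic, and the required/recommended property lists are illustrative assumptions, not the official rules:

```javascript
// Illustrative sketch only: these property lists are assumptions,
// not Google's actual rich result requirements.
const REQUIRED = ["headline", "datePublished"]; // missing one -> error
const RECOMMENDED = ["image", "author"];        // missing one -> warning

function checkArticle(markup) {
  const errors = REQUIRED.filter((p) => !(p in markup));
  const warnings = RECOMMENDED.filter((p) => !(p in markup));
  return {
    errors,
    warnings,
    // Errors disqualify the page; warnings alone leave it eligible.
    eligible: errors.length === 0,
  };
}

// An Article with no image: still eligible, but flagged with a warning.
const result = checkArticle({
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "The Rich Results Test is out of beta",
  datePublished: "2020-07-01",
  author: { "@type": "Person", name: "Example Author" },
});

console.log(result.eligible); // true
console.log(result.warnings); // [ 'image' ]
```

The point of the sketch is simply that the two severity levels behave differently: only the required list gates eligibility.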

Here are some examples of what you’ll see when using the tool.

Image: valid structured data on Rich Results Test

Image: code explorer showing error on Rich Results Test

Image: search preview on Rich Results Test

Learn more about the Rich Results Test, and let us know if you have any feedback either through the Webmasters help community or Twitter.

Posted by Moshe Samet, Search Console Product Manager

How spam reports are used at Google

Thanks to our users, we receive hundreds of spam reports every day. While many of the spam reports lead to manual actions, they represent a small fraction of the manual actions we issue. Most of the manual actions come from the work our internal teams regularly do to detect spam and improve search results. Today we're updating our Help Center articles to better reflect this approach: we use spam reports only to improve our spam detection algorithms.

Spam reports play a significant role: they help us understand where our automated spam detection systems may be missing coverage. Most of the time, it's much more impactful for us to fix an underlying issue with our automated detection systems than it is to take manual action on a single URL or site.

In theory, if our automated systems were perfect, we would catch all spam and not need reporting systems at all. The reality is that while our spam detection systems work well, there’s always room for improvement, and spam reporting is a crucial resource to help us with that. Spam reports in aggregate form help us analyze trends and patterns in spammy content to improve our algorithms.

Overall, one of the best approaches to keeping spam out of Search is to rely on high-quality content created by the web community and on our ability to surface it through ranking. You can learn more about our approach to improving Search and generating great results at our How Search Works site. Content owners and creators can also learn how to create high-quality content to be successful in Search through our Google Webmasters resources. Our spam detection systems work with our regular ranking systems, and spam reports help us continue to improve both, so we very much appreciate them.

If you have any questions or comments, please let us know on Twitter.

Posted by Gary

How we fought Search spam on Google – Webspam Report 2019

Every search matters. That is why, whenever you come to Google Search to find relevant and useful information, we are committed to making sure you receive the highest quality results possible.

Unfortunately, on the web there are some disruptive behaviors and content that we call "webspam" that can degrade the experience for people coming to find helpful information. We have a number of teams who work to prevent webspam from appearing in your search results, and it’s a constant challenge to stay ahead of the spammers. At the same time, we continue to engage with webmasters to ensure they’re following best practices and can find success on Search, making great content available on the open web.

Looking back at last year, here’s a snapshot of how we fought spam on Search in 2019, and how we supported the webmaster community.

Fighting Spam at Scale

With hundreds of billions of webpages in our index serving billions of queries every day, perhaps it’s not too surprising that there continue to be bad actors who try to manipulate search ranking. In fact, we observed that more than 25 billion pages we discover each day are spammy. That’s a lot of spam, and it goes to show the scale, persistence, and lengths to which spammers are willing to go. We’re very serious about making sure that your chance of encountering spammy pages in Search is as small as possible. Our efforts have helped ensure that more than 99% of visits from our results lead to spam-free experiences.

Updates from last year

In 2018, we reported that we had reduced user-generated spam by 80%, and we’re happy to confirm that this type of abuse did not grow in 2019. Link spam continued to be a popular form of spam, but our team was successful in containing its impact in 2019. More than 90% of link spam was caught by our systems, and techniques such as paid links and link exchanges have been made less effective.

Hacked spam, while still a commonly observed challenge, has been more stable compared to previous years. We continued to work on solutions to better detect and notify affected webmasters and platforms and help them recover from hacked websites.

Spam Trends

One of our top priorities in 2019 was improving our spam fighting capabilities through machine learning systems. Our machine learning solutions, combined with our proven and time-tested manual enforcement capability, have been instrumental in identifying and preventing spammy results from being served to users.

In the last few years, we’ve observed an increase in spammy sites with auto-generated and scraped content with behaviors that annoy or harm searchers, such as fake buttons, overwhelming ads, suspicious redirects and malware. These websites are often deceptive and offer no real value to people. In 2019, we were able to reduce the impact on Search users from this type of spam by more than 60% compared to 2018.

As we improve our capability and efficiency in catching spam, we’re continuously investing in reducing broader types of harm, like scams and fraud. These sites trick people into thinking they’re visiting an official or authoritative site and in many cases, people can end up disclosing sensitive personal information, losing money, or infecting their devices with malware. We have been paying close attention to queries that are prone to scam and fraud and we’ve worked to stay ahead of spam tactics in those spaces to protect users.

Working with webmasters and developers for a better web

Much of our work to fight spam relies on automated systems that detect spammy behavior, but those systems aren’t perfect and can’t catch everything. As someone who uses Search, you can also help us fight spam and other issues by reporting spam on Search, phishing, or malware. We received nearly 230,000 reports of search spam in 2019, and we were able to take action on 82% of the reports we processed. We appreciate all the reports you sent us and your help in keeping search results clean!

So what do we do when we get those reports or identify that something isn’t quite right? An important part of our work is notifying webmasters when we detect something wrong with their website. In 2019, we generated more than 90 million messages to website owners to let them know about issues that may affect their site’s appearance in Search results and about potential improvements they can implement. Of those messages, about 4.3 million were related to manual actions resulting from violations of our Webmaster Guidelines.

And we’re always looking for ways to better help site owners. There were many initiatives in 2019 aimed at improving communications, such as the new Search Console messages, Site Kit for WordPress sites, and auto-DNS verification in the new Search Console. We hope these initiatives have equipped webmasters with more convenient ways to get their sites verified and quicker access to news, and that webmasters will be able to fix webspam or hacking issues more effectively and efficiently.

While we focused deeply on cleaning up spam, we also kept up with the evolution of the web and rethought how we treat “nofollow” links. Originally introduced as a means to help fight comment spam and annotate sponsored links, the “nofollow” attribute has come a long way. But we’re not stopping there: we believe it’s time for it to evolve further, just as our spam-fighting capability has evolved. We introduced two new link attributes, rel="sponsored" and rel="ugc", that give webmasters additional ways to identify the nature of particular links to Google Search. Along with rel="nofollow", we began treating these as hints to incorporate for ranking purposes. We are very excited to see that these new rel attributes were well received and adopted by webmasters around the world!
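As an illustration of the hint model described above, here is a small helper that maps a link's rel attribute to the three hint categories. It is a sketch of how a hypothetical consumer of this markup might classify links, not Google's implementation:

```javascript
// Illustrative sketch: classify a link's rel attribute into the
// three ranking hints. Not Google's actual implementation.
function linkHints(relAttr) {
  const rels = relAttr.toLowerCase().split(/\s+/).filter(Boolean);
  const hints = [];
  if (rels.includes("sponsored")) hints.push("sponsored"); // paid/advertising links
  if (rels.includes("ugc")) hints.push("ugc");             // user-generated content
  if (rels.includes("nofollow")) hints.push("nofollow");   // no endorsement implied
  return hints; // all three are treated as hints, not directives
}

// Multiple values can be combined in a single rel attribute.
console.log(linkHints("sponsored nofollow")); // [ 'sponsored', 'nofollow' ]
console.log(linkHints(""));                   // [] (a normal followed link)
```

In HTML this corresponds to markup such as `<a href="https://example.com" rel="sponsored nofollow">`, where `example.com` is a placeholder.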

Engaging with the community

As always, we’re grateful for all the opportunities we had last year to connect with webmasters around the world, helping them improve their presence in Search and hearing their feedback. We delivered more than 150 online office hours, online events, and offline events in cities across the globe to a wide range of audiences, including SEOs, developers, online marketers, and business owners. Among those events, we have been delighted by the momentum behind our Webmaster Conferences, held in 35 locations across 15 countries and 12 languages, including the first Product Summit version in Mountain View. While we’re not currently able to host in-person events, we look forward to more of these events and virtual touchpoints in the future.

Webmasters continued to find solutions and tips on our Webmasters Help Community with more than 30,000 threads in 2019 in more than a dozen languages. On YouTube, we launched #AskGoogleWebmasters as well as series such as SEO mythbusting to ensure that your questions get answered and your uncertainties get clarified.

We know that our journey to a better web with you is ongoing, and we would love to continue it in the year to come! So keep in touch on Twitter, YouTube, our blog, and the Help Community, or see us in person at one of our conferences near you!

Evaluating page experience for a better web

Both internal studies and industry research show that users prefer sites with a great page experience. In recent years, Search has added a variety of user experience criteria, such as how quickly pages load and mobile-friendliness, as factors for ranking results. Earlier this month, the Chrome team announced Core Web Vitals, a set of metrics related to speed, responsiveness, and visual stability, to help site owners measure user experience on the web.

Today, we’re building on this work and providing an early look at an upcoming Search ranking change that incorporates these page experience metrics. We will introduce a new signal that combines Core Web Vitals with our existing signals for page experience to provide a holistic picture of the quality of a user’s experience on a web page.

As part of this update, we'll also incorporate the page experience metrics into our ranking criteria for the Top Stories feature in Search on mobile, and remove the AMP requirement from Top Stories eligibility. Google continues to support AMP, and will continue to link to AMP pages when available. We’ve also updated our developer tools to help site owners optimize their page experience.

A note on timing: We recognize many site owners are rightfully placing their focus on responding to the effects of COVID-19. The ranking changes described in this post will not happen before next year, and we will provide at least six months’ notice before they’re rolled out. We’re providing the tools now to get you started (and because site owners have consistently requested to know about ranking changes as early as possible), but there is no immediate need to take action.

About page experience

The page experience signal measures aspects of how users perceive the experience of interacting with a web page. Optimizing for these factors makes the web more delightful for users across all web browsers and surfaces, and helps sites evolve towards user expectations on mobile. We believe this will contribute to business success on the web as users grow more engaged and can transact with less friction.

Core Web Vitals are a set of real-world, user-centered metrics that quantify key aspects of the user experience. They measure dimensions of web usability such as load time, interactivity, and the stability of content as it loads (so you don’t accidentally tap that button when it shifts under your finger - how annoying!).
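To make the stability metric concrete, here is a rough sketch of how Cumulative Layout Shift (CLS), the visual-stability metric among the Core Web Vitals, aggregates individual shifts: each unexpected shift scores impact fraction times distance fraction, shifts that closely follow user input are excluded, and the scores are summed. This is an illustration with made-up sample values, not the browser's implementation:

```javascript
// Illustrative sketch of CLS aggregation; sample values are made up.
function layoutShiftScore(impactFraction, distanceFraction) {
  // impactFraction: share of the viewport affected by the shift (0..1)
  // distanceFraction: how far the content moved, relative to the viewport (0..1)
  return impactFraction * distanceFraction;
}

function cumulativeLayoutShift(entries) {
  return entries
    .filter((e) => !e.hadRecentInput) // shifts right after user input don't count
    .reduce((sum, e) => sum + layoutShiftScore(e.impact, e.distance), 0);
}

const cls = cumulativeLayoutShift([
  { impact: 0.5, distance: 0.2, hadRecentInput: false },  // scores 0.10
  { impact: 0.4, distance: 0.5, hadRecentInput: true },   // excluded
  { impact: 0.25, distance: 0.2, hadRecentInput: false }, // scores 0.05
]);
console.log(cls.toFixed(2)); // "0.15"
```

The exclusion of input-driven shifts is what lets the metric target only the unexpected movement that makes you tap the wrong button.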

We're combining the signals derived from Core Web Vitals with our existing Search signals for page experience, including mobile-friendliness, safe-browsing, HTTPS-security, and intrusive interstitial guidelines, to provide a holistic picture of page experience. Because we continue to work on identifying and measuring aspects of page experience, we plan to incorporate more page experience signals on a yearly basis to both further align with evolving user expectations and increase the aspects of user experience that we can measure.

A diagram illustrating the components of Search’s signal for page experience.

Page experience ranking

Great page experiences enable people to get more done and engage more deeply; in contrast, a bad page experience could stand in the way of a person being able to find the valuable information on a page. By adding page experience to the hundreds of signals that Google considers when ranking search results, we aim to help people more easily access the information and web pages they’re looking for, and support site owners in providing an experience users enjoy.

For some developers, understanding how their sites measure on the Core Web Vitals—and addressing noted issues—will require some work. To help out, we’ve updated popular developer tools such as Lighthouse and PageSpeed Insights to surface Core Web Vitals information and recommendations, and Google Search Console provides a dedicated report to help site owners quickly identify opportunities for improvement. We’re also working with external tool developers to bring Core Web Vitals into their offerings.

While all of the components of page experience are important, we will prioritize pages with the best information overall, even if some aspects of page experience are subpar. A good page experience doesn’t override having great, relevant content. However, in cases where there are multiple pages that have similar content, page experience becomes much more important for visibility in Search.

Page experience and the mobile Top Stories feature


The mobile Top Stories feature is a premier fresh content experience in Search that currently emphasizes AMP results, which have been optimized to exhibit a good page experience. Over the past several years, Top Stories has inspired new thinking about the ways we could promote better page experiences across the web.

When we roll out the page experience ranking update, we will also update the eligibility criteria for the Top Stories experience. AMP will no longer be necessary for stories to be featured in Top Stories on mobile; it will be open to any page. Alongside this change, page experience will become a ranking factor in Top Stories, in addition to the many factors assessed. As before, pages must meet the Google News content policies to be eligible. Site owners who currently publish pages as AMP, or with an AMP version, will see no change in behavior – the AMP version will be what’s linked from Top Stories. 

Summary


We believe user engagement will improve as experiences on the web get better -- and that by incorporating these new signals into Search, we'll help make the web better for everyone. We hope that sharing our roadmap for the page experience updates and launching supporting tools ahead of time will help the diverse ecosystem of web creators, developers, and businesses to improve and deliver more delightful user experiences. 

Please stay tuned for our future updates that will communicate more specific guidance on the timing for these changes to come into effect. As always, if you have any questions or feedback, please visit our webmaster forums.

Evaluating page experience for a better web

Through both internal studies and industry research, users show they prefer sites with a great page experience. In recent years, Search has added a variety of user experience criteria, such as how quickly pages load and mobile-friendliness, as factors for ranking results. Earlier this month, the Chrome team announced Core Web Vitals, a set of metrics related to speed, responsiveness and visual stability, to help site owners measure user experience on the web.

Today, we’re building on this work and providing an early look at an upcoming Search ranking change that incorporates these page experience metrics. We will introduce a new signal that combines Core Web Vitals with our existing signals for page experience to provide a holistic picture of the quality of a user’s experience on a web page.

As part of this update, we'll also incorporate the page experience metrics into our ranking criteria for the Top Stories feature in Search on mobile, and remove the AMP requirement from Top Stories eligibility. Google continues to support AMP, and will continue to link to AMP pages when available. We’ve also updated our developer tools to help site owners optimize their page experience.

A note on timing: We recognize many site owners are rightfully placing their focus on responding to the effects of COVID-19. The ranking changes described in this post will not happen before next year, and we will provide at least six months notice before they’re rolled out. We're providing the tools now to get you started (and because site owners have consistently requested to know about ranking changes as early as possible), but there is no immediate need to take action.

About page experience

The page experience signal measures aspects of how users perceive the experience of interacting with a web page. Optimizing for these factors makes the web more delightful for users across all web browsers and surfaces, and helps sites evolve towards user expectations on mobile. We believe this will contribute to business success on the web as users grow more engaged and can transact with less friction.

Core Web Vitals are a set of real-world, user-centered metrics that quantify key aspects of the user experience. They measure dimensions of web usability such as load time, interactivity, and the stability of content as it loads (so you don’t accidentally tap that button when it shifts under your finger - how annoying!).
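
For a concrete sense of these metrics, the Chrome team has published "good" thresholds for each Core Web Vital: Largest Contentful Paint within 2.5 seconds, First Input Delay within 100 milliseconds, and a Cumulative Layout Shift score of 0.1 or less. The sketch below checks measurements against those thresholds; the function and field names are our own illustration, not part of any Google API:

```python
# Illustrative check of field measurements against the published
# "good" thresholds for the three Core Web Vitals. The threshold
# values come from the Chrome team's Core Web Vitals guidance;
# everything else here is an invented sketch.

GOOD_THRESHOLDS = {
    "lcp_s": 2.5,   # Largest Contentful Paint, seconds
    "fid_ms": 100,  # First Input Delay, milliseconds
    "cls": 0.1,     # Cumulative Layout Shift, unitless score
}

def assess_core_web_vitals(measurements: dict) -> dict:
    """Return {metric: True/False} - True when the value is 'good'."""
    return {
        metric: measurements[metric] <= limit
        for metric, limit in GOOD_THRESHOLDS.items()
    }

# Example: a page that loads and responds quickly but shifts
# content around as it loads.
result = assess_core_web_vitals({"lcp_s": 1.9, "fid_ms": 40, "cls": 0.25})
print(result)  # {'lcp_s': True, 'fid_ms': True, 'cls': False}
```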

We're combining the signals derived from Core Web Vitals with our existing Search signals for page experience, including mobile-friendliness, safe-browsing, HTTPS-security, and intrusive interstitial guidelines, to provide a holistic picture of page experience. Because we continue to work on identifying and measuring aspects of page experience, we plan to incorporate more page experience signals on a yearly basis to both further align with evolving user expectations and increase the aspects of user experience that we can measure.
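
Google has not published how these signals are weighted or combined, so purely as an illustration of the "holistic picture" idea, a naive sketch might gate a page on each criterion the post names. Every name below is invented; this is not Google's actual ranking logic:

```python
# Hypothetical sketch only: Google does not disclose how it
# combines signals. This just shows the *kinds* of inputs the
# post describes feeding into one page-experience picture.

def page_experience_summary(mobile_friendly: bool,
                            safe_browsing_ok: bool,
                            served_over_https: bool,
                            no_intrusive_interstitials: bool,
                            core_web_vitals_good: bool) -> bool:
    """True only when every described page-experience criterion is met."""
    return all([mobile_friendly, safe_browsing_ok, served_over_https,
                no_intrusive_interstitials, core_web_vitals_good])

# A page that meets every criterion except Core Web Vitals.
print(page_experience_summary(True, True, True, True, False))  # False
```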

Image: a diagram illustrating the components of Search’s signal for page experience

Page experience ranking

Great page experiences enable people to get more done and engage more deeply; in contrast, a bad page experience could stand in the way of a person being able to find the valuable information on a page. By adding page experience to the hundreds of signals that Google considers when ranking search results, we aim to help people more easily access the information and web pages they’re looking for, and support site owners in providing an experience users enjoy.

For some developers, understanding how their sites measure on the Core Web Vitals—and addressing noted issues—will require some work. To help out, we’ve updated popular developer tools such as Lighthouse and PageSpeed Insights to surface Core Web Vitals information and recommendations, and Google Search Console provides a dedicated report to help site owners quickly identify opportunities for improvement. We’re also working with external tool developers to bring Core Web Vitals into their offerings.
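
One programmatic way to pull this data is the public PageSpeed Insights API, which returns Lighthouse results for a page. The sketch below only builds the request URL (no network call), assuming the public v5 `runPagespeed` endpoint:

```python
# Build a PageSpeed Insights v5 request URL for a page. Fetching
# the URL (e.g. with urllib.request) returns Lighthouse results,
# including Core Web Vitals field data when it is available.
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Return the request URL for a performance audit of page_url."""
    query = urlencode({
        "url": page_url,
        "strategy": strategy,        # "mobile" or "desktop"
        "category": "performance",
    })
    return f"{PSI_ENDPOINT}?{query}"

print(psi_request_url("https://example.com/"))
```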

While all of the components of page experience are important, we will prioritize pages with the best information overall, even if some aspects of page experience are subpar. A good page experience doesn’t override having great, relevant content. However, in cases where there are multiple pages that have similar content, page experience becomes much more important for visibility in Search.

Page experience and the mobile Top Stories feature

The mobile Top Stories feature is a premier fresh content experience in Search that currently emphasizes AMP results, which have been optimized to exhibit a good page experience. Over the past several years, Top Stories has inspired new thinking about the ways we could promote better page experiences across the web.

When we roll out the page experience ranking update, we will also update the eligibility criteria for the Top Stories experience. AMP will no longer be necessary for stories to be featured in Top Stories on mobile; it will be open to any page. Alongside this change, page experience will become a ranking factor in Top Stories, in addition to the many factors assessed. As before, pages must meet the Google News content policies to be eligible. Site owners who currently publish pages as AMP, or with an AMP version, will see no change in behavior – the AMP version will be what’s linked from Top Stories. 

Summary

We believe user engagement will improve as experiences on the web get better -- and that by incorporating these new signals into Search, we'll help make the web better for everyone. We hope that sharing our roadmap for the page experience updates and launching supporting tools ahead of time will help the diverse ecosystem of web creators, developers, and businesses to improve and deliver more delightful user experiences. 

Please stay tuned for our future updates that will communicate more specific guidance on the timing for these changes to come into effect. As always, if you have any questions or feedback, please visit our webmaster forums.
