Category Archives: Google Webmaster Central Blog

Official news on crawling and indexing sites for the Google index

Monitoring structured data with Search Console

In our previous post in the structured data series, we discussed what structured data is and why you should add it to your site. We are committed to structured data and continue to enhance related Search features and improve our tools - that’s why we have been creating solutions to help webmasters and developers implement and diagnose structured data.

This post focuses on what you can do with Search Console to monitor and make the most of structured data for your site. In addition, we have some new features that will help you even more. Below are the new additions; read on to learn more about them.
  1. Unparsable structured data is a new report that aggregates structured data syntax errors.
  2. New enhancement reports for Sitelinks searchbox and Logo.

Monitoring overall structured data performance

Every time Search Console detects a new issue related to structured data on a website, we send an email to account owners - but if an existing issue gets worse, it won’t trigger an email, so it is still important for you to check your account periodically.

This is not something you need to do every day, but we recommend you check it once in a while to make sure everything is working as intended. If your website development has defined cycles, it might be a good practice to log in to Search Console after changes are made to the website to monitor your performance.

If you’d like to get an overall idea of all the errors for a specific structured data feature on your site, you can navigate to the Enhancements menu in the left sidebar and click a feature. You'll find a summary of all errors and warnings, as well as the valid items.

As mentioned above, we added a new set of reports to help you understand more types of structured data on your site: Sitelinks searchbox and Logo. They are joining the existing set of reports on Recipe, Event, Job Posting and others. You can read more about the reports in the Search Console Help Center.

Here's an example of an Enhancement report; note that you can only see enhancements that have been detected on your pages. The report helps you with the following actions:
  • Review the trends of errors, warnings and valid items: To view each status issue separately, click the colored boxes above the bar chart.
  • Review warnings and errors per page: To see examples of pages which are currently affected by the issues, click a specific row below the bar chart.
Image: Enhancements report
We are also happy to launch the Unparsable Structured Data report, which aggregates parsing issues, such as structured data syntax errors, that prevented Google from identifying the feature type. Because the feature type could not be identified, these issues are aggregated here instead of in the intended specific feature report.

Check this report to see if Google was unable to parse any of the structured data you tried to add to your site. Parsing issues could point you to lost opportunities for rich results for your site. Below is a screenshot showing what the report looks like. You can access the report directly and read more about it in our Help Center.
Image: Unparsable Structured Data report

Testing structured data on a URL level

To make sure your pages were processed correctly and are eligible for rich results, or to diagnose why some rich results are not surfacing for a specific URL, you can use the URL Inspection tool. This tool helps you understand areas of improvement at the URL level and gives you an idea of where to focus.

When you paste a URL into the search box at the top of Search Console, you can find what’s working properly, as well as warnings or errors related to your structured data, in the Enhancements section, as seen below for Recipes.
Image: URL Inspection tool
In the screenshot above, there is an error related to Recipes. If you click Recipes, information about the error displays, and you can click the little chart icon to the right of the error to learn more about it.

Once you understand and fix the error, you can click Validate Fix (see screenshot below) so Google can start validating whether the issue is indeed fixed. When you click the Validate Fix button, Google runs several instantaneous tests. If your pages don’t pass these tests, Search Console provides you with an immediate notification. Otherwise, Search Console reprocesses the rest of the affected pages.
Image: Structured data error detail
We would love to hear your feedback on how Search Console has helped you and how it can help you even more with structured data. Send us feedback through Twitter or the Webmaster forum.

Posted by Daniel Waisberg, Search Advocate & Na'ama Zohary, Search Console team 

Enriching Search Results Through Structured Data

For many years we have been recommending the use of structured data on websites to enable a richer search experience. When you add markup to your content, you help search engines understand the different components of a page. When Google's systems understand your page more clearly, Google Search can surface content through the cool features discussed in this post, which can enhance the user experience and get you more traffic.

We've worked hard to provide you with tools to understand how your websites are shown in Google Search results and whether there are issues you can fix. To help give a complete overview of structured data, we decided to create a series of posts exploring it. This post provides a quick intro and discusses some best practices; future posts will focus on how to use Search Console to succeed with structured data.

What is structured data?

Structured data is a common way of providing information about a page and its content - we recommend using the schema.org vocabulary for doing so. Google supports three different formats of in-page markup: JSON-LD (recommended), Microdata, and RDFa. Different search features require different kinds of structured data - you can learn more about these in our search gallery. Our developer documentation has more details on the basics of structured data.
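To make this more concrete, here is a minimal JSON-LD sketch of what such markup can look like; the Article type is just one possibility, and the headline, author, and date below are placeholder values rather than anything from a real site:

  <!-- Example only: headline, author, and date are placeholder values. -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "An example article headline",
    "author": {
      "@type": "Person",
      "name": "Jane Doe"
    },
    "datePublished": "2019-04-08T08:00:00+00:00"
  }
  </script>

A snippet like this typically goes in the page’s HTML; the same information could also be expressed with Microdata or RDFa attributes on existing HTML elements.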

Structured data helps Google's systems understand your content more accurately, which is better for users, as they will get more relevant results. If you implement structured data, your pages may become eligible to be shown with an enhanced appearance in Google search results.

Disclaimer: Google does not guarantee that your structured data will show up in search results, even if your page is marked up correctly. Using structured data enables a feature to be present; it does not guarantee that it will be present. Learn more in our structured data guidelines.

Sites that use structured data see results

Over the years, we've seen a growing adoption of structured data in the ecosystem. In general, rich results help users to better understand how your pages are relevant to their searches, so they translate into success for websites. Here are some results that are showcased in our case studies gallery:
  • Eventbrite leveraged event structured data and saw a 100% increase in the typical YOY growth of traffic from search.
  • Jobrapido integrated with the job experience on Google Search and saw a 115% increase in organic traffic, a 270% increase in new user registrations from organic traffic, and a 15% lower bounce rate for Google visitors to job pages.
  • Rakuten used the recipe search experience and saw a 2.7X increase in traffic from search engines and a 1.5X increase in session duration.

How to use structured data?

There are a few ways your site could benefit from structured data. Below we discuss some examples grouped by different types of goals: increase brand awareness, highlight content, and highlight product information.

1. Increase brand awareness

One thing you can do to promote your brand with structured data is to take advantage of features such as Logo, Local business, and Sitelinks searchbox. In addition to adding structured data, you should verify your site for the Knowledge Panel and claim your business on Google My Business. Here is an example of the knowledge panel with a Logo.
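To sketch what the markup side of this can look like, Logo structured data is expressed as an Organization with a url and a logo property; the domain and file path below are placeholders, and markup alone does not guarantee that a logo appears:

  <!-- Example only: the domain and logo path are placeholder values. -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/images/logo.png"
  }
  </script>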


2. Highlight content

If you publish content on the web, there are a number of features that can help promote your content and attract more users, depending on your industry. For example: Article, Breadcrumb, Event, Job, Q&A, Recipe, Review and others. Here is an example of a recipe rich result.
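As a rough sketch of the markup behind a result like this, a Recipe page might include JSON-LD along these lines; the name, image, times, and ingredients are placeholder values, and real pages would usually include more properties:

  <!-- Example only: all values below are placeholders. -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Banana Bread",
    "image": "https://www.example.com/photos/banana-bread.jpg",
    "author": { "@type": "Person", "name": "Jane Doe" },
    "prepTime": "PT15M",
    "cookTime": "PT1H",
    "recipeIngredient": ["3 ripe bananas", "2 cups flour", "1 cup sugar"],
    "recipeInstructions": [
      { "@type": "HowToStep", "text": "Mash the bananas and mix them with the dry ingredients." },
      { "@type": "HowToStep", "text": "Bake for about an hour." }
    ]
  }
  </script>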



3. Highlight product information

If you sell merchandise, you could add product structured data to your page, including price, availability, and review ratings. Here is how your product might show for a relevant search.
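For illustration, Product markup in JSON-LD might look like the following sketch; the product name, price, availability, and ratings are placeholder values:

  <!-- Example only: product name, price, and ratings are placeholder values. -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Wireless Headphones",
    "image": "https://www.example.com/photos/headphones.jpg",
    "description": "Over-ear wireless headphones with a 30-hour battery life.",
    "offers": {
      "@type": "Offer",
      "price": "79.99",
      "priceCurrency": "USD",
      "availability": "https://schema.org/InStock"
    },
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.4",
      "reviewCount": "89"
    }
  }
  </script>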


Try it and let us know

Now that you understand the importance of structured data, try our codelab to learn how to add it to your pages. Stay tuned to learn more about structured data: in the coming posts, we’ll discuss how to use Search Console to better analyze your efforts.

We would love to hear your thoughts and stories on how structured data works for you. Send us any feedback either through Twitter or our forum.

Posted by Daniel Waisberg, Search Advocate

Instant-loading AMP pages from your own domain

Today we are rolling out support in Google Search’s AMP web results (also known as “blue links”) to link to signed exchanges, an emerging feature of the web enabled by the IETF web packaging specification. Signed exchanges enable displaying the publisher’s domain when content is instantly loaded via Google Search. This is available in browsers that support the necessary web platform feature—as of the time of writing, Google Chrome—and availability will expand to include other browsers as they gain support (e.g. the upcoming version of Microsoft Edge).

Background on AMP’s instant loading

One of AMP's biggest user benefits has been the unique ability to instantly load AMP web pages that users click on in Google Search. Near-instant loading works by requesting content ahead of time, balancing the likelihood of a user clicking on a result with device and network constraints–and doing it in a privacy-sensitive way.

We believe that privacy-preserving instant loading of web content is a transformative user experience, but in order to accomplish this, we had to make trade-offs; namely, the URLs displayed in browser address bars begin with google.com/amp, as a consequence of being shown in the Google AMP Viewer, rather than displaying the domain of the publisher. We heard both user and publisher feedback about this, and last year we identified a web platform innovation that shows the content’s original URL while still retaining AMP's instant loading.

Introducing signed exchanges

A signed exchange is a file format, defined in the web packaging specification, that allows the browser to trust a document as if it belongs to your origin. This allows you to use first-party cookies and storage to customize content and simplify analytics integration. Your page appears under your URL instead of the google.com/amp URL.

Google Search links to signed exchanges when the publisher, the browser, and the Search experience context all support it. As a publisher, you will need to publish the signed exchange version of the content in addition to the non-signed exchange version. Learn more about how Google Search supports signed exchanges.

Getting started with signed exchanges

Many publishers have already begun to publish signed exchanges since the developer preview opened up last fall. To implement signed exchanges in your own serving infrastructure, follow the guide “Serve AMP using Signed Exchanges” available at amp.dev.

If you use a CDN provider, ask them if they can provide AMP signed exchanges. Cloudflare has recently announced that it is offering signed exchanges to all of its customers free of charge.

Check out our resources like the webmaster community or get in touch with members of the AMP Project with any questions. You can also provide feedback on the signed exchange specification.


Search Console reporting for your site’s Discover performance data

Discover is a popular way for users to stay up-to-date on all their favorite topics, even when they’re not searching. To provide publishers and sites visibility into their Discover traffic, we're adding a new report in Google Search Console to share relevant statistics and help answer questions such as:

  • How often is my site shown in users' Discover? How large is my traffic?
  • Which pieces of content perform well in Discover?
  • How does my content perform differently in Discover compared to traditional search results?

A quick reminder: What is Discover?

Discover is a feature within Google Search that helps users stay up-to-date on all their favorite topics, without needing a query. Users get to their Discover experience in the Google app, on the Google.com mobile homepage, and by swiping right from the homescreen on Pixel phones. It has grown significantly since launching in 2017 and now helps more than 800M monthly active users get inspired and explore new information by surfacing articles, videos, and other content on topics they care most about. Users have the ability to follow topics directly or let Google know if they’d like to see more or less of a specific topic. In addition, Discover isn’t limited to what’s new. It surfaces the best of the web regardless of publication date, from recipes and human interest stories, to fashion videos and more. Here is our guide on how you can optimize your site for Discover.

Discover in Search Console

The new Discover report is shown to websites that have accumulated meaningful visibility in Discover, with data going back to March 2019. We hope this report is helpful in thinking about how you might optimize your content strategy to help users discover engaging information - both new and evergreen.

For questions or comments on the report, feel free to drop by our webmaster help forums, or contact us through our other channels.

User experience improvements with page speed in mobile search

To help users find the answers to their questions faster, we included page speed as a ranking factor for mobile searches in 2018. Since then, we've observed improvements on many pages across the web. We want to recognize the performance improvements webmasters have made over the past year. A few highlights:

  • For the slowest one-third of traffic, we saw user-centric performance metrics improve by 15% to 20% in 2018. As a comparison, no improvement was seen in 2017.
  • We observed improvements across the whole web ecosystem. On a per country basis, more than 95% of countries had improved speeds.
  • When a page is slow to load, users are more likely to abandon the navigation. Thanks to these speed improvements, we've observed a 20% reduction in abandonment rate for navigations initiated from Search, a metric that site owners can now also measure via the Network Error Logging API available in Chrome.
  • In 2018, developers ran over a billion PageSpeed Insights audits to identify performance optimization opportunities for over 200 million unique URLs.

Great work and thank you! We encourage all webmasters to optimize their sites’ user experience. If you're unsure how your pages are performing, the following tools and documents can be useful:

  1. PageSpeed Insights provides page analysis and optimization recommendations.
  2. The Google Chrome User Experience Report provides user experience metrics for how real-world Chrome users experience popular destinations on the web.
  3. Performance documentation on Web Fundamentals.

For any questions, feel free to drop by our help forums (like the webmaster community) to chat with other experts.


How to discover & suggest Google-selected canonical URLs for your pages

Sometimes a web page can be reached by using more than one URL. In such cases, Google tries to determine the best URL to display in search and to use in other ways. We call this the “canonical URL.” There are ways site owners can help us better determine what should be the canonical URLs for their content.

If you suspect we’ve not selected the best canonical URL for your content, you can check by entering your page’s address into the URL Inspection tool within Search Console. It will show you the Google-selected canonical. If you believe there’s a better canonical that should be used, follow the steps on our duplicate URLs help page on how to suggest a preferred choice for consideration.
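One common way to suggest a preferred canonical is a rel="canonical" link element in the head of the duplicate pages; the URLs below are hypothetical:

  <!-- Hypothetical example: served on a parameterized URL such as
       https://example.com/dresses/green?sessionid=123 -->
  <link rel="canonical" href="https://example.com/dresses/green">

Keep in mind that this is a suggestion rather than a directive; Google weighs it together with other signals, such as redirects and sitemap entries, when selecting the canonical.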

Please be aware that if you search using the site: or inurl: commands, you will be shown the domain you specified in them, even if it isn’t the Google-selected canonical. This happens because we’re fulfilling the exact request entered. Behind the scenes, we still use the Google-selected canonical, including when people see pages without using the site: or inurl: commands.

We’ve also changed the URL Inspection tool so that it displays the Google-selected canonical for any URL, not just for properties you manage in Search Console. With this change, we’re also retiring the info: command, which was an alternative way of discovering canonicals. It was relatively underused, and the URL Inspection tool provides a more comprehensive solution to help publishers with URLs.


This year in Search Spam – Webspam report 2018

Google aims to provide the highest quality results for any search. As part of this, we take action to prevent what we call “webspam” - content and behaviors that violate our webmaster guidelines - from degrading the search experience. Our efforts help ensure that well under 1 percent of results visited by users are for spammy pages. Here’s more about how we fought webspam in 2018.

Google webspam trends and how we fought webspam in 2018

Of the types of spam we fought in 2018, three continue to stand out:


Spam on hacked websites: We reported in 2017 that we had seen a substantial reduction of spam from hacked websites in search results. This trend continued in 2018, with faster discovery of hacked web pages before they affect search results or put someone in harm’s way. While we have reduced how spam on hacked sites affects search, hacked websites remain a major security problem affecting the safety of the web. Even though we can’t prevent a website from being hacked, we’re committed to helping webmasters whose websites have been compromised by offering resources to help them recover.


User-generated spam: This type of spam has been a continued focus for us. It includes spammy posts on forums, as well as spammy accounts on free blogs and platforms - content that isn’t meant to be consumed by human beings and that disrupts conversations while adding no value to users. In 2018, we were able to reduce the impact of this type of spam on search users by more than 80%. While we can’t prevent websites from being exploited, we do want to make it easier for website owners to learn how to protect themselves, which is why we provide resources on how to prevent abuse of your site’s public areas.


Link spam: We continued to protect the value of authoritative and relevant links as an important ranking signal for Search. We dealt swiftly with egregious link spam and made a number of bad linking practices less effective for manipulating ranking. Above all, we continued to engage with webmasters and SEOs to chip away at the many myths that have emerged over the years around linking practices. We kept reminding website owners that if you stay away from building links mainly as an attempt to rank better and focus on creating great content, you should not have to worry about any of those myths or realities. We think one of the best ways to fight spam of all types is to encourage website owners to create great, high-quality content. Resources such as the SEO starter guide highlight best practices and bust some common myths and misconceptions about what it takes to appear well in Google Search results. Reporting link spam is also a great way to assist us in fighting this type of abuse and to help preserve fairness in Search ranking.



Working with users, webmasters and developers for a better web

Everyday users continue to help us find spam, malware and other issues in Search that escape our filters and processes by reporting spam on search, reporting phishing or reporting malware. We received over 180,000 search spam user reports and we were able to take action on 64% of the reports we processed. These reports truly make a difference and we’d like to thank all of you who submitted them.


We think it’s important to let website owners know when we detect something wrong with their website. In 2018, we generated over 186 million messages to website owners calling out potential improvements, issues and problems that could affect their site’s appearance in Search results. We can only deliver these notifications to site owners who have verified their sites in Search Console, and we successfully delivered 96 million of those messages. The rest of the messages are kept linked to the website for as long as they are relevant, so they can be seen when a webmaster successfully registers the site in Search Console. The majority of these messages welcomed new users to Search Console, and the second largest group informed registered Search Console users when Mobile-First Indexing became available. Of all messages, slightly over 2%—about 4 million—were related to manual actions resulting from violations of our Webmaster Guidelines.


High-quality content keeps spam out of search results, and we continued to improve the tools and reports we offer to the webmasters who create that content. Search Console was rebuilt from the ground up to provide both new and improved reports (Performance, Index Coverage, Links, and Mobile Usability) as well as brand new features (the URL Inspection tool and site and user management). The improved Search Console graduated out of beta in 2018 and is now generally available to all registered website owners.


We didn’t forget the front-end developers who make the modern web work, and we focused on helping them make their sites great for users and search-friendly, regardless of whether they use a CMS, roll their own CSS and JS, or build on top of a web framework. With the new SEO audit capability in Lighthouse, the open-source automated auditing tool for improving the quality of web pages, developers and webmasters can now run actionable SEO health checks on their pages and quickly identify areas for improvement.


We also engage directly with website owners to provide help with thorny issues. Our dedicated team members meet with webmasters around the world regularly, both online and in person. We delivered more than 190 online office hours, online events and offline events in more than 76 cities, to audiences totaling over 170,000 SEOs, developers and online marketers. We hosted four search events in Tokyo, Singapore, Zurich and Osaka, as well as an 11-city Search Conference in India. In 2018, we added live office hours in Spanish to our existing sessions in English, French, German, Hindi and Japanese; webmasters can find help, tips and useful discussion on our Google Webmaster YouTube channel. Product experts continued to help webmasters find solutions through our official support forums in over a dozen languages.


We look forward to continuing our work to deliver a spam-free Search experience to all in 2019!


Posted by Juan Felipe Rincón, Webmaster Outreach, Dublin

Help Google Search know the best date for your web page

Sometimes, Google shows dates next to listings in its search results. In this post, we’ll answer some commonly-asked questions webmasters have about how these dates are determined and provide some best practices to help improve their accuracy.

How dates are determined

Google shows the date of a page when its automated systems determine that it would be relevant to do so, such as for pages that can be time-sensitive, including news content:

Google determines a date using a variety of factors, including but not limited to: any prominent date listed on the page itself or dates provided by the publisher through structured markup.

Google doesn’t depend on one single factor because all of them can be prone to issues. Publishers may not always provide a clear visible date. Sometimes, structured data may be lacking or may not be adjusted to the correct time zone. That’s why our systems look at several factors to come up with what we consider to be our best estimate of when a page was published or significantly updated.

How to specify a date on a page

To help Google pick the right date, site owners and publishers should:

  • Show a clear date: Show a visible date prominently on the page.
  • Use structured data: Use the datePublished and dateModified properties, with the correct time zone designator, for AMP or non-AMP pages. When using structured data, make sure to use the ISO 8601 format for dates (see the sketch below this list).
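For example, a minimal JSON-LD sketch with both properties might look like this; the headline and timestamps are placeholder values, written in ISO 8601 with an explicit time zone offset:

  <!-- Example only: headline and timestamps are placeholder values. -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "An example article headline",
    "datePublished": "2019-03-26T08:00:00+08:00",
    "dateModified": "2019-03-26T09:20:00+08:00"
  }
  </script>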

Guidelines specific to Google News

Google News requires clearly showing both the date and the time that content was published or updated. Structured data alone is not enough, though we recommend using it in addition to a visible date and time. The date and time should be positioned between the headline and the article text. For more guidance, also see our help page about article dates.

If an article has been substantially changed, it can make sense to give it a fresh date and time. However, don't artificially freshen a story without adding significant information or having some other compelling reason to do so. Also, do not create a very slightly updated story from one previously published, then delete the old story and redirect to the new one. That's against our article URLs guidelines.

More best practices for dates on web pages

In addition to the most important requirements listed above, here are additional best practices to help Google determine the best date to consider showing for a web page:

  • Show when a page has been updated: If you update a page significantly, also update the visible date (and time, if you display that). If desired, you can show two dates: when a page was originally published and when it was updated. Just do so in a way that’s visually clear to your readers. If showing both dates, it’s also highly recommended to use datePublished and dateModified for AMP or non-AMP pages to make it easier for algorithms to recognize the dates.
  • Use the right time zone: If specifying a time, make sure to provide the correct time zone, taking into account daylight saving time as appropriate.
  • Be consistent in usage: Within a page, make sure to use exactly the same date (and, potentially, time) in structured data as well as in the visible part of the page. Make sure to use the same time zone if you specify one on the page.
  • Don’t use future dates or dates related to what a page is about: Always use a date for when the page itself was published or updated, not a date linked to something like an event that the page describes, especially for events or other subjects that happen in the future (you may use Event markup separately, if appropriate).
  • Follow Google's structured data guidelines: While Google doesn't guarantee that a date (or structured data in general) specified on a page will be used, following our structured data guidelines does help our algorithms to have it available in a machine-readable way.
  • Troubleshoot by minimizing other dates on the page: If you’ve followed the best practices above and find incorrect dates are being selected, consider if you can remove or minimize other dates that may appear on the page, such as those that might be next to related stories.

We hope these guidelines help to make it easier to specify the right date on your website's pages! For questions or comments on this, or other structured data topics, feel free to drop by our webmaster help forums.


Introducing a new JavaScript SEO video series

We made a new video series on JavaScript SEO that benefits both web developers and SEOs. In this series, we want to help you make web apps built with JavaScript discoverable in Google Search.


JavaScript is popular because it allows developers to build more engaging web applications. JavaScript frameworks are widely used as they:

  • Improve developer productivity by providing useful utilities and tooling
  • Allow faster prototyping cycles thanks to their ecosystems of components and libraries
  • Help structure the code, even in larger application codebases
JavaScript also brings a few new considerations and challenges to SEO. Some of the considerations are strategic and some are more technical. In the video series, we'll cover:
  • The difference between classic and JavaScript sites
  • How Google Search crawls, renders, and indexes JavaScript content
  • SEO fundamentals for React, Angular, and Vue
  • Tools to test and debug a JavaScript site
  • What dynamic rendering is and how to set it up with Rendertron
Check out the JavaScript SEO YouTube playlist and subscribe to the Google Webmasters channel to get the weekly episodes when they go online. We are looking forward to your feedback and are all ears for your input on further episodes. You can reach us through the Webmaster Forum, the Google Webmasters Twitter account or in the YouTube comments under the videos.

Announcing domain-wide data in Search Console

Google recommends verifying all versions of a website -- http, https, www, and non-www -- in order to get the most comprehensive view of your site in Google Search Console. Unfortunately, many separate listings can make it hard for webmasters to understand the full picture of how Google “sees” their domain as a whole. To make this easier, today we're announcing "domain properties" in Search Console, a way of verifying and seeing the data from Google Search for a whole domain.

Domain properties show data for all URLs under the domain name, including all protocols, subdomains, and paths. They give you a complete view of your website across Search Console, reducing the need to manually combine data. So regardless of whether you use m-dot URLs for mobile pages, or are (finally) getting the migration to HTTPS set up, Search Console will be able to help with a complete view of your site's data with regards to how Google Search sees it.

If you already have DNS verification set up, Search Console will automatically create new domain properties for you over the next few weeks, with data across all reports. Otherwise, to add a new domain property, go to the property selector, add a new domain property, and use DNS verification. We recommend using domain properties where possible going forward.

Domain properties were built based on your feedback; thank you again for everything you've sent our way over the years! We hope this makes it easier to manage your site and to get a complete overview without having to manually combine data. Should you have any questions, feel free to drop by our help forums, or leave us a comment on Twitter. And as always, you can use the feedback feature built into Search Console.