Tag Archives: feedback and communication

Google Search News: coming soon to a screen near you

The world of search is constantly evolving: new tools, opportunities, and features arrive regularly; existing things change; and sometimes we say goodbye to old ones to make way for the new. To help you stay on top of it all, we've started a new YouTube series called Google Search News.

With Google Search News, we want to give you a short, regular summary of what's been happening around Google Search, specifically for SEOs, publishers, developers, and webmasters. The first episode is out now, so check it out.

(The first episode, now on your screen)

In this first episode, we cover:

We plan to make these updates regularly, and adjust the format over time as needed. Let us know what you think in the video comments!

When indexing goes wrong: how Google Search recovered from indexing issues & lessons learned since.

Most of the time, our search engine runs properly. Our teams work hard to prevent technical issues that could affect our users who are searching the web, or webmasters whose sites we index and serve to users. Similarly, the underlying systems that we use to power the search engine also run as intended most of the time. When small disruptions happen, they are largely not visible to anyone except our teams who ensure that our products are up and running. However, like all complex systems, sometimes larger outages can occur, which may lead to disruptions for both users and website creators.

In the last few months, such a situation occurred with our indexing systems, which had a ripple effect on some other parts of our infrastructure. While we worked as quickly as possible to remedy the situation, we apologize for the disruption, as our goal is to continuously provide high-quality products to our users and to the web ecosystem.

Since then, we have taken a closer, careful look at the situation. In the process, we learned a few lessons that we'd like to share with you today. In this blog post, we go into more detail about what happened, clarify how we plan to communicate better if such things happen in the future, and remind website owners of the channels they can use to communicate with us.

So, what happened a few months ago?

In April, we had several issues related to our index. The Search index is the database that holds the hundreds of billions of web pages that we crawled on the web and that we think could answer some of our users’ queries. When a user enters a query in the Google search engine, our ranking algorithms sort through those pages in our Search index to find the most relevant, useful results in a fraction of a second. Here is more information on what happened.

1. The indexing issue

To start it off, we temporarily lost part of the Search index.
Wait... What? What do you mean “lost part of the index?” Is that even possible?

Basically, to keep the service fast when serving search results to users, a user's query only “travels” as far as the closest of our data centers supporting the Google Search product, from which the Search Engine Results Page (SERP) is generated. So when there are modifications to the composition of the index (pages are added or removed, documents are merged, or other data is modified), those modifications need to be reflected in all of those data centers, so that users all over the world are consistently served pages from the most recent version of the index.


Google owns and operates data centers (like the one pictured above) around the world, to keep our products running 24 hours a day, 7 days a week - source

Keeping the index unified across all those data centers is a non-trivial task. For large user-facing services, we may deploy updates by starting in one data center and expanding until all relevant data centers are updated. For sensitive pieces of infrastructure, we may extend a rollout over several days, interleaving them across instances in different geographic regions. source
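
To make that strategy concrete, here is a minimal sketch of a staged rollout with health checks and automatic rollback. This is an illustration only, not our actual deployment tooling, and every name in it (REGIONS, push_index_update, is_healthy) is invented:

```python
# Hypothetical illustration of a staged index rollout: start with one
# data center, verify health, then expand region by region. Every name
# here is invented for the sketch; this is not Google's actual tooling.

REGIONS = {
    "us-west": ["dc-uw-1", "dc-uw-2"],
    "europe":  ["dc-eu-1", "dc-eu-2"],
    "asia":    ["dc-as-1", "dc-as-2"],
}

def push_index_update(dc: str, version: str) -> None:
    """Deploy the new index version to a single data center (stubbed)."""
    print(f"deploying {version} to {dc}")

def is_healthy(dc: str) -> bool:
    """Check serving metrics after the push (stubbed as always healthy)."""
    return True

def revert(dcs: list, stable_version: str) -> None:
    """Restore every touched data center to the last stable snapshot."""
    for dc in dcs:
        print(f"reverting {dc} to {stable_version}")

def staged_rollout(version: str, stable_version: str) -> None:
    updated = []
    for region, dcs in REGIONS.items():  # one region at a time
        for dc in dcs:
            push_index_update(dc, version)
            updated.append(dc)
            # (soak time and monitoring would happen here)
            if not is_healthy(dc):
                # Any failure aborts the rollout and restores the
                # previous stable index everywhere it was touched.
                revert(updated, stable_version)
                return
    print("rollout complete")

staged_rollout("index-2019-04-05", "index-2019-04-04")
```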

So, as we pushed some planned changes to the Search index on April 5th, parts of the deployment system broke, on a Friday no less! More specifically: as we were updating the index across some of our data centers, a small number of documents ended up being accidentally dropped from the index. Hence: “we lost part of the index.”

Luckily, our on-call engineers caught the issue pretty quickly, at the same time as we started picking up chatter on social media (thanks to everyone who notified us over that weekend!). As a result, we were able to start reverting the Search index to its previous stable state in all data centers only a few hours after the issue was uncovered (we keep backups of our indexes just in case such events happen).

We communicated on Sunday, April 7th that we were aware of the issue, and that things were starting to get back to normal. As data centers were progressively reverting back to a stable index, we continued updating on Twitter (on April 8th, on April 9th), until we were confident that all data centers were fully back to a complete version of the index on April 11th.

2. The Search Console issue

Search Console is the set of tools and reports any webmaster can use to access data about their website’s performance in Search. For example, it shows how many impressions and clicks a website gets in the organic search results every day, or information on what pages of a website are included and excluded from the Search index.

As a consequence of the Search index issues we described above, Search Console also started to show inconsistencies. This is because some of the data that surfaces in Search Console originates from the Search index itself:

  • the Index Coverage report depends on the Search index being consistent across data centers.
  • when we store a page in the Search index, we can annotate the entry with key signals about the page, such as the presence of rich results markup. Therefore, an issue with the Search index can have an impact on the Rich Results reports in Search Console.

Basically, many individual Search Console reports read data from a dedicated database. That database is partially built using information that comes from the Search index. As we had to revert back to a previous version of the Search index, we also had to pause the updating of the Search Console database. This resulted in plateauing data for some reports (and flakiness in others, like the URL Inspection tool).
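
To illustrate that dependency (as a simplified sketch with invented names, not our actual pipeline), a report database refresh could be gated on index stability like this:

```python
# Hypothetical sketch of the dependency described above: Search Console
# reports read from a database that is partially rebuilt from the Search
# index, so a refresh only makes sense while the index is stable. All
# names are invented for illustration.

def index_is_stable() -> bool:
    """Stub: True once all data centers serve the same index version."""
    return False  # pretend we are mid-revert, as in April 2019

def rebuild_report_database() -> None:
    print("refreshing Index Coverage / Rich Results data")

def scheduled_refresh() -> None:
    if not index_is_stable():
        # Pausing the rebuild avoids publishing misleading report data,
        # at the cost of the plateauing charts described above.
        print("index reverting; skipping this refresh cycle")
        return
    rebuild_report_database()

scheduled_refresh()
```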


Index coverage report for indexed pages, showing an example of the data freshness issues in Search Console in April 2019, with a longer time between two updates than is usually observed.

Because the whole Search index issue took several days to roll back (see explanation above), we could only turn our focus to fixing the Search Console database once the indexing issues were resolved. We communicated on April 15th (tweet) that Search Console was having troubles and that we were working on fixing it, and we completed our fixes on April 28th (the day the reports started gathering fresh data again; see graph above). We communicated on Twitter on April 30th that the issue was resolved (tweet).

3. Other issues unrelated to the main indexing bug

Google Search relies on a number of systems that work together. While some of those systems can be tightly linked to one another, in some cases different parts of the system experience unrelated problems around the same time.

In this case, for example, around the same time as the main indexing bug explained above, we also had brief problems gathering fresh Google News content. Additionally, while rendering pages, certain URLs started to redirect Googlebot to other, unrelated pages. These issues were entirely unrelated to the indexing bug, and were quickly resolved (tweet 1 & tweet 2).

Our communication, and how we intend to do better

In addition to communicating on social media (as highlighted above) during those few weeks, we also gave webmasters more details in two other channels: Search Console and the Search Console Help Center.

In the Search Console Help Center

We updated our “Data anomalies in Search Console” help page after the issue was fully identified. This page is used to communicate information about data disruptions to our Search Console service when the impact affects a large number of website owners.

In Search Console

Because we know that not all our users read social media or the external Help Center page, we also added annotations on Search Console reports, to notify users that the data might not be accurate (see image below). We added this information after the resolution of the bugs. Clicking on “see here for more details” sends users to the “Data Anomalies” page in the Help Center.


Index coverage report for indexed pages, which shows an example of the data annotations that we can include to notify users of specific issues.

Communications going forward

When things break at Google, we have a strong “postmortem” culture: we create a document to debrief on the breakage, and to try to prevent it from happening again. The whole process is described in more detail on the Google Site Reliability Engineering website.

In the wake of the April indexing issues, we included in the postmortem how to better communicate with webmasters in case of large system failures. Our key decisions were:

  1. Explore ways to more quickly share information within Search Console itself about widespread bugs, and have that information serve as the main point of reference for webmasters to check, in case they are suspecting outages.
  2. More promptly post to the Search Console data anomalies page, when relevant (if the disturbance is going to be seen over the long term in Search Console data).
  3. Continue tweeting as quickly as we can about such issues, to reassure webmasters that we’re aware and that the issue is on our end.

Those commitments should make potential future similar situations more transparent for webmasters as a whole.

Putting our resolutions into action: the “new URLs not indexed” case study

On May 22nd, we tested our new communications strategy, as we experienced another issue. Here’s what happened: while processing certain URLs, our duplicate management system ran out of memory after a planned infrastructure upgrade, which caused all incoming URLs to stop processing.
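
The summary above is all the detail we shared about the failed system, so purely as an illustration with invented names: one common way a deduplication stage avoids exhausting memory is to bound its in-memory “seen” set and evict old entries, rather than stall the whole pipeline:

```python
from collections import OrderedDict

# Illustrative only: the post doesn't detail the failed system, but a
# common way a deduplication stage avoids unbounded memory growth is a
# capped, LRU-style "seen" set that evicts old entries instead of
# exhausting RAM and halting the pipeline.

class BoundedDeduper:
    def __init__(self, max_entries: int) -> None:
        self.max_entries = max_entries
        self.seen = OrderedDict()

    def is_duplicate(self, url: str) -> bool:
        if url in self.seen:
            self.seen.move_to_end(url)     # refresh recency
            return True
        self.seen[url] = None
        if len(self.seen) > self.max_entries:
            self.seen.popitem(last=False)  # evict the oldest entry
        return False

deduper = BoundedDeduper(max_entries=2)
for u in ["/a", "/b", "/a", "/c", "/a"]:
    print(u, "duplicate" if deduper.is_duplicate(u) else "new")
```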

Here is a timeline of how we thought about communications, following the 3 points highlighted just above:

  1. We noticed the issue (around 5.30am California time, May 22nd)
    We tweeted about the ongoing issue (around 6.40am California time, May 22nd)
    We tweeted about the resolution (around 10pm California time, May 22nd)
  2. We evaluated updating the “Data Anomalies” page in the Help Center, but decided against it, since we did not expect any long-term impact on most webmasters' Search Console data.
  3. The confusion that this issue created for many webmasters confirmed our earlier conclusion that we need a way to signal more clearly, in Search Console itself, that one of our systems might be experiencing a disruption that could impact webmasters. Such a solution might take longer to implement; we will communicate on this topic as we have more news.

Last week, we also had another indexing issue. As on May 22nd, we tweeted to let people know there was an issue, that we were working to fix it, and when the issue was resolved.

How to debug and communicate with us

We hope that this post brings more clarity to how our systems are complex and can sometimes break, and that it helps you understand how we communicate about these matters. While this post focuses on a widespread breakage of our systems, it’s important to keep in mind that most website indexing issues are caused by an individual website’s configuration, which can make it difficult for Google Search to index that website properly. For those cases, all webmasters can debug issues using Search Console and our Help Center. After doing so, if you still think that an issue is not coming from your site, or you don’t know how to resolve it, come talk to us and our community; we always welcome feedback from our users. Here is how to signal an issue to us:

  • Check our Webmaster Community, sometimes other webmasters have highlighted an issue that also impacts your site.
  • In person! We love meeting webmasters; come and talk to us at events. Calendar.
  • Within our products! The Search Console feedback tool is very useful to our teams.
  • Twitter and YouTube!

This year in Search Spam – Webspam report 2018

Google aims to provide the highest quality results for any search. As part of this, we take action to prevent what we call “webspam” (content and behaviors that violate our webmaster guidelines) from degrading the search experience. Our efforts help ensure that well under 1 percent of the results users visit are spammy pages. Here’s more about how we fought webspam in 2018.



Google webspam trends and how we fought webspam in 2018



Of the types of spam we fought in 2018, three continue to stand out:


Spam on hacked websites: We reported in 2017 that we had seen a substantial reduction of spam from hacked websites in search results. This trend continued in 2018, with faster discovery of hacked web pages before they affect search results or put someone in harm’s way. While we reduced how spam on hacked sites affects search, hacked websites remain a major security problem affecting the safety of the web. Even though we can’t prevent a website hack from happening, we’re committed to supporting webmasters whose websites have been compromised by offering resources for recovering from a hacked website.


User-generated spam: User-generated spam has been a continued focus for us. It includes spammy posts on forums, as well as spammy accounts on free blogs and platforms, none of which are meant to be consumed by human beings, and all of which disrupt conversations while adding no value to users. In 2018, we were able to reduce the impact of this type of spam on search users by more than 80%. While we can’t prevent websites from being exploited, we do want to make it easier for website owners to learn how to protect themselves, which is why we provide resources on how to prevent abuse of your site’s public areas.


Link spam: We continued to protect the value of authoritative and relevant links as an important ranking signal for Search. We dealt swiftly with egregious link spam, and made a number of bad linking practices less effective for manipulating rankings. Above all, we continued to engage with webmasters and SEOs to chip away at the many myths that have emerged over the years around linking practices. We kept reminding website owners that if you stay away from building links mainly as an attempt to rank better, and instead focus on creating great content, you should not have to worry about any of those myths or realities. We think one of the best ways of fighting spam of all types is to encourage website owners to simply create great, high-quality content. Resources such as the SEO starter guide highlight best practices and bust common myths and misconceptions about what it takes to appear well in Google Search results. Reporting link spam is also a great way to assist us in fighting this type of abuse and to help preserve fairness in Search ranking.



Working with users, webmasters and developers for a better web

Every day, users continue to help us find spam, malware and other issues in Search that escape our filters and processes by reporting spam on Search, reporting phishing, or reporting malware. We received over 180,000 search spam user reports and were able to take action on 64% of the reports we processed. These reports truly make a difference, and we’d like to thank all of you who submitted them.


We think it’s important to let website owners know when we detect something wrong with their website. In 2018, we generated over 186 million messages to website owners calling out potential improvements, issues and problems that could affect their site’s appearance in Search results. We can only deliver these notifications to site owners who have verified their sites in Search Console, and we successfully delivered 96 million of those messages. The rest of the messages are kept linked to the website for as long as they are relevant, so they can be seen when a webmaster successfully registers the site in Search Console. The majority of these messages welcomed new users to Search Console, and the second largest group informed registered Search Console users when Mobile-First Indexing became available. Of all messages, slightly over 2%—about 4 million—were related to manual actions resulting from violations of our Webmaster Guidelines.


High-quality content keeps spam out of search results, and we continued to improve the tools and reports we offer to the webmasters who create that content. Search Console was completely rebuilt from the ground up to provide both new and improved reports (Performance, Index Coverage, Links, and Mobile Usability), as well as brand new features (the URL Inspection Tool and Site and User management). This improved Search Console graduated out of beta in 2018 and is now generally available to all registered website owners.


We didn’t forget the front-end developers who make the modern web work, and focused on helping them make their sites great for users and search-friendly, whether they are on a CMS, roll their own CSS and JS, or build on top of a web framework. With the new SEO audit capability in Lighthouse, the open-source automated auditing tool for improving the quality of web pages, developers and webmasters can now run actionable SEO health checks on their pages and quickly identify areas for improvement.


We also engage directly with website owners to provide help with thorny issues. Our dedicated team members meet with webmasters around the world regularly, both online and in person. We delivered more than 190 online office hours, online events and offline events in more than 76 cities, to audiences totaling over 170,000 SEOs, developers and online marketers. We hosted four search events in Tokyo, Singapore, Zurich and Osaka, as well as an 11-city Search Conference in India. In 2018, we started live office hours in Spanish, in addition to English, French, German, Hindi and Japanese, where webmasters can find help, tips and useful discussion on our Google Webmaster YouTube channel. Product experts continued to help webmasters find solutions through our official support forums in over a dozen languages.


We look forward to continuing our work to deliver a spam-free Search experience to all in 2019!


Posted by Juan Felipe Rincón, Webmaster Outreach, Dublin

How we fought webspam – Webspam Report 2017





We always want to make sure that when you use Google Search to find information, you get the highest quality results. But, we are aware of many bad actors who are trying to manipulate search ranking and profit from it, which is at odds with our core mission: to organize the world's information and make it universally accessible and useful. Over the years, we've devoted a huge effort toward combating abuse and spam on Search. Here's a look at how we fought abuse in 2017.


We call these various types of abuse that violate the webmaster guidelines “spam.” Our evaluation indicates that for many years, less than 1 percent of the search results users visited have been spammy. In the last couple of years, we’ve managed to further reduce this by half.



Google webspam trends and how we fought webspam in 2017



As we continued to improve, spammers also evolved. One of the trends in 2017 was an increase in website hacking—both for spamming search ranking and for spreading malware. Hacked websites are serious threats to users because hackers can take complete control of a site, deface homepages, erase relevant content, or insert malware and harmful code. They may also record keystrokes, stealing login credentials for online banking or financial transactions. In 2017 we focused on reducing this threat, and were able to detect and remove from search results more than 80 percent of these sites. But hacking is not just a spam problem for search users—it affects the owners of websites as well. To help website owners keep their websites safe, we created a hands-on resource to help webmasters strengthen their websites’ security and revamped our help resources to help webmasters recover from a hacked website. The guides are available in 19 languages.

We also recognize the importance of robust content management systems (CMSs). A large percentage of websites run on one of several popular CMSs, and spammers have exploited them by finding ways to abuse their provisions for user-generated content, such as posting spam in comment sections or forums. We’re working closely with many of the providers of popular content management systems like WordPress and Joomla to help them fight spammers that abuse their forums, comment sections and websites.


Another abuse vector is the manipulation of links, one of the foundational ranking signals for Search. In 2017 we doubled down on our efforts to remove unnatural links via ranking improvements and scalable manual actions. We observed a year-over-year reduction in spam links of almost half.


Working with users and webmasters for a better web



We’re here to listen: Our automated systems are constantly working to detect and block spam. Still, we always welcome hearing from you when something seems … phishy. Last year, we were able to take action on nearly 90,000 user reports of search spam.


Reporting spam, malware and other issues you find helps us protect the site owner and other searchers from this abuse. You can file a spam report, a phishing report or a malware report. We very much appreciate these reports—a big THANK YOU to all of you who submitted them.


We also actively work with webmasters to maintain the health of the web ecosystem. Last year, we sent 45 million messages to registered website owners via Search Console letting them know about issues we had identified with their websites. More than 6 million of these messages were related to manual actions, providing transparency to webmasters so they understand why their sites received manual actions and how to resolve the issues.

Last year, we released a beta version of a new Search Console to a limited number of users, and afterwards to all users of Search Console. We listened to what matters most to users, and started with popular functionality such as Search performance, Index Coverage and others. These can help webmasters optimize their websites' Google Search presence more easily.

Through enhanced Safe Browsing protections, we continue to protect more users from bad actors online. In the last year, we made significant improvements to our Safe Browsing protection, such as broadening our protection of macOS devices, enabling predictive phishing protection in Chrome, cracking down on unwanted mobile software, and launching significant improvements to our ability to protect users from deceptive Chrome extension installation.


We have a multitude of channels to engage directly with webmasters. We have dedicated team members who meet with webmasters regularly both online and in-person. We conducted more than 250 online office hours, online events and offline events around the world in more than 60 cities to audiences totaling over 220,000 website owners, webmasters and digital marketers. In addition, our official support forum has answered a high volume of questions in many languages. Last year, the forum had 63,000 threads generating over 280,000 contributing posts by 100+ Top Contributors globally. For more details, see this post. Apart from the forums, blogs and the SEO starter guide, the Google Webmaster YouTube channel is another channel to find more tips and insights. We launched a new SEO snippets video series to help with short and to-the-point answers to specific questions. Be sure to subscribe to the channel!


Despite all these improvements, we know we’re not yet done. We’re relentless in our pursuit of an abuse-free user experience, and will keep improving our collaboration with the ecosystem to make it happen.



Posted by Cody Kwok, Principal Engineer


How listening to our users helped us build a better Search Console

The new Search Console beta is up and running. We’ve been flexing our listening muscles and finding new ways to incorporate your feedback into the design. In this new release, we've initially focused on building features that support users’ main goals, and we'll be expanding functionality in the months to come. While some changes have been long expected, like refreshing the UI with Material Design, many are the result of continuous work with you, the Search Console users.
We’ve used three main communication channels to hear what our users are saying:
  • Help forum Top Contributors - Top Contributors in our help forums have been invaluable in bringing up topics they see in the forums. They communicate regularly with Google’s Search teams, and help the large community of Search Console users.
  • Open feedback - We analyzed open feedback comments about classic Search Console and identified the top incoming requests. Open feedback can be sent via the ‘Submit feedback’ button in Search Console. This feedback helped us get more context around one of the top requests of recent years: more than 90 days of data in the Search Analytics (Performance) report. We learned of the need to compare to a similar period in the previous year, which confirmed that our decision to include 16 months of data might be on the right track.
  • Search Console panel - Last year we created a new communication channel by enlisting a group of four hundred randomly selected Search Console users, representing websites of all sizes. The panel members took part in almost every design iteration we had throughout the year, from explorations of new concepts through surveys, interviews and usability tests. The Search Console panel members have been providing valuable feedback which helped us test our assumptions and improve designs.
In one of these rounds we tested the newly proposed design for the Performance report. Specifically, we wanted to see whether it was clear how to use the ‘compare’ and ‘filter’ functionalities. To create an experience that felt as real as possible, we used a high-fidelity prototype connected to real data. The prototype allowed study participants to freely interact with the user interface before even one row of production code had been written.
In this study we learned that the ‘compare’ functionality was often overlooked. We consequently changed the design so that ‘filter’ and ‘compare’ appear in a unified dialog box, triggered when the ‘Add new’ chip is clicked. We continue to test this design and others to optimize its usability and usefulness.
We incorporated user feedback not only into practical design details, but also into architectural decisions. For example, user feedback led us to make major changes to the product’s core information architecture, influencing the navigation and product structure of the new Search Console. The error and coverage reports were originally separate, which could lead to multiple views of the same error. As a result of user feedback, we unified error and coverage reporting into one holistic view.
As the launch date grew closer, we performed several larger-scale experiments. We A/B tested some of the new Search Console reports against the existing reports with 30,000 users. We tracked issue fix rates to verify that the new Search Console drives better results, and sent out follow-up surveys to learn about users' experience. This most recent feedback confirmed that export functionality was not a nice-to-have but a requirement for many users, and helped us tune the detailed help pages in the initial release.
We are happy to announce that the new Search Console is now available to all sites. Whether it is through Search Console’s feedback button or through the user panel, we truly value a collaborative design process, where all of our users can help us build the best product.
Try out the new Search Console.
We're not finished yet! Which feature would you love to see in the next iteration of Search Console? Let us know below.

Introducing the new Webmaster Video Series

Google has a broad range of resources to help you better understand your website and improve its performance. This Webmaster Central Blog, the Help Center, the Webmaster forum, and the recently released Search Engine Optimization (SEO) Starter Guide are just a few.

We also have a YouTube channel, for answers to your questions in video format. To provide short and to-the-point answers to specific questions, we've just launched a new series, which we call SEO Snippets.

In this series of short videos, the Google team answers some of the webmaster and SEO questions that we regularly see on the Webmaster Central Help Forum. From 404 errors, how and when crawling works, and a site's URL structure, to duplicate content, we'll have something here for you.

Check out the links shared in the videos to get more helpful webmaster information, drop by our help forum and subscribe to our YouTube channel for more tips and insights!


The new Search Console: a sneak peek at two experimental features

Search Console was initially launched with just four reports more than a decade ago. Today, the product includes more than two dozen reports and tools covering AMP, structured data, and live testing tools, all designed to help improve your site's performance on Google Search.
Now we have decided to embark on an extensive redesign to better serve you, our users. Our hope is that this redesign will provide you with:
  • More actionable insights - We will now group the identified issues by what we suspect is the common “root-cause” to help you find where you should fix your code. We organize these issues into tasks that have a state (similar to bug tracking systems) so you can easily see whether the issue is still open, whether Google has detected your fix, and track the progress of re-processing the affected pages (a rough sketch of such an issue lifecycle follows this list).
  • Better support of your organizational workflow - As we talked to many organizations, we’ve learned that multiple people are typically involved in implementing, diagnosing, and fixing issues. This is why we are introducing sharing functionality that allows you to pick up an action item and share it with other people in your group, like developers who will get references to the code in question.
  • Faster feedback loops between you and Google - We’ve built a mechanism to allow you to iterate quickly on your fixes, without wasting time waiting for Google to recrawl your site only to tell you later that it’s not fixed yet. Rather, we’ll provide on-the-spot testing of fixes, and automatically speed up crawling once we see things are OK. Similarly, the testing tools will include code snippets and a search preview, so you can quickly see where your issues are, confirm you've fixed them, and see how the pages will look on Search.
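
To make the task-state idea above concrete, here is a rough sketch of such an issue lifecycle. The states and transitions are invented for illustration; they are not Search Console's actual internal model:

```python
from enum import Enum, auto

# Minimal sketch of the bug-tracking-style lifecycle described above.
# The states and transition rules are invented for illustration; they
# are not Search Console's actual internal model.

class IssueState(Enum):
    OPEN = auto()        # issue detected, fix not yet deployed
    VALIDATING = auto()  # site owner requested validation, recrawl pending
    FIXED = auto()       # Google has confirmed the fix on recrawl
    REGRESSED = auto()   # issue reappeared after being marked fixed

ALLOWED = {
    IssueState.OPEN: {IssueState.VALIDATING},
    IssueState.VALIDATING: {IssueState.FIXED, IssueState.OPEN},
    IssueState.FIXED: {IssueState.REGRESSED},
    IssueState.REGRESSED: {IssueState.VALIDATING},
}

def transition(current: IssueState, target: IssueState) -> IssueState:
    if target not in ALLOWED[current]:
        raise ValueError(f"cannot move {current.name} -> {target.name}")
    return target

state = IssueState.OPEN
state = transition(state, IssueState.VALIDATING)  # owner requests validation
state = transition(state, IssueState.FIXED)       # recrawl confirms the fix
print(state.name)
```
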
In the next few weeks, we're releasing two exciting BETA features from the new Search Console to a small set of users — Index Coverage report and AMP fixing flow.

The new Index Coverage report shows the count of indexed pages, information about why some pages could not be indexed, along with example pages and tips on how to fix indexing issues. It also enables a simple sitemap submission flow, and the capability to filter all Index Coverage data to any of the submitted sitemaps.
Here’s a peek of our new Index Coverage report:

The new AMP fixing flow

The new AMP fixing experience starts with the AMP Issues report. This report shows the current AMP issues affecting your site, grouped by the underlying error. Drill down into an issue to get more details, including sample affected pages. After you fix the underlying issue, click a button to verify your fix, and have Google recrawl the pages affected by that issue. Google will notify you of the progress of the recrawl, and will update the report as your fixes are validated.
As we start to experiment with these new features, some users will be introduced to the new redesign over the coming weeks.

Webmaster Forums Top AMP Questions

It has been busy here at Google Webmaster Central over the last few weeks, covering a lot of details about Accelerated Mobile Pages that we hope you have found useful. The topics have included:

We’ve also been seeing a few AMP questions coming to the Webmaster forums about getting started using AMP on Google Search. To help, we’ve compiled some of the most common questions we’ve seen:

Q: I’m considering creating AMP pages for my website. What is the benefit? What types of sites and pages is AMP for?

Users love content that loads fast and without any fuss - using the AMP format may make it more compelling for people to consume and engage with your content on mobile devices. Research has shown that 40% of users abandon a site that takes more than three seconds to load. The Washington Post observed an 88% decrease in article loading time and a 23% increase in returning users from mobile search from adopting AMP.

The AMP format is great for all types of static web content such as news, recipes, movie listings, product pages, reviews, videos, blogs and more.

Q: We are getting errors logged in Search Console for AMP pages; however, we already fixed these issues. Why are we still seeing errors?

The short answer is that changes to your AMP pages take about a week to be updated in Search Console. For a more in-depth answer on why, Google’s Webmaster Trends Analyst John Mueller shared a detailed post on Search Console latency challenges.

Q: Our AMP pages are not showing up on Google Search. What should we do?

Only valid AMP pages will be eligible to show on Google Search. Check the validity of your AMP pages by using the AMP HTML Web Validator, the Chrome or Opera extension, or a more automated process such as a cron job, to make sure all new content is valid.
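
For example, a scheduled job could fetch newly published pages and run them through the validator automatically. The sketch below assumes the Node.js amphtml-validator command-line tool is installed and on PATH; the URL list and file handling are placeholders:

```python
import subprocess
import tempfile
import urllib.request

# Sketch of an automated AMP validation pass, e.g. run from cron. It
# assumes the Node.js "amphtml-validator" CLI is installed and on PATH
# (npm install -g amphtml-validator); the URL list is a placeholder.

NEW_AMP_URLS = [
    "https://example.com/amp/article-1",
    "https://example.com/amp/article-2",
]

def validate(url: str) -> bool:
    html = urllib.request.urlopen(url).read()
    with tempfile.NamedTemporaryFile(suffix=".html") as f:
        f.write(html)
        f.flush()
        # A zero exit code means the page passed AMP validation.
        result = subprocess.run(
            ["amphtml-validator", f.name],
            capture_output=True, text=True,
        )
    print(result.stdout.strip())
    return result.returncode == 0

failures = [u for u in NEW_AMP_URLS if not validate(u)]
if failures:
    print("invalid AMP pages:", failures)
```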

While it’s good practice overall to include schema.org structured data in your AMP pages (we recommend JSON-LD), it's especially important for news publishers. News content that includes valid markup properties is eligible to be shown within the Top Stories section of Google Search results. To test your structured data, try using the structured data testing tool.
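
As a minimal example of the kind of markup involved, the snippet below builds a schema.org NewsArticle JSON-LD block in Python; all field values are placeholders, and the structured data testing tool remains the authority on required properties:

```python
import json

# Minimal illustration of schema.org NewsArticle markup in JSON-LD, the
# format recommended above. All field values are placeholders; use the
# structured data testing tool to confirm required properties.

article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "datePublished": "2016-10-01T08:00:00+00:00",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "image": ["https://example.com/lead-image.jpg"],
    "publisher": {
        "@type": "Organization",
        "name": "Example News",
        "logo": {
            "@type": "ImageObject",
            "url": "https://example.com/logo.png",
        },
    },
}

# Embed the serialized object in the page's <head>.
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(article)
    + "</script>"
)
print(snippet)
```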

If you have more questions that are not answered here, share your feedback in the comments below or on our Google Webmasters Google+ page. Or as usual, feel free to post in our Webmasters Help Forum.

An update on the Webmaster Central Blog

We’ve got a new URL!

You may have noticed the Google Webmaster Central blog has a new address: webmasters.googleblog.com.

That’s because starting today, Google is moving its blogs to a new domain to help people recognize when they’re reading an official blog from Google. These changes will roll out to all of Google’s blogs over time.

The previous address will redirect to the new domain, so your bookmarks and links will continue to work. Unfortunately, as with a custom domain change in Blogger, the Google+ comments on the blogs have been reset.

Thanks as always for reading—we’ll see you here again soon at webmasters.googleblog.com!