
Launching SEO Audit category in Lighthouse Chrome extension


We're happy to announce that we are introducing another audit category to the Lighthouse Chrome Extension: SEO Audits.

Lighthouse is an open-source, automated auditing tool for improving the quality of web pages. It provides a well-lit path for improving site quality by allowing developers to run audits for performance, accessibility, progressive web app compatibility, and more. Basically, it "keeps you from crashing into the rocks", hence the name Lighthouse.

The SEO audit category within Lighthouse enables developers and webmasters to run a basic SEO health-check for any web page that identifies potential areas for improvement. Lighthouse runs locally in your Chrome browser, enabling you to run the SEO audits on pages in a staging environment as well as on live pages, public pages and pages that require authentication.

Bringing SEO best practices to you

The current list of SEO audits is not exhaustive, nor does it make any SEO guarantees for Google Search or other search engines. The audits were designed to validate and reflect the SEO basics that every site should get right, and they provide detailed guidance to developers and SEO practitioners of all skill levels. In the future, we hope to add more, and more in-depth, audits and guidance; let us know if you have suggestions for specific audits you'd like to see!

How to use it

Currently there are two ways to run these audits.

Using the Lighthouse Chrome Extension:

  1. Install the Lighthouse Chrome Extension
  2. Click on the Lighthouse icon in the extension bar 
  3. Select the Options menu, tick “SEO” and click OK, then Generate report

Running SEO Audits in Lighthouse extension


Using Chrome Developer Tools on Chrome Canary:
  1. Open Chrome Developer Tools 
  2. Go to Audits 
  3. Click Perform an audit 
  4. Tick the “SEO” checkbox and click Run Audit

Running SEO Audits in Chrome Canary
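
Besides the two flows above, the same audits can be scripted from Node with the open-source lighthouse npm module and chrome-launcher. The sketch below is a minimal illustration rather than an official recipe; the target URL and Chrome flags are placeholders.

```typescript
// Minimal sketch: run only the SEO category of Lighthouse from Node.
// Assumes the "lighthouse" and "chrome-launcher" npm packages are installed;
// the target URL and flags below are placeholders.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function runSeoAudit(url: string): Promise<void> {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  try {
    const result = await lighthouse(url, {
      port: chrome.port,
      onlyCategories: ['seo'],   // skip performance, PWA, and the other categories
      output: 'json',
    });
    const score = result?.lhr.categories.seo.score ?? 0;
    console.log(`SEO score for ${url}: ${Math.round(score * 100)}/100`);
    // Individual audit results (document title, meta description, and so on)
    // are available under result?.lhr.audits.
  } finally {
    await chrome.kill();
  }
}

runSeoAudit('https://example.com/').catch(console.error);
```

This is handy for checking staging pages from a CI job, since it produces the same report as the extension.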

The current Lighthouse Chrome extension contains an initial set of SEO audits which we’re planning to extend and enhance in the future. Once we're confident of its functionality, we’ll make the audits available by default in the stable release of Chrome Developer Tools.

We hope you find this functionality useful for your current and future projects. If these basic SEO tips are totally new to you and you find yourself interested in this area, make sure to read our complete SEO Starter Guide! Leave your feedback and suggestions in the comments section below, on GitHub or on our Webmaster forum.

Happy auditing!

Posted by Valentyn, Webmaster Outreach Strategist.

Using page speed in mobile search ranking

People want to be able to find answers to their questions as fast as possible — studies show that people really care about the speed of a page. Although speed has been used in ranking for some time, that signal was focused on desktop searches. Today we’re announcing that starting in July 2018, page speed will be a ranking factor for mobile searches.

The “Speed Update,” as we’re calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries. It applies the same standard to all pages, regardless of the technology used to build the page. The intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content.

We encourage developers to think broadly about how performance affects a user's experience of their page and to consider a variety of user experience metrics. Although there is no tool that directly indicates whether a page is affected by this new ranking factor, here are some resources that can be used to evaluate a page's performance.

  • Chrome User Experience Report, a public dataset of key user experience metrics for popular destinations on the web, as experienced by Chrome users under real-world conditions
  • Lighthouse, an automated tool and a part of Chrome Developer Tools for auditing the quality (performance, accessibility, and more) of web pages
  • PageSpeed Insights, a tool that indicates how well a page performs on the Chrome UX Report and suggests performance optimizations
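
If you want to check pages programmatically, the PageSpeed Insights API exposes both the Lighthouse lab analysis and Chrome UX Report field data for a URL. The sketch below is illustrative only; the endpoint version and response fields shown are assumptions that may change over time, and an API key may be needed for regular use.

```typescript
// Illustrative sketch: query the PageSpeed Insights API for a URL.
// The v5 endpoint and response fields shown here are assumptions that may
// change between API versions; an API key may be required for heavier use.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function checkPageSpeed(url: string): Promise<void> {
  const params = new URLSearchParams({ url, strategy: 'mobile' });
  const response = await fetch(`${PSI_ENDPOINT}?${params}`);
  const data = await response.json();

  // Field data from the Chrome UX Report, when available for this URL.
  console.log('Field-data category:', data.loadingExperience?.overall_category);
  // Lab data from Lighthouse (score between 0 and 1).
  console.log('Lighthouse performance score:',
    data.lighthouseResult?.categories?.performance?.score);
}

checkPageSpeed('https://example.com/').catch(console.error);
```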

As always, if you have any questions or feedback, please visit our webmaster forums.

Real-world data in PageSpeed Insights

PageSpeed Insights provides information about how well a page adheres to a set of best practices. In the past, these recommendations were presented without the context of how fast the page performed in the real world, which made it hard to understand when it was appropriate to apply these optimizations. Today, we're announcing that PageSpeed Insights will use data from the Chrome User Experience Report to make better recommendations for developers, and that the Optimization score has been tuned to align more closely with real-world data.

The PSI report now has several different elements:

  • The Speed score categorizes a page as being Fast, Average, or Slow. This is determined by looking at the median value of two metrics: First Contentful Paint (FCP) and DOM Content Loaded (DCL). If both metrics are in the top one-third of their category, the page is considered fast. (The sketch after this list shows how to observe both metrics for a single page load.)
  • The Optimization score categorizes a page as being Good, Medium, or Low by estimating its performance headroom. The calculation assumes that a developer wants to keep the same appearance and functionality of the page.
  • The Page Load Distributions section presents how this page’s FCP and DCL events are distributed in the data set. These events are categorized as Fast (top third), Average (middle third), and Slow (bottom third) by comparing to all events in the Chrome User Experience Report.
  • The Page Stats section describes the round trips required to load the page’s render-blocking resources, the total bytes used by the page, and how it compares to the median number of round trips and bytes used in the dataset. It can indicate if the page might be faster if the developer modifies the appearance and functionality of the page.
  • Optimization Suggestions is a list of best practices that could be applied to this page. If the page is fast, these suggestions are hidden by default, as the page is already in the top third of all pages in the data set.
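
To relate these categories to your own pages, both metrics can be observed in the browser with standard web APIs. The sketch below (to be run in the page, not in Node) is a rough, single-load illustration, whereas PSI's Speed score is based on aggregated Chrome User Experience Report data.

```typescript
// Rough in-page sketch: observe First Contentful Paint (FCP) and the
// DOMContentLoaded (DCL) timing for a single page load using standard APIs.
// These are one-off lab numbers; PSI's Speed score uses aggregated CrUX data.

// FCP is reported as a 'paint' performance entry named 'first-contentful-paint'.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      console.log(`FCP: ${Math.round(entry.startTime)} ms`);
    }
  }
}).observe({ type: 'paint', buffered: true });

// DCL: time from the navigation start until the DOMContentLoaded event fires.
window.addEventListener('DOMContentLoaded', () => {
  console.log(`DCL: ${Math.round(performance.now())} ms`);
});
```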

For more details on these changes, see About PageSpeed Insights. As always, if you have any questions or feedback, please visit our forums and please remember to include the URL that is being evaluated.


Introducing the new Search Console

A few months ago we released a beta version of a new Search Console experience to a limited number of users. We are now starting to release this beta version to all users of Search Console, so that everyone can explore this simplified process of optimizing a website's presence on Google Search. The functionality will include Search Performance, Index Coverage, AMP status, and Job posting reports. We will send a message once your site is ready in the new Search Console.

We started by adding some of the most popular functionality to the new Search Console, which you can now use in your day-to-day workflow. We are not done yet, so over the course of the year we will continue to bring functionality from the classic Search Console into the new Search Console (beta). Until the new Search Console is complete, both versions will live side by side and will be easily interconnected via links in the navigation bar, so you can use both.

The new Search Console was rebuilt from the ground up by surfacing the most actionable insights and creating an interaction model that guides you through the process of fixing any pending issues. We've also added the ability to share reports within your own organization in order to simplify internal collaboration.

Search Performance: with 16 months of data!


If you've been a fan of Search Analytics, you'll love the new Search Performance report. Over the years, users have consistently asked us for more data in Search Analytics. With the new report, you'll have 16 months of data, making it easier to analyze longer-term trends and enabling year-over-year comparisons. In the near future, this data will also be available via the Search Console API.
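
If you already work with Search Analytics data programmatically, the existing Search Console API can return clicks, impressions, CTR, and position for a date range. The sketch below is a minimal illustration; it assumes you have an OAuth access token with the appropriate scope, and the site URL, dates, and dimensions are placeholders.

```typescript
// Minimal sketch: query the Search Console Search Analytics API for a site.
// Assumes an OAuth 2.0 access token with the webmasters scope is available;
// the site URL, dates, and dimensions below are illustrative placeholders.
async function topQueries(siteUrl: string, accessToken: string): Promise<void> {
  const endpoint =
    `https://www.googleapis.com/webmasters/v3/sites/${encodeURIComponent(siteUrl)}/searchAnalytics/query`;

  const response = await fetch(endpoint, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      startDate: '2017-01-01',   // longer ranges become useful with 16 months of data
      endDate: '2017-12-31',
      dimensions: ['query'],
      rowLimit: 10,
    }),
  });

  const data = await response.json();
  for (const row of data.rows ?? []) {
    console.log(row.keys[0], row.clicks, row.impressions);
  }
}
```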

Index Coverage: a comprehensive view of Google's indexing


The updated Index Coverage report gives you insight into the indexing of URLs from your website. It shows correctly indexed URLs, warnings about potential issues, and reasons why Google isn't indexing some URLs. The report is built on our new Issue tracking functionality that alerts you when new issues are detected and helps you monitor their fix.

So how does that work?

  1. When you drill down into a specific issue, you will see a sample of URLs from your site. Clicking on error URLs brings up the page details, with links to diagnostic tools that help you understand the source of the problem.
  2. Fixing Search issues often involves multiple teams within a company. Giving the right people access to information about the current status, or about issues that have come up, is critical to improving an implementation quickly. Now, within most reports in the new Search Console, you can do that with the share button at the top of the report, which creates a shareable link to the report. Once things are resolved, you can disable sharing just as easily.
  3. The new Search Console can also help you confirm that you've resolved an issue, and help us to update our index accordingly. To do this, select a flagged issue, and click validate fix. Google will then crawl and reprocess the affected URLs with a higher priority, helping your site to get back on track faster than ever.
  4. The Index Coverage report works best for sites that submit sitemap files. Sitemap files are a great way to let search engines know about new and updated URLs. Once you've submitted a sitemap file, you can now use the sitemap filter over the Index Coverage data, so that you're able to focus on an exact list of URLs.
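
For reference, a sitemap file is simply an XML list of your canonical URLs. The sketch below shows one minimal way to generate such a file; the URLs and output path are placeholders, and most sites would generate this from their CMS or build pipeline instead.

```typescript
// Minimal sketch: generate a basic sitemap.xml from a list of URLs.
// The URLs and file path are placeholders; real sites usually generate this
// from a CMS, database, or build step, and then submit it in Search Console.
import { writeFileSync } from 'fs';

function buildSitemap(urls: { loc: string; lastmod?: string }[]): string {
  const entries = urls
    .map((u) => [
      '  <url>',
      `    <loc>${u.loc}</loc>`,
      u.lastmod ? `    <lastmod>${u.lastmod}</lastmod>` : '',
      '  </url>',
    ].filter(Boolean).join('\n'))
    .join('\n');

  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    '</urlset>',
  ].join('\n');
}

writeFileSync('sitemap.xml', buildSitemap([
  { loc: 'https://example.com/', lastmod: '2018-01-10' },
  { loc: 'https://example.com/jobs/', lastmod: '2018-01-08' },
]));
```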

Search Enhancements: improve your AMP and Job Postings pages

The new Search Console is also aimed at helping you implement Search Enhancements such as AMP and Job Postings (more to come). These reports provide details on the specific errors and warnings that Google identified for these topics. In addition to the functionality described in the Index Coverage report, we augmented the reports with two extra features:

  • The first feature is aimed at providing faster feedback in the process of fixing an issue. This is achieved by running several instantaneous tests once you click the validate fix button. If your pages don't pass this test, we provide you with an immediate notification; otherwise, we go ahead and reprocess the rest of the affected pages.
  • The second new feature is aimed at providing positive feedback during the fix process by expanding the validation log with a list of URLs that were identified as fixed (in addition to URLs that failed the validation or might still be pending).

Similar to the AMP report, the new Search Console provides a Job postings report. If you have jobs listings on your website, you may be eligible to have those shown directly through Google for Jobs (currently only available in certain locations).

Feedback welcome

We couldn’t have gotten so far without the ongoing feedback from our diligent trusted testers (we plan to share more on how their feedback helped us dramatically improve Search Console). However, your continued feedback is critical for us: if there's something you find confusing or wrong, or if there's something you really like, please let us know through the feedback feature in the sidebar. Also note that the mobile experience in the new Search Console is still a work in progress.

We want to end this post by sharing an encouraging response we received from a user who has been testing the new Search Console:

"The UX of new Search Console is clean and well laid out, everything is where we expect it to be. I can even kick-off validation of my fixes and get email notifications with the result. It’s been a massive help in fixing up some pesky AMP errors and warnings that were affecting pages on our site. On top of all this, the Search Analytics report now extends to 16 months of data which is a total game changer for us" - Noah Szubski, Chief Product Officer, DailyMail.com

Are there any other tools that would make your life as a webmaster easier? Let us know in the comments here, and feel free to jump into our webmaster help forums to discuss your ideas with others!


Introducing the new Webmaster Video Series

Google has a broad range of resources to help you better understand your website and improve its performance. This Webmaster Central Blog, the Help Center, the Webmaster forum, and the recently released Search Engine Optimization (SEO) Starter Guide are just a few.

We also have a YouTube channel with answers to your questions in video format. To provide short, to-the-point answers to specific questions, we've just launched a new series, which we call SEO Snippets.

In this series of short videos, the Google team will be answering some of the webmaster and SEO questions that we regularly see on the Webmaster Central Help Forum. From 404 errors and how and when crawling works, to a site's URL structure and duplicate content, we'll have something here for you.

Check out the links shared in the videos to get more helpful webmaster information, drop by our help forum and subscribe to our YouTube channel for more tips and insights!


Introducing Rich Results & the Rich Results Testing Tool

Over the years, the different ways you can choose to highlight your website's content in Search have grown dramatically. In the past, we've called these rich snippets, rich cards, or enriched results. Going forward, to simplify the terminology, our documentation will use the name "rich results" for all of them. Additionally, we're introducing a new rich results testing tool to make diagnosing your pages' structured data easier.

The new testing tool focuses on the structured data types that are eligible to be shown as rich results. It allows you to test all data sources on your pages, such as JSON-LD (which we recommend), Microdata, or RDFa. The new tool provides a more accurate reflection of the page's appearance on Search and includes improved handling for structured data found in dynamically loaded content. Tests for Recipes, Jobs, Movies, and Courses are currently supported; this is just a first step, and we plan to expand coverage over time.
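
As a concrete illustration of what the tool checks, structured data for a recipe can be embedded as JSON-LD in a script tag. The sketch below is a minimal, hypothetical example using schema.org Recipe properties; the values are placeholders, and real pages usually include additional recommended fields.

```typescript
// Minimal, hypothetical example: JSON-LD for a schema.org Recipe, injected
// into the page head. Values are placeholders; real markup usually includes
// additional recommended properties (images, ratings, nutrition, and so on).
const recipeJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'Recipe',
  name: 'Simple Banana Bread',
  author: { '@type': 'Person', name: 'Jane Baker' },
  datePublished: '2018-01-15',
  description: 'A quick banana bread recipe.',
  recipeIngredient: ['3 ripe bananas', '2 cups flour', '1 cup sugar'],
  recipeInstructions: [
    { '@type': 'HowToStep', text: 'Mash the bananas and mix with the dry ingredients.' },
    { '@type': 'HowToStep', text: 'Bake at 175°C for about an hour.' },
  ],
};

const script = document.createElement('script');
script.type = 'application/ld+json';
script.textContent = JSON.stringify(recipeJsonLd);
document.head.appendChild(script);
```

Pages marked up like this can then be run through the new testing tool to confirm the data is parsed as expected.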

Testing a page is easy: just open the testing tool, enter a URL, and review the output. If there are issues, the tool will highlight the invalid code in the page source. If you're working with others on this page, the share icon on the bottom right lets you share the results quickly. You can also use the preview button to view all the different rich results the page is eligible for. And once you're happy with the result, use Submit To Google to fetch and index this page for Search.

Want to get started with rich results? Check out our guides for marking up your content. Feel free to drop by our Webmaster Help forums should you have any questions or get stuck; the awesome experts there can often help resolve issues and give you tips in no time!


#NoHacked 3.0: Fixing common hack cases

So far on #NoHacked, we have shared some tips on detection and prevention. Now that you are able to detect a hack attack, we would like to introduce some common hacking techniques and guides on how to fix them!

  • Fixing the Cloaked Keywords and Links Hack
    The cloaked keywords and links hack automatically creates many pages with nonsensical sentences, links, and images. These pages sometimes contain basic template elements from the original site, so at first glance, the pages might look like normal parts of the target site until you read the content. In this type of attack, hackers usually use cloaking techniques to hide the malicious content and make the injected page appear as part of the original site or a 404 error page. (A quick way to check for this kind of cloaking is sketched after this list.)
  • Fixing the Gibberish Hack
    The gibberish hack automatically creates many pages with nonsensical sentences filled with keywords on the target site. Hackers do this so the hacked pages show up in Google Search. Then, when people try to visit these pages, they'll be redirected to an unrelated page, like a porn site for example.
  • Fixing the Japanese Keywords Hack
    The Japanese keywords hack typically creates new pages with Japanese text on the target site, in randomly generated directory names. These pages are monetized using affiliate links to stores selling fake brand merchandise and are then shown in Google Search. Sometimes the hackers' accounts get added as site owners in Search Console.
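
One quick, if imperfect, way to check for the cloaking used in these hacks is to compare what your server returns to a regular browser with what it returns to a crawler-style user-agent. The sketch below is only a rough heuristic (hacked sites often cloak by IP address or referrer as well); Fetch as Google in Search Console remains the more reliable check.

```typescript
// Rough heuristic sketch: fetch a page twice, once with a browser-like
// user-agent and once with a Googlebot-like user-agent, and compare sizes.
// Cloaking is often keyed to IP or referrer too, so a match here is not
// proof that the site is clean; Fetch as Google is a more reliable check.
async function fetchSize(url: string, userAgent: string): Promise<number> {
  const response = await fetch(url, { headers: { 'User-Agent': userAgent } });
  return (await response.text()).length;
}

async function compareCloaking(url: string): Promise<void> {
  const browserUA = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)';
  const crawlerUA = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

  const [asBrowser, asCrawler] = await Promise.all([
    fetchSize(url, browserUA),
    fetchSize(url, crawlerUA),
  ]);

  console.log(`As browser: ${asBrowser} bytes, as crawler: ${asCrawler} bytes`);
  if (Math.abs(asBrowser - asCrawler) > asBrowser * 0.2) {
    console.log('Large difference: inspect the crawler response for injected content.');
  }
}

compareCloaking('https://example.com/some-page').catch(console.error);
```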

Lastly, after you clean your site and fix the problem, make sure to file a reconsideration request to have our teams review your site.

If you have any questions, post them on our Webmaster Help Forums!

Getting your site ready for mobile-first indexing

When we announced almost a year ago that we're experimenting with mobile-first indexing, we said we'd update publishers about our progress, something that we've done over the past few months through public talks, office-hours Hangouts on Air, and conferences like Pubcon.

To recap, currently our crawling, indexing, and ranking systems typically look at the desktop version of a page's content, which may cause issues for mobile searchers when that version is vastly different from the mobile version. Mobile-first indexing means that we'll use the mobile version of the content for indexing and ranking, to better help our – primarily mobile – users find what they're looking for. Webmasters will see significantly increased crawling by Smartphone Googlebot, and the snippets in the results, as well as the content on the Google cache pages, will be from the mobile version of the pages.

As we said, sites that make use of responsive web design and correctly implement dynamic serving (that include all of the desktop content and markup) generally don't have to do anything. Here are some extra tips that help ensure a site is ready for mobile-first indexing:
  • Make sure the mobile version of the site also has the important, high-quality content. This includes text, images (with alt attributes), and videos, in the usual crawlable and indexable formats.
  • Structured data is important for indexing and search features that users love: it should be both on the mobile and desktop version of the site. Ensure URLs within the structured data are updated to the mobile version on the mobile pages.
  • Metadata should be present on both versions of the site. It provides hints about the content on a page for indexing and serving. For example, make sure that titles and meta descriptions are equivalent across both versions of all pages on the site.
  • No changes are necessary for interlinking with separate mobile URLs (m-dot sites): keep the existing link rel=canonical and link rel=alternate elements between these versions (see the sketch after this list).
  • Check hreflang links on separate mobile URLs. When using link rel=hreflang elements for internationalization, link between mobile and desktop URLs separately. Your mobile URLs' hreflang should point to the other language/region versions on other mobile URLs, and similarly link desktop with other desktop URLs using hreflang link elements there.
  • Ensure the servers hosting the site have enough capacity to handle potentially increased crawl rate. This doesn't affect sites that use responsive web design and dynamic serving, only sites where the mobile version is on a separate host, such as m.example.com.
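
To make the interlinking and hreflang advice concrete, the sketch below shows (as template strings) the kind of annotations a desktop page and its separate mobile URL typically already carry. The URLs are placeholders; this illustrates the existing, unchanged pattern, not anything new required for mobile-first indexing.

```typescript
// Illustrative sketch of the existing annotations between a desktop page and
// its separate mobile URL (m-dot). URLs are placeholders; nothing new is
// required for mobile-first indexing, this is the same pattern as before.

// On the desktop page https://www.example.com/page/ :
const desktopHead = `
  <link rel="alternate" media="only screen and (max-width: 640px)"
        href="https://m.example.com/page/">
  <link rel="alternate" hreflang="en" href="https://www.example.com/page/">
  <link rel="alternate" hreflang="de" href="https://www.example.com/de/page/">
`;

// On the mobile page https://m.example.com/page/ :
const mobileHead = `
  <link rel="canonical" href="https://www.example.com/page/">
  <link rel="alternate" hreflang="en" href="https://m.example.com/page/">
  <link rel="alternate" hreflang="de" href="https://m.example.com/de/page/">
`;

console.log(desktopHead, mobileHead);
```
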
We will be evaluating sites independently on their readiness for mobile-first indexing based on the above criteria and transitioning them when ready. This process has already started for a handful of sites and is being closely monitored by the search team.

We continue to be cautious with rolling out mobile-first indexing. We believe taking this slowly will help webmasters get their sites ready for mobile users, and because of that, we currently don't have a timeline for when it's going to be completed. If you have any questions, drop by our Webmaster forums or our public events.

Posted by Gary


#NoHacked 3.0: Tips on prevention

Last week on #NoHacked, we shared tips on hack detection and the reasons why you might get hacked. This week we focus on prevention, and here are some tips for you!

  • Be mindful of your sources! Be very careful with free "premium" themes and plugins!

You have probably heard about free premium plugins! If you've ever stumbled upon a site offering, for free, plugins you normally have to purchase, be very careful. Many hackers lure you in by copying a popular plugin and adding backdoors or malware that allow them to access your site. Read more about a similar case on the Sucuri blog. Additionally, even legitimate, good-quality plugins and themes can become dangerous if:

  • you do not update them as soon as a new version becomes available
  • the developer of the theme or plugin stops updating them, and they become outdated over time.

In any case, keeping all of your site's software up to date is essential to keeping hackers out of your website.

  • Botnets in WordPress
    A botnet is a cluster of machines, devices, or websites under the control of a third party, often used to commit malicious acts such as running spam campaigns, click fraud, or DDoS attacks. It's difficult to detect if your site has been infected by a botnet because there are often no specific changes to your site. However, your site's reputation, resources, and data are at risk if your site is part of a botnet. Learn more about botnets, how to detect them, and how they can affect your site in the botnets in WordPress and Joomla article.

As usual, if you have any questions, post them on our Webmaster Help Forums for help from the friendly community, and see you next week!