
Send your recipes to the Google Assistant

Last year, we launched Google Home with recipe guidance, providing users with step-by-step instructions for cooking recipes. With more people using Google Home every day, we're publishing new guidelines so your recipes can support this voice guided experience. You may receive traffic from more sources, since users can now discover your recipes through the Google Assistant on Google Home. The updated structured data properties provide users with more information about your recipe, resulting in higher quality traffic to your site.

Updated recipe properties to help users find your recipes

We updated our recipe developer documentation to help users find your recipes and experience them with Google Search and the Google Assistant on Google Home. This can bring more potential traffic to your site. To ensure that users can access your recipe in more ways, we need more information about your recipe. We now recommend the following properties:

  • Videos: Show users how to make the dish by adding a video array
  • Category: Tell users the type of meal or course of the dish (for example, "dinner", "dessert", "entree")
  • Cuisine: Specify the region associated with your recipe (for example, "Mediterranean", "American", "Cantonese")
  • Keywords: Add other terms for your recipe such as the season ("summer"), the holiday ("Halloween", "Diwali"), the special event ("wedding", "birthday"), or other descriptors ("quick", "budget", "authentic")

We also added more guidance for recipeInstructions. You can specify each step of the recipe with the HowToStep property, and sections of steps with the HowToSection property.
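
For illustration, here's a minimal JSON-LD sketch of a recipe page using the recommended properties. All names, URLs, and values are made-up placeholders, and real markup has additional requirements (for example, VideoObject has its own required properties); refer to the developer documentation for the authoritative list:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org/",
      "@type": "Recipe",
      "name": "Simple Summer Salad",
      "recipeCategory": "entree",
      "recipeCuisine": "Mediterranean",
      "keywords": "summer, quick, budget",
      "video": [{
        "@type": "VideoObject",
        "name": "How to make a simple summer salad",
        "description": "Assembling the salad, step by step.",
        "thumbnailUrl": "https://example.com/salad-thumb.jpg",
        "contentUrl": "https://example.com/salad-video.mp4",
        "uploadDate": "2018-02-01"
      }],
      "recipeIngredient": [
        "2 cups chopped romaine",
        "1/4 cup olive oil",
        "1 tablespoon lemon juice"
      ],
      "recipeInstructions": [
        {
          "@type": "HowToSection",
          "name": "Make the dressing",
          "itemListElement": [
            { "@type": "HowToStep", "text": "Whisk the olive oil and lemon juice together." }
          ]
        },
        { "@type": "HowToStep", "text": "Toss the romaine with the dressing and serve." }
      ]
    }
    </script>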

Add recipe instructions and ingredients for the Google Assistant

We now require the recipeIngredient and recipeInstructions properties if you want to support the Google Assistant on Google Home. Adding these properties can make your recipe eligible for integration with the Google Assistant, enabling more users to discover your recipes. If your recipe doesn't have these properties, it won't be eligible for guidance with the Google Assistant, but it can still be eligible to appear in Search results.

For more information, visit our Recipe developer documentation. If you have questions about the feature, please ask us in the Webmaster Help Forum.

We updated our job posting guidelines

Last year, we launched job search on Google to connect more people with jobs. When you provide Job Posting structured data, it helps drive more relevant traffic to your page by connecting job seekers with your content. To ensure that job seekers are getting the best possible experience, it's important to follow our Job Posting guidelines.

We've recently made some changes to our Job Posting guidelines to help improve the job seeker experience.

  • Remove expired jobs
  • Place structured data on the job's detail page
  • Make sure all job details are present in the job description

Remove expired jobs

When job seekers put in effort to find a job and apply, it can be very discouraging to discover that the job they wanted is no longer available. Sometimes, job seekers only discover that a job posting has expired after deciding to apply. Removing expired jobs from your site may drive more traffic, because job seekers can be confident that jobs they find on your site are still open for application. For more information on how to remove a job posting, see Remove a job posting.


Place structured data on the job's detail page

Job seekers find it confusing when they land on a list of jobs instead of a specific job's detail page. To fix this, put structured data on the most detailed leaf page possible. Don't add structured data to pages that present a list of jobs (for example, search results pages); add it only to the most specific page describing a single job and its relevant details.

Make sure all job details are present in the job description

We've also noticed that some sites include information in the JobPosting structured data that is not present anywhere in the job posting. Job seekers are confused when the job details they see in Google Search don't match the job description page. Make sure that the information in the JobPosting structured data always matches what's on the job posting page. Here are some examples:

  • If you add salary information to the structured data, then also add it to the job posting. Both salary figures should match.
  • The location in the structured data should match the location in the job posting.

Providing structured data content that is consistent with the content of the job posting pages not only helps job seekers find the exact job that they were looking for, but may also drive more relevant traffic to your job postings and therefore increase the chances of finding the right candidates for your jobs.
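
As a sketch, the detail page for a single job might carry JSON-LD like the following, with the salary and location matching what the page visibly shows. All values here are hypothetical; validThrough is the property that marks when a posting expires:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org/",
      "@type": "JobPosting",
      "title": "Software Engineer",
      "description": "<p>The full job description, matching the visible posting.</p>",
      "datePosted": "2018-06-01",
      "validThrough": "2018-07-01T00:00",
      "employmentType": "FULL_TIME",
      "hiringOrganization": {
        "@type": "Organization",
        "name": "Example Corp",
        "sameAs": "https://example.com"
      },
      "jobLocation": {
        "@type": "Place",
        "address": {
          "@type": "PostalAddress",
          "streetAddress": "123 Main St",
          "addressLocality": "Mountain View",
          "addressRegion": "CA",
          "postalCode": "94043",
          "addressCountry": "US"
        }
      },
      "baseSalary": {
        "@type": "MonetaryAmount",
        "currency": "USD",
        "value": {
          "@type": "QuantitativeValue",
          "value": 40.00,
          "unitText": "HOUR"
        }
      }
    }
    </script>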

If your site violates the Job Posting guidelines (including the guidelines in this blog post), we may take manual action against your site and it may not be eligible for display in the jobs experience on Google Search. You can submit a reconsideration request to let us know that you have fixed the problem(s) identified in the manual action notification. If your request is approved, the manual action will be removed from your site or page.

For more information, visit our Job Posting developer documentation and our JobPosting FAQ.

Protect your site from user generated spam

As a website owner, you might have come across auto-generated content in comments sections or forum threads. When such content is created on your pages, not only does it disrupt those visiting your site, but it also exposes content to Google and other search engines that you may not want associated with your site.


In this blog post, we will give you tips to help you deal with this type of spam on your site and forum.


Some spammers abuse sites owned by others by posting deceptive content and links, in an attempt to get more traffic to their own sites.


Comments and forum threads can be a really good source of information and an efficient way of engaging a site's users in discussions. This valuable content should not be buried by auto-generated keywords and links placed there by spammers.


There are many ways of securing your site’s forums and comment threads and making them unattractive to spammers:


  • Keep your forum software updated and patched. Take the time to keep your software up-to-date and pay special attention to important security updates. Spammers take advantage of security issues in older versions of blogs, bulletin boards, and other content management systems.

  • Add a CAPTCHA. CAPTCHAs require users to confirm that they're human and not an automated script. One way to do this is to use a service like reCAPTCHA, Securimage, or Jcaptcha.
  • Block suspicious behavior. Many forums allow you to set time limits between posts, and you can often find plugins to look for excessive traffic from individual IP addresses or proxies and other activity more common to bots than human beings. For example, phpBB, Simple Machines, myBB, and many other forum platforms enable such configurations.

  • Check your forum’s top posters on a daily basis. If a user joined recently and has an excessive number of posts, you should probably review their profile and make sure that their posts and threads are not spammy.

  • Consider disabling some types of comments. For example, it’s good practice to close very old forum threads that are unlikely to get legitimate replies.
    If you don’t plan to monitor your forum going forward and users are no longer interacting with it, turning off posting completely may prevent spammers from abusing it.

  • Make good use of moderation capabilities. Consider enabling moderation features that require users to have a certain reputation before they can post links, or that hold comments containing links for review.
    If possible, change your settings to disallow anonymous posting and to require approval for posts from new users before they're publicly visible.
    Moderators, together with friends, colleagues, and other trusted users, can help you review and approve posts while spreading the workload. Keep an eye on your forum's new users by looking at their posts and activities.

  • Consider blacklisting obviously spammy terms. Block obviously inappropriate comments with a blacklist of spammy terms (e.g., illegal streaming or pharma-related terms). Add inappropriate and off-topic terms that are used only by spammers, and learn from the spam posts that you often see on your forum or on other forums. Built-in features or plugins can delete or mark comments as spam for you.

  • Use the "nofollow" attribute for links in the comment field. This will deter spammers from targeting your site. By default, many blogging sites (such as Blogger) automatically add this attribute to any posted comments. (See the sketch after this list.)

  • Use automated systems to defend your site. Comprehensive systems like Akismet, which has plugins for many blogs and forum systems, are easy to install and do most of the work for you.
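
As a minimal sketch (the URL and anchor text are hypothetical), a user-submitted comment link carrying the attribute looks like this:

    <!-- A comment link that won't pass PageRank to its target -->
    <p>Great tips! More on this at
      <a href="https://example.com/some-page" rel="nofollow">my blog</a>.</p>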




For detailed information about these topics, check out our Help Center document on User Generated Spam and comment spam. You can also visit our Webmaster Central Help Forum if you need any help.



A reminder about widget links

Google has long taken a strong stance against links that manipulate a site’s PageRank. Today we would like to reiterate our policy on the creation of keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites.

Widgets can help website owners enrich the experience of their site and engage users. However, some widgets add links to a site that a webmaster did not editorially place and contain anchor text that the webmaster does not control. Because these links are not naturally placed, they're considered a violation of Google Webmaster Guidelines.

Below are examples of widgets that contain links violating Google Webmaster Guidelines:

[Example images of violating widget links]

Google's webspam team may take manual actions on unnatural links. When a manual action is taken, Google will notify the site owners through Search Console. If you receive such a warning for unnatural links to your site and you use links in widgets to promote your site, we recommend resolving these issues and requesting reconsideration.

You can resolve issues with unnatural links by making sure they don't pass PageRank. To do this, add a rel="nofollow" attribute on the widget links or remove the links entirely. After fixing or removing widget links and any other unnatural links to your site, let Google know about your change by submitting a reconsideration request in Search Console. Once the request has been reviewed, you'll get a notification about whether the reconsideration request was successful or not.
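
For example, a hypothetical widget embed might mark its attribution link so that it doesn't pass PageRank (the provider name and URL below are made up):

    <div class="weather-widget">
      <!-- ...widget markup... -->
      <!-- Attribution link with rel="nofollow" so it passes no PageRank -->
      <a href="https://widget-provider.example.com/" rel="nofollow">Widget by ExampleWidgets</a>
    </div>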

Also, we would like to remind webmasters who use widgets on their sites to check those widgets for any unnatural links. Add a rel="nofollow" attribute on those unnatural links or remove the links entirely from the widget.

For more information, please watch our video about widget links and refer to our Webmaster Guidelines on Link Schemes. Additionally, feel free to ask questions in our Webmaster Help Forums, where a community of webmasters can help with their experience.


Best practices for bloggers reviewing free products they receive from companies

As a form of online marketing, some companies today will send bloggers free products to review or give away in return for a mention in a blog post. Whether you’re the company supplying the product or the blogger writing the post, below are a few best practices to ensure that this content is both useful to users and compliant with Google Webmaster Guidelines.

  1. Use the nofollow tag where appropriate

    Links that pass PageRank in exchange for goods or services are against Google guidelines on link schemes. Companies sometimes urge bloggers to link back to:
    1. the company’s site
    2. the company’s social media accounts
    3. an online merchant’s page that sells the product
    4. a review service’s page featuring reviews of the product
    5. the company’s mobile app on an app store
    Bloggers should use the nofollow tag on all such links because these links didn’t come about organically (i.e., the links wouldn’t exist if the company hadn’t offered to provide a free good or service in exchange for a link). Companies, or the marketing firms they’re working with, can do their part by reminding bloggers to use nofollow on these links. (See the sketch after this list.)
  2. Disclose the relationship

    Users want to know when they’re viewing sponsored content. Also, there are laws in some countries that make disclosure of sponsorship mandatory. A disclosure can appear anywhere in the post; however, the most useful placement is at the top in case users don’t read the entire post.
  3. Create compelling, unique content

    The most successful blogs offer their visitors a compelling reason to come back. If you're a blogger you might try to become the go-to source of information in your topic area, cover a useful niche that few others are looking at, or provide exclusive content that only you can create due to your unique expertise or resources.
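
As an illustrative sketch combining the first two practices (the company, product, and URL are hypothetical), a review post might open with a disclosure and mark the provided links nofollow:

    <!-- Disclosure placed at the top, where users will see it -->
    <p><em>Disclosure: ExampleCo sent me this blender free of charge to review.</em></p>
    <p>You can find the blender on the
      <a href="https://exampleco.example.com/blender" rel="nofollow">ExampleCo product page</a>.</p>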

For more information, please drop by our Google Webmaster Central Help Forum.

Detect and get rid of unwanted sneaky mobile redirects

In many cases, it is OK to show slightly different content on different devices. For example, optimizing for the smaller space of a smartphone screen can mean that some content, like images, will have to be modified. Or you might want to store your website’s menu in a navigation drawer to make mobile browsing easier and more effective. When implemented properly, these user-centric modifications can be understood very well by Google.

The situation is similar when it comes to mobile-only redirects. Redirecting mobile users to improve their mobile experience (like redirecting mobile users from example.com/url1 to m.example.com/url1) is often beneficial to them. But sneakily redirecting mobile users to different content is bad for user experience and is against Google’s webmaster guidelines.


A frustrating experience: The same URL shows up in search results pages on desktop and on mobile. When a user clicks on this result on their desktop computer, the URL opens normally. However, when clicking on the same result on a smartphone, a redirect happens and an unrelated URL loads.

Who implements these mobile-only sneaky redirects?

There are cases where webmasters knowingly decide to put into place redirection rules for their mobile users. This is typically a webmaster guidelines violation, and we do take manual action against it when it harms Google users’ experience (see last section of this article).   

But we’ve also observed situations where mobile-only sneaky redirects happen without site owners being aware of it:

  • Advertising schemes that redirect mobile users specifically
    A script/element installed to display ads and monetize content might be redirecting mobile users to a completely different site without the webmaster being aware of it.
  • Mobile redirect as a result of the site being a target of hacking
    In other cases, if your website has been hacked, a potential result can be redirects to spammy domains for mobile users only.

How do I detect if my site is doing sneaky mobile redirects?

  1. Check if you are redirected when you navigate to your site on your smartphone
    We recommend checking the mobile user experience of your site by visiting your pages from Google search results with a smartphone. When debugging, mobile emulation in desktop browsers is handy, mostly because you can test for many different devices. You can, for example, do it straight from your browser in Chrome, Firefox, or Safari (for the latter, make sure you have enabled the “Show Develop menu in menu bar” feature).
  2. Listen to your users
    Your users could see your site in a different way than you do. It’s always important to pay attention to user complaints, so you hear about any issues related to mobile UX.
  3. Monitor your users in your site’s analytics data
    Unusual mobile user activity can be detected by looking at your website's analytics data. For example, the average time spent on your site by mobile users is a good signal to watch: if all of a sudden your mobile users (and only they) start spending much less time on your site than they used to, there might be an issue related to mobile redirections.

    To be aware of wide changes in mobile user activity as soon as they happen, you can set up Google Analytics alerts. For example, set an alert to be warned of a sharp drop in the average time spent on your site by mobile users, or a drop in the number of mobile users (always take into account that big changes in these metrics are not a clear, direct signal that your site is doing sneaky mobile redirects).

I’ve detected sneaky redirects for my mobile users, and I did not set them up: what do I do?

  1. Make sure that your site is not hacked.
    Check the Security Issues tool in Search Console; if we have detected a hack, you'll find information there.
    Review our additional resources on typical symptoms of hacked sites, and our case studies on hacked sites.
  2. Audit third-party scripts/elements on your site
    If your site is not hacked, we recommend taking the time to investigate whether third-party scripts/elements are causing the redirects (a hypothetical example of an injected redirect follows these steps). You can follow these steps:
    A. Remove the third-party scripts/elements you do not control from the redirecting page(s), one by one.
    B. Check your site on a mobile device or through emulation between each script/element removal, and see when the redirect stops.
    C. If you think a particular script/element is responsible for the sneaky redirect, consider removing it from your site, and debugging the issue with the script/element provider.
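
For illustration, an injected mobile-only redirect often takes a form like this hypothetical snippet (the domain and detection logic are made up); finding something like this in a third-party script is a strong hint that it's the culprit:

    <!-- Hypothetical injected snippet: redirects only smartphone visitors -->
    <script>
      if (/Android|iPhone|iPad/i.test(navigator.userAgent)) {
        window.location.href = "https://spammy-domain.example/landing";
      }
    </script>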

Last Thoughts on Sneaky Mobile Redirects

It's a violation of the Google Webmaster Guidelines to redirect a user to a page with the intent of displaying content other than what was made available to the search engine crawler (more information on sneaky redirects). To ensure quality search results for our users, the Google Search Quality team can take action on such sites, including removal of URLs from our index.  When we take manual action, we send a message to the site owner via Search Console. Therefore, make sure you’ve set up a Search Console account.

Be sure to choose advertisers who are transparent about how they handle user traffic, to avoid unknowingly redirecting your own users. If you are interested in trust-building in the online advertising space, you may want to check out industry-wide best practices when participating in ad networks. For example, the Trustworthy Accountability Group’s (Interactive Advertising Bureau) Inventory Quality Guidelines are a good place to start. There are many ways to monetize your content with mobile solutions that provide a high quality user experience; be sure to use them.

If you have questions or comments about mobile-only redirects, join us in our Google Webmaster Support forum.


An update on how we tackle hacked spam

Recently we started rolling out a series of algorithmic changes that aim to tackle hacked spam in our search results. A huge number of legitimate sites are hacked by spammers and used for abusive behavior such as distributing malware, promoting traffic to low-quality sites, porn, and marketing counterfeit goods or illegal pharmaceutical drugs.

Website owners that don’t implement standard security best practices can leave their websites vulnerable to being easily hacked. This can include government sites, universities, small businesses, company websites, restaurants, hobby organizations, conferences, and more. Spammers and cyber-criminals purposely seek out those sites and inject pages with malicious content in an attempt to gain rank and traffic in search engines.

We are aggressively targeting hacked spam in order to protect users and webmasters.

The algorithmic changes will eventually impact roughly 5% of queries, depending on the language. As we roll out the new algorithms, users might notice that for certain queries only the most relevant results are shown, reducing the number of results displayed.

This is due to the large amount of hacked spam being removed, and should improve in the near future. We are continuing to tune our systems to weed out the bad content while retaining the organic, legitimate results. If you have any questions about these changes, or want to give us feedback on these algorithms, feel free to drop by our Webmaster Help Forums.

First Click Free update

Around ten years ago when we introduced a policy called “First Click Free,” it was hard to imagine that the always-on, multi-screen, multiple device world we now live in would change content consumption so much and so fast. The spirit of the First Click Free effort was - and still is - to help users get access to high quality news with a minimum of effort, while also ensuring that publishers with a paid subscription model get discovered in Google Search and via Google News.

In 2009, we updated the FCF policy to allow a limit of five articles per day, in order to protect publishers who felt some users were abusing the spirit of this policy. Recently we have heard from publishers about the need to revisit these policies to reflect the mobile, multiple-device world. Today we are announcing a change to the FCF limit, to three articles a day. This change applies to both Google Search and Google News.

Google wants to play its part in connecting users to quality news and in connecting publishers to users. We believe the FCF is important in helping achieve that goal, and we will periodically review and update these policies as needed so they continue to benefit users and publishers alike. We are listening and always welcome feedback.

Questions and answers about First Click Free

Q: Do the rest of the old guidelines still apply?
A: Yes, please check the guidelines for Google News as well as the guidelines for Web Search and the associated blog post for more information.

Q: Can I apply First Click Free to only a section of my site / only for Google News (or only for Web Search)?
A: Sure! Just make sure that both Googlebot and users from the appropriate search results can view the content as required. Keep in mind that showing Googlebot the full content of a page while showing users a registration page would be considered cloaking.

Q: Do I have to sign up to use First Click Free?
A: Please let us know about your decision to use First Click Free if you are using it for Google News. There's no need to inform us of the First Click Free status for Google Web Search.

Q: What is the preferred way to count a user's accesses?
A: Since there are many different site architectures, we believe it's best to leave this up to the publisher to decide.
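
As one purely illustrative approach (many publishers handle this server-side instead; the cookie name and the daily limit below are assumptions, not a recommendation), a page could keep a per-day counter in a cookie:

    <script>
      // Hypothetical sketch: count the free articles this visitor has read today.
      function freeArticlesReadToday() {
        var today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD
        var match = document.cookie.match(/(?:^|; )fcf=(\d{4}-\d{2}-\d{2}):(\d+)/);
        var count = (match && match[1] === today) ? parseInt(match[2], 10) : 0;
        count += 1;
        document.cookie = "fcf=" + today + ":" + count + "; path=/; max-age=86400";
        return count; // e.g., show the registration page once this exceeds 3
      }
    </script>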

(Please see our related blog post for more information on First Click Free for Google News.)


Helping hacked sites with reconsideration requests

Thus far in 2015 we have seen a 180% increase in the number of sites getting hacked and a 300% increase in hacked site reconsideration requests. While we are working hard to help webmasters prevent hacks in the first place through efforts such as blog posts and #NoHacked campaigns, we recognize that our reconsideration process is an important part of making recovery from a hack faster and easier. Here's what we've been focusing on:

1) Improved communication
2) Better tools
3) Continuous feedback loop

1. Improving communications with webmasters of hacked sites

Last year we launched the "Note from your reviewer" feature in our reconsideration process. This feature enables us to give specific examples and advice tailored to each case in response to a reconsideration request. Thus far in 2015 we have sent a customized note to over 70% of webmasters whose hacked reconsideration request was rejected, with specific guidance on where and how to find the remaining hacked content. The results have been encouraging, as we've seen a 29% decrease in the average amount of time from when a site receives a hacked manual action to the time when the webmaster cleans up and the manual action is removed.


Example "note from your reviewer" with detailed guidance and a custom example of hacked text and a hacked page

We have also completed our second #NoHacked campaign, with more detailed help on preventing and recovering from hacks. In the campaign, we focused on ways to improve the security on your site as well as ways to fix your site if it was compromised. You can catch up by reading the first post.

2. Better tools including auto-removal of some hacked manual actions

Last year we launched the "Fetch and Render" feature in the Fetch as Google tool, which allows you to see your website exactly as Googlebot sees it. This functionality is useful in recovering from a hack, since many hackers inject cloaked content that's not visible to the normal user but obvious to search engine crawlers like Googlebot.

This year we also launched the Hacked Sites Troubleshooter in 23 languages which guides webmasters through some basic steps to recover from a hack. Let us know if you have found the troubleshooter useful as we're continuing to expand its features and impact.

Finally, we're beta testing the automated removal of some hacked manual actions. In Search Console if Google sees a "Hacked site" manual action under "Partial matches", and our systems detect that the hacked content is no longer present, in some cases we will automatically remove that manual action. We still recommend that you submit a reconsideration request if you see any manual actions, but don't be surprised if a "Hacked site" manual action disappears and saves you the trouble!


Example of a Hacked site manual action on a Partial match: if our systems detect that the hacked content is no longer present, in some cases we will automatically remove the manual action

3. Soliciting your feedback and taking action

Our improved communication and tools have come directly from feedback we've collected from webmasters of sites that have been hacked. For example, earlier this year we hosted webmasters who had been through the hacked reconsideration process in both Mountain View, USA and Dublin, Ireland for brainstorming sessions. We also randomly sampled webmasters who had been through a hacked reconsideration. We found that while only 15% of webmasters were dissatisfied with the process, the main challenges those webmasters faced were unclear notification that their site had been hacked and unclear guidance on how to resolve the hack. This feedback contributed directly to our more detailed blog post on hacked recovery, and to much of the content in our latest #NoHacked campaign.

Googlers in Dublin brainstorming ways to improve the hacked reconsideration process after meeting with local webmasters


We will continue to support webmasters of hacked sites through the methods detailed above, in addition to the Webmasters help for hacked sites portal and the security, malware & hacked sites section of our forum. And we'd love to hear your ideas in the comments below on how Google can better support webmasters recovering from a hacked website!

Repeated violations of Webmaster Guidelines

In order to protect the quality of our search results, we take automated and manual actions against sites that violate our Webmaster Guidelines. When a manual action is taken on your site, you can check the Manual Actions page in Search Console to see which part of your site the action was taken on and why. After fixing the site, you can send a reconsideration request to Google. Many webmasters get their manual action revoked by going through this process.

However, some sites violate the Webmaster Guidelines repeatedly after successfully going through the reconsideration process. For example, a webmaster who received a manual action notification based on an unnatural link to another site may add a nofollow to the link, submit a reconsideration request, then, after being successfully reconsidered, remove the nofollow from the link. Such repeated violations may make a successful reconsideration process more difficult to achieve. Especially when a repeated violation is made with a clear intention to spam, further action may be taken on the site.

To avoid such situations, we recommend that webmasters avoid violating our Webmaster Guidelines in the first place, let alone repeatedly. We, the Search Quality Team, will continue to protect users by removing spam from our search results.