
What is “webspam,” as Google calls it, also known as search spam? It refers to pages that try to gain better rankings through tactics like:

  • Keyword stuffing
  • Link schemes
  • Cloaking, “sneaky” redirects or “doorway” pages
  • Purposeful duplicate content

Keyword stuffing

"Keyword stuffing" refers to the practice of loading a webpage with keywords in an attempt to manipulate a site's ranking in Google's search results. Filling pages with keywords results in a negative user experience, and can harm your site's ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.

To fix this problem, review your site for misused keywords. Typically, these will be lists or paragraphs of keywords, often randomly repeated. Check carefully, because keywords can often be in the form of hidden text, or they can be hidden in title tags or alt attributes.
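As a purely hypothetical sketch of what such misuse can look like (the page, keywords, and file names below are invented for illustration), stuffed keywords often hide in styled-off text, title tags, or alt attributes, and the fix is to replace them with descriptive, in-context wording:

    <!-- Before: hidden and repeated keywords (violates the guidelines) -->
    <title>cheap widgets cheap widgets buy cheap widgets online widgets</title>
    <img src="widget.jpg" alt="cheap widgets cheap widgets cheap widgets">
    <div style="display:none">cheap widgets best widgets buy widgets now</div>

    <!-- After: the same keyword used naturally and in context -->
    <title>Acme Widgets - Prices and Ordering</title>
    <img src="widget.jpg" alt="Photo of the Acme Model 3 widget">
    <p>Our widgets ship within two business days; current prices are listed below.</p>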

Once you've made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

Link schemes

Your site's ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links count towards your rating. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity. However, some webmasters engage in link exchange schemes and build partner pages exclusively for the sake of cross-linking, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. This is in violation of Google's Webmaster Guidelines and can negatively impact your site's ranking in search results. Examples of link schemes can include:
  • Links intended to manipulate PageRank
  • Links to web spammers or bad neighborhoods on the web
  • Excessive reciprocal links or excessive link exchanging ("Link to me and I'll link to you.")
  • Buying or selling links that pass PageRank

The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community. The more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it. Before making any single decision, you should ask yourself the question: Is this going to be beneficial for my page's visitors?

It is not only the number of links you have pointing to your site that matters, but also the quality and relevance of those links. Creating good content pays off: Links are usually editorial votes given by choice, and the buzzing blogger community can be an excellent place to generate interest.

Once you've made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

Cloaking, sneaky JavaScript redirects, and doorway pages

Cloaking

What is cloaking?
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user-agent may cause your site to be perceived as deceptive and removed from the Google index.
Some examples of cloaking include:
  • Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
  • Serving different content to search engines than to users.
If your site contains elements that aren't crawlable by search engines (such as rich media files other than Flash, JavaScript, or images), you shouldn't provide cloaked content to search engines. Rather, you should consider visitors to your site who are unable to view these elements as well. For instance:
  • Provide alt text that describes images for visitors with screen readers or images turned off in their browsers.
  • Provide the textual contents of JavaScript in a noscript tag.

Ensure that you provide the same content in both elements (for instance, provide the same text in the JavaScript as in the noscript tag). Including substantially different content in the alternate element may cause Google to take action on the site.
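As a minimal sketch of what this consistency can look like (the image, opening hours, and file names are invented for illustration), the alt text describes the image and the noscript tag carries the same message as the JavaScript:

    <!-- Descriptive alt text for visitors who can't see the image -->
    <img src="traffic-chart.png" alt="Line chart of monthly visits, January to June">

    <!-- The same content delivered by JavaScript and by its noscript fallback -->
    <script>
      document.write("Our store is open Monday through Friday, 9am to 5pm.");
    </script>
    <noscript>
      <p>Our store is open Monday through Friday, 9am to 5pm.</p>
    </noscript>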

Sneaky JavaScript redirects

When Googlebot indexes a page containing JavaScript, it will index that page but it may not follow or index any links hidden in the JavaScript itself. Use of JavaScript is an entirely legitimate web practice. However, use of JavaScript with the intent to deceive search engines is not.

For instance, placing different text in JavaScript than in a noscript tag violates our Webmaster Guidelines because it displays different content for users (who see the JavaScript-based text) than for search engines (which see the noscript-based text). Along those lines, it violates the Webmaster Guidelines to embed a link in JavaScript that redirects the user to a different page with the intent to show the user a different page than the search engine sees. When a redirect link is embedded in JavaScript, the search engine indexes the original page rather than following the link, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it displays different content to users and to Googlebot, and can take a visitor somewhere other than where they intended to go.

Note that placement of links within JavaScript is alone not deceptive. When examining JavaScript on your site to ensure your site adheres to our guidelines, consider the intent.
Keep in mind that since search engines generally can't access the contents of JavaScript, legitimate links within JavaScript will likely be inaccessible to them (as well as to visitors without JavaScript-enabled browsers). You might instead keep links outside of JavaScript or replicate them in a noscript tag.
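As a small illustration of that last suggestion (the page section and URL below are hypothetical), a link generated by JavaScript can be replicated in plain HTML so that crawlers and visitors without JavaScript can still follow it:

    <div id="nav"></div>
    <script>
      // Link inserted by JavaScript for visitors with scripting enabled...
      document.getElementById("nav").innerHTML = '<a href="/products/">Products</a>';
    </script>
    <noscript>
      <!-- ...and the same link replicated for everyone else -->
      <a href="/products/">Products</a>
    </noscript>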

Doorway pages

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination.

Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users, and are in violation of our Webmaster Guidelines.

Google's aim is to give our users the most valuable and relevant search results. Therefore, we frown on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from the Google index.

If your site has been removed from our search results, review our Webmaster Guidelines for more information. Once you've made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

Duplicate content

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin. Examples of non-malicious duplicate content could include:
  • Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
  • Store items shown or linked via multiple distinct URLs
  • Printer-only versions of web pages

If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google (this is called "canonicalization").
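One of those ways is the rel="canonical" link element. As a brief sketch (example.com is the document's example domain and the specific path is a hypothetical placeholder), each duplicate page, such as a printer-only version, points search engines at the preferred URL:

    <!-- Placed in the <head> of the duplicate (e.g. printer-only) page -->
    <link rel="canonical" href="http://www.example.com/page/">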

However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.

Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a "regular" and "printer" version of each article, and neither of these is blocked with a noindex meta tag, we'll choose one of them to list. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

There are some steps you can take to proactively address duplicate content issues, and ensure that visitors see the content you want them to.
Use 301s: If you've restructured your site, use 301 redirects ("RedirectPermanent") to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)

Be consistent: Try to keep your internal linking consistent. For example, don't link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.

Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.

Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
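As a short sketch of what the syndicating site's copy could contain (the URL below is a hypothetical placeholder), the noindex meta tag keeps that copy out of search results while an attribution link points readers back to the original:

    <!-- In the <head> of the syndicated copy: ask search engines not to index it -->
    <meta name="robots" content="noindex">

    <!-- In the body of the syndicated copy: link back to the original article -->
    <p>This article originally appeared at
      <a href="http://www.example.com/original-article/">www.example.com</a>.</p>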

Use Webmaster Tools to tell us how you prefer your site to be indexed: You can tell Google your preferred domain (for example, http://www.example.com or http://example.com).

Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. In addition, you can use the Parameter Handling tool to specify how you would like Google to treat URL parameters.

Avoid publishing stubs: Users don't like seeing "empty" pages, so avoid placeholders where possible. For example, don't publish pages for which you don't yet have real content. If you do create placeholder pages, use the noindex meta tag to block these pages from being indexed.

Understand your content management system: Make sure you're familiar with how content is displayed on your web site. Blogs, forums, and related systems often show the same content in multiple formats. For example, a blog entry may appear on the home page of a blog, in an archive page, and in a page of other entries with the same label.

Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.

Google does not recommend blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools.

Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.

However, if our review indicates that you engaged in deceptive practices and your site has been removed from our search results, review your site carefully against our Webmaster Guidelines. Once you've made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

In rare situations, our algorithm may select a URL from an external site that is hosting your content without your permission. If you believe that another site is duplicating your content in violation of copyright law, you may contact the site’s host to request removal. In addition, you can request that Google remove the infringing page from our search results by filing a request under the Digital Millennium Copyright Act.