Google has two types of penalties: algorithmic and manual. Algorithmic penalties are automated and result from filters Google uses in its algorithm to remove low-quality pages from its search results. Recent updates to Google’s algorithm have been named Panda, Penguin and the not-yet-implemented Penguin 2.0.
Manual penalties are applied as the result of a manual review, and Google will notify you if that happens. The dread that creates is kind of like getting a letter from the IRS.
Penalties can range from moving all or most of a website’s pages from Google’s main index to the Supplemental Results, to completely removing a URL from the results (blacklisting). Many penalties simply result in a page not showing up in the first 50 or 100 search results, which is the equivalent of being invisible.
If a site (or an individual page) used to rank higher and has now dropped down or out, it might be the result of a penalty. It might also be the result of a site redesign that changes the page names (URLs). Both negative outcomes can be avoided.
Some websites violate search engine guidelines without realizing it or intending to. If a site has been penalized, it can be corrected to meet the quality guidelines that search engines publish and then submitted to Google for reconsideration; however, this can be a long and arduous process.
Practices to avoid:
Buying or Selling Links
Here’s what Google says:
“Buying or selling links that pass PageRank is in violation of Google’s webmaster guidelines and can negatively impact a site’s ranking in search results.” And: “Not all paid links violate our guidelines. Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results.” Links purchased for advertising should be designated as such. This can be done in several ways, such as:
Adding a rel=”nofollow” attribute to the <a> tag
Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
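Both techniques can be sketched briefly. The snippets below are illustrations only; the advertiser URL and the /out/ directory are placeholder names:

```html
<!-- Paid advertising link marked with rel="nofollow" so it passes no PageRank -->
<a href="https://example-advertiser.com" rel="nofollow">Sponsor Name</a>
```

```
# robots.txt — block the intermediate redirect page(s) from search engine crawlers
User-agent: *
Disallow: /out/
```

With the second approach, paid links point to pages under /out/ that redirect to the advertiser; because /out/ is disallowed in robots.txt, search engines never follow the paid links.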
Hidden Text or Hidden Links
Any text that is not visible in the browser window is considered “hidden” and can result in a penalty, perhaps even blacklisting, which is complete removal from the index.
- Hidden Text: text not visible to human visitors, including:
  - Text hidden in a layer or table cell behind a graphic
  - Text that is the same color as the background it is on
  - Text in a layer positioned off the visible browser window
  - Text in a layer (or table) with a property of display: none or visibility: hidden
- Hidden Links: links that are unreadable by human visitors and are meant for the search engine robots only, including:
  - A link made of hidden text (as described above)
  - A hyperlink made too small to see with CSS, as little as one pixel high
  - A link hidden in a small character like a hyphen or period in a sentence or paragraph
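The hidden-text and hidden-link techniques above typically look something like the CSS below. This is a sketch of patterns to avoid (or audit a penalized site for); the class names are hypothetical:

```html
<!-- Examples of hidden-text/hidden-link techniques that can trigger a penalty -->
<style>
  .same-color { color: #ffffff; background-color: #ffffff; } /* text matches background */
  .offscreen  { position: absolute; left: -9999px; }         /* pushed off the visible window */
  .invisible  { display: none; }                             /* hidden via display property */
  .tiny-link  { font-size: 1px; }                            /* link too small to see */
</style>
<p class="same-color">Keyword-stuffed text invisible to visitors but readable by robots.</p>
<a class="tiny-link" href="https://example.com">hidden link</a>
```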
Other Link Problems
- Building too many links too fast
- Links from foreign language sites that have no relevance to your business
- Links from bad neighborhood websites (adult sites, gambling sites or other similar industries)
- Link exchange networks
For most small businesses there are only three legitimate reasons to have multiple domains:
- Purchasing the .net or .biz (or other TLD) version of their existing website is a great idea as long as there is a 301 permanent redirect from that domain to their original domain. Do not create a duplicate website at the new URL. This is considered a “mirrored site” and may cause both URLs to be dropped from the search engines. Caution: purchasing multiple domain names stuffed with keywords just to use them to redirect to the original site is also penalty-worthy.
- Some companies have multiple businesses that may be related but not identical. When creating an additional site for a business, the content must be significantly different from any of their other websites or it will result in a “duplicate content” penalty.
- It is okay to have a duplicate website if it is on a domain from another country or if it is in a foreign language.
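On an Apache server, the 301 permanent redirect from a secondary TLD to the original domain can be set up in an .htaccess file. This is a minimal sketch with placeholder domain names:

```apache
# .htaccess on the secondary (.net) domain — permanently redirect every URL
# to the same path on the original .com domain (example.net/example.com are placeholders)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The R=301 flag tells search engines the move is permanent, so ranking signals consolidate on the original domain instead of being split across duplicates.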
Low Quality or Low Quantity Content
In addition to copied or scraped content, the search engines will penalize a site for unintelligible, automated content that human readers can’t understand. Sites with very little content are considered “low value” sites and get filtered out of search results.
The Keyword meta tag is not an issue because the search engines disregard it. However, any word or phrase that appears on a web page multiple times is identified by the search engine as a keyword or key phrase for that page. If it is artificially repeated many times in the page title, page URL, heading tags, page text, links, metadata, and ALT attributes, it can be considered “stuffing” and will be penalized. A well-optimized site will include keywords in all of the above-mentioned elements in moderation and in natural-sounding language.
Cloaking or Sneaky Redirects
- Cloaking is the practice of presenting different content or even different URLs to the search engine spiders than what is presented to users. This includes serving up a regular HTML page to search engines while serving a page of images or Flash to end users.
- Doorway Pages are multiple pages that are each optimized for a different keyword or key phrase and then funnel users to a single destination page. These can be within one domain or deployed across multiple domains, and in most cases they redirect using a meta refresh tag.
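A doorway page’s redirect typically looks like the tag below, placed in the page head. This is shown only so the pattern can be recognized and avoided; the destination URL is a placeholder:

```html
<!-- Instant (0-second) meta refresh to a single destination page —
     the classic doorway-page pattern that search engines penalize -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/landing-page">
```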
Obviously pages that are used for phishing or installing viruses, trojans or other badware will be heavily penalized.
For more information, see:
Search Content Quality Guidelines from Yahoo! Search