Facts of the week: How to work around Google spam filters
Google tries to keep its search results as clean as possible. For that reason, it uses a variety of spam filters in its ranking algorithm that try to remove low-quality web sites.
If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In this article, we take a look at three of the most common Google spam filters and how you can get around them.
Duplicate content, incorrect use of robots.txt, and Google bowling
The duplicate content filter is applied to web pages whose content has already been indexed elsewhere. This can happen if you have multiple versions of the same page on your web site or if you reuse content from other sites on your own pages.
If the content is already available on another page, it will be difficult to get high rankings for your own page. If the same content appears on multiple pages, Google will pick only one of them for the search results. Having the same page more than once on your web site can also look like a spamming attempt.
Incorrect use of the robots.txt file is not exactly a Google spam filter, but it has much the same effect. While a robots.txt file can help you direct search engine spiders to the right pages, it can also lock search engines out of your web site if you use it incorrectly.
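As a quick illustration (the directory name is made up for this example), a minimal robots.txt that keeps all crawlers away from a print-friendly duplicate of your pages while leaving the rest of the site open could look like this:

    User-agent: *
    Disallow: /print/

The first line addresses every crawler, and the Disallow line tells them not to fetch anything under the /print/ directory. Everything else on the site remains crawlable.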
Google bowling means that competitors use spammy SEO techniques to push your web site out of the search results. They set up doorway pages with JavaScript redirects, spam blogs and referrer logs, and so on.
Although it is your competitor who set up these spam pages that redirect to your web site, Google might conclude that you are responsible for the spamming attempt and downgrade your web site. Google claims that external factors cannot influence your rankings on Google; however, some black-hat SEOs offer services that can harm the rankings of your competitors.
How to get around these filters
If you have multiple versions of the same page on your web site (print version, online version, WAP version, etc.), make sure that search engines index only one of them.
You can exclude specific web pages from indexing by using a robots.txt file or the Meta Robots tag. IBP's web site optimization editor allows you to quickly add Meta Robots tags to your web pages.
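For example, the duplicate versions of a page can carry the following standard Meta Robots tag in their <head> section:

    <meta name="robots" content="noindex, follow">

The noindex value keeps the page out of the index, while follow still lets the spiders pass through its links. Leave the tag off the one version that you want to appear in the search results.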
Double-check the contents of your robots.txt file to make sure that you don't exclude search engines by mistake.
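A common mistake is a single stray slash. The rule below, for instance, locks every search engine out of the entire web site:

    User-agent: *
    Disallow: /

If you want the whole site to be crawled, leave the Disallow value empty or remove the rule altogether.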
If your web site has been hit by Google bowling, the only thing you can do is file a reinclusion request with Google.
The best way to get high rankings on Google and other major search engines is to use white-hat SEO methods: optimize the content of your web pages and build high-quality inbound links.
I hope you learned something today!
Enter promo code first1000 to save 70% on your first month's subscription.
If you lack targeted traffic and sales, grab this deal before it's gone!
First 1000 only!
Jason Lamure