Monday, January 29, 2007

Duplicate Content and SEO - Dropped Rankings

A duplicate website is a site that shares many, if not all, of its pages with another live website. This article looks at duplicate content, SEO, and why it can lead to dropped rankings.

Why is a duplicate website such a bad idea?

The major search engines are constantly trying to improve the quality of their results in an effort to serve users the best possible content. When duplicate content is indexed by search engine spiders, valuable crawl time and processing power are wasted.

As a result, search engines have removed sites that use duplicate content from their indexes, ultimately favouring the site that either published the content first or, I believe, the one with the longer online history. In addition, the major search engines have been left with a bad taste after dealing with so much duplicate content created by spammers over the past several years.

In short, posting a duplicate website is an offence that can quite literally get a domain blacklisted; there are few things the search engines dislike more than being gamed by spammers.

How much of my page should be unique?
Is there a standard ratio or percentage you can share?

There is no industry-standard formula but, if I had to state a percentage, I would say that a minimum of 70% of the page should be completely unique to avoid any concerns about duplication. You may be able to get away with less than 70% unique content, but that is playing with fire. Either way, the exact figure is moot, since every page you create should be built to be a genuinely useful resource; search engines are only a small part of the plan - you still need visitors to like what they see and buy your product or service!
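The 70% figure is a rule of thumb rather than anything the search engines publish, but you can get a rough feel for how much of a page overlaps with another by comparing word shingles. The Python sketch below is purely illustrative - the file names, the shingle size and the 70% threshold are my own assumptions, not an industry standard:

    # Rough sketch: estimate what fraction of a page's text is unique
    # compared to another page, using overlapping word n-grams (shingles).
    # The 0.70 threshold mirrors the rule of thumb above; it is not an
    # official search engine figure.

    def shingles(text, size=5):
        """Return the set of overlapping word n-grams in the text."""
        words = text.lower().split()
        if len(words) < size:
            return {" ".join(words)} if words else set()
        return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

    def unique_ratio(page_text, other_text, size=5):
        """Fraction of this page's shingles that do NOT appear on the other page."""
        ours = shingles(page_text, size)
        theirs = shingles(other_text, size)
        if not ours:
            return 1.0
        return len(ours - theirs) / len(ours)

    if __name__ == "__main__":
        my_page = open("my_page.txt").read()        # hypothetical input files
        their_page = open("their_page.txt").read()
        ratio = unique_ratio(my_page, their_page)
        print(f"Unique content: {ratio:.0%}")
        if ratio < 0.70:                            # the article's rule of thumb
            print("Warning: this page may look like duplicate content.")

A check like this only flags overlap between two specific pages; it says nothing about how a search engine actually scores duplication, but it is a quick way to spot pages that lean too heavily on borrowed text.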
