Wednesday, February 25th, 2009
Duplicate Content has recently become a hot topic within the SEO community, with the three search engine “giants” (MSN, Yahoo! and Google) proactively filtering out similar search results in a quest to present the user with more relevant and distinct web pages.
Would you know if your site rank is suffering?
Duplicate Content refers to substantial blocks of similar web page content, either site-wide or across domains. It is a commonly used technique in search engine spam – you probably see it regularly: redundant websites full of keywords and links that deliberately try to trick search engines into returning low-quality results.
It’s good that search engines like Google are trying to give us a better user experience with original, fresh content, but sometimes hard-working webmasters can unknowingly spam the search engines.
A common example is a dynamic e-commerce website that queries a database. The site may have multiple URLs which are, in effect, just different routes to the same content: a product page (which often includes a copy-and-pasted manufacturer’s description also found on every other site selling the same item), or a URL containing a unique session id passed through the query string.
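To illustrate the problem, here is a minimal sketch (the URLs and the “sessionid” parameter name are hypothetical) showing how several session-stamped URLs can all point at one and the same product page:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical duplicate URLs for the same product page; the
# "sessionid" parameter varies per visitor but the content does not.
duplicates = [
    "http://www.example.com/product/1234/?sessionid=abc123",
    "http://www.example.com/product/1234/?sessionid=xyz789",
    "http://www.example.com/product/1234/",
]

def canonicalise(url, drop_params=("sessionid",)):
    """Strip session-style parameters so equivalent URLs compare equal."""
    scheme, netloc, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in drop_params]
    return urlunsplit((scheme, netloc.lower(), path, urlencode(kept), ""))

# All three variants collapse to a single canonical URL.
print({canonicalise(u) for u in duplicates})
# {'http://www.example.com/product/1234/'}
```

From a search engine’s perspective, each of the three URLs above is a separate page with identical content – exactly the situation the new canonical hint is designed to resolve.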
On February 12, 2009, the three major search engines introduced a new microformat which lets the search engine know which URL you think is the “canonical” or “proper” version. In effect you are telling the search engines “this page is the most useful amongst those with duplicate content”. They will then consolidate link popularity into that single URL.
<link rel="canonical" href="http://www.example.com/product/1234/" />
Although a step in the right direction for some types of site, the new microformat doesn’t completely address the problem of duplicate content. And whilst duplicate content isn’t grounds for search engines to take action unless it appears to be manipulative and intended to deceive them, there are a few steps webmasters can take to ensure users are seeing their content.
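One such step, sketched below (a common approach rather than an official recipe; the parameter names are assumptions), is to issue a 301 permanent redirect whenever a request arrives on a non-canonical variant of a URL, so both visitors and search engines end up on one address:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumed query-string parameters that do not change the page content.
TRACKING_PARAMS = {"sessionid", "ref"}

def redirect_if_needed(url):
    """Return (301, canonical_url) for duplicate variants, or (200, url)."""
    scheme, netloc, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    canonical = urlunsplit((scheme, netloc.lower(), path, urlencode(kept), ""))
    if canonical != url:
        return 301, canonical   # permanent redirect to the canonical URL
    return 200, url             # already canonical: serve the page as-is

print(redirect_if_needed("http://www.example.com/product/1234/?sessionid=abc"))
# (301, 'http://www.example.com/product/1234/')
```

Unlike the canonical hint, which search engines treat as a suggestion, a 301 redirect leaves no ambiguity about which URL is the real one.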