Does the Google Panda Update Spell the End for Duplicate Content?


Remember the Google algorithm update, now nicknamed ‘Panda’, which was rolled out in the USA in March and to the rest of the world last month? The update has generated much interest among internet marketers and some consternation among the owners of affected websites, and results are now beginning to come in that show just how much websites have been affected by the changes.

Changes Caused by the Update

The USA rollout had one immediate effect: it knocked back articles from Suite101 and Hubpages. These websites have traditionally been used by internet marketers to build backlinks to their own sites and help them rise in the search engine rankings. The problem was that although these websites were moderated and edited, the articles on them stretched back to each site’s earliest days. Something had to change, and both websites have since implemented stricter editing and begun to delete older and perhaps less useful articles. The change did not affect all such websites, however; Ehow, which also contained a lot of content, escaped largely untouched.

The version rolled out to the rest of the world showed that Google had listened. Ehow lost some of its traffic, but so did UK tech websites and price comparison websites. Ebay improved its ranking, as did UK newspaper websites.

Improving Search Results

The stated aim of the update was to improve Google search and eliminate duplicate content. For years, internet marketers have suggested that the best way to get quick results is to submit the same or very similar articles to article websites such as Ezine and Technorati in order to gain backlinks. The articles can be submitted for free and may include links. Members of these websites can copy the articles for their own use and thereby spread the links across the internet. The problem is that many of these articles are simply copied wholesale, spreading duplicate copy over the web.

Using these articles allowed website owners to populate their sites quickly with copy for the price of a link, and it seemed a win-win situation. The problem arises when a user searches for information and the same article is returned from several different websites.

Say No to Duplicate Content

Duplicate content has never been a good idea. There is so much rubbish on the internet that it can be hard to find the information you are actually looking for. Google is going to continue updating its algorithms and trying to eliminate websites that it perceives as being nothing more than spam.
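Google has never published the details of how it identifies duplicate copy, but near-duplicate detection is commonly illustrated with word shingles and Jaccard similarity. The sketch below is purely illustrative of that general technique, not Google’s actual method:

```python
def shingles(text, k=3):
    """Split text into overlapping k-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original  = "Regular relevant content added to a website will help it rise"
copied    = "Regular relevant content added to a website will help it rise"
rewritten = "Fresh useful articles posted often can lift a site in rankings"

print(jaccard(shingles(original), shingles(copied)))     # identical copy -> 1.0
print(jaccard(shingles(original), shingles(rewritten)))  # genuine rewrite -> 0.0
```

A verbatim or lightly spun copy shares most of its shingles with the source, so it scores near 1.0, while an article written from scratch on the same topic scores near zero, which is why copying articles between sites is so easy for a search engine to spot.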

The message is clear: by all means add content to your website. Regular, relevant content added to a website will help it rise in the search rankings, but make sure it is original and useful information so that your website is not hit by further updates.

 


Google Algorithm Change Rocks Content Websites


Last week, the biggest search engine on the internet announced a change to the algorithm it uses to rank articles when responding to search queries. The change was meant to help weed out duplicate and low-quality articles on content websites, sometimes referred to as ‘content farms’. The idea is that more relevant content will be brought to the top of a search listing, which should help to reward more useful articles and blogs.

This caused speculation as to which of the content websites were being targeted. Suite101 and Hubpages, both revenue-share models, are two websites that have seen some of their articles fall from grace, although whether this is due to the algorithm change, who can tell? On the other hand, Ehow articles seem to have benefitted from the changes.

Demand Media owns Ehow and is considered by some commentators to be a content farm, although it pays for articles upfront. The company has responded that it is not being singled out for this treatment and that, of the articles it has, some are benefitting while others are suffering. The company suggests that it will pay more heed to what consumers want in the future.

In the early days of internet marketing, web marketers threw up websites with a little content and a lot of adverts. These websites were of little informational value; they were unashamedly there to encourage visitors to click on the ads and no more. Now consumers demand more of their websites: they use the internet regularly for information and expect the results returned by their searches to be of good quality and useful. The search engine could argue that it is responding to consumer demand and trying to crack down on low quality websites.

I think that this will even out in time. The results will show an initial dip and the companies affected will have to take steps to protect their page rankings and ensure that the content that they produce is of a good enough quality to count as useful information. However, Google has a symbiotic relationship with these websites. They use its adverts as part of their revenue share model and to destroy them completely would not be in the search engine’s best interests. It is therefore probably more likely to be a warning shot to the companies to encourage the production of well-written information rather than lazy, badly constructed content. This can only improve the internet for everyone concerned.

For more reading, check out this article.

 

 

  • Who got knocked in Google’s algorithm update? (tech.fortune.cnn.com)
  • Google Search Algorithm Update Against Content Farms and Low Quality Website (shoutmeloud.com)