Web Marketing Today

The Effect of the Panda Update on Your Content Strategy

In this video interview, SEO expert Arnie Kuenn explains how content marketing strategies have changed since Google’s ‘Panda’ update in February 2011 downgraded the rankings of many content-heavy sites with duplicate or poor-quality content. Kuenn describes the factors Google is believed to use to detect poor content and what to do to produce better-quality content.

Though no one knows Google’s exact algorithm for detecting poor content, it is believed to include several factors:

1. Readability, which can be estimated with the Flesch Reading Ease score.

2. Bounce Rate is determined from how long, on average, a visitor stays on the page before hitting the back button. A short time indicates the content wasn’t relevant to them, and Google presumably knows the average bounce rate for your industry. Your content needs to engage the reader: answer questions, provide humor, refer to other pages on your site, etc.

3. Ad Coverage. The ratio of content to ads is another factor in detecting weak content. Many downgraded sites used content ‘scraping,’ in which a ‘bot’ searches the web for content related to keywords and copies it to their site, adding no net value to the Internet but creating lots of clutter.

4. Length. Though the recommendation used to be a 250-word minimum, Kuenn now recommends a minimum of 600 words for articles and blog posts.

5. ‘Likes’ or links. The number of ‘likes,’ links, or social media mentions also helps Google judge the relevance and quality of a webpage.
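The Flesch Reading Ease score mentioned in point 1 has a published formula: 206.835 minus 1.015 times the average sentence length (words per sentence), minus 84.6 times the average syllables per word. A minimal sketch of how you might compute it on your own copy follows; note the syllable counter here is a rough vowel-group heuristic for illustration, not the official counting method:

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels,
    dropping one for a likely-silent trailing 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(1, count)

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words).
    Higher scores mean easier reading; 60-70 is roughly plain English."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))
```

Running this over a draft gives a quick sanity check: short, common words in short sentences score high, while dense, polysyllabic copy scores much lower.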

Ultimately, your content should add value for your customers, not just boost your rankings. Simply swapping place names across pages of a real estate site won’t work; Google can easily spot such poor, lazy content.


Dr. Ralph F. Wilson

