Google Panda is evolving. The search engine giant’s Panda will soon graduate from an occasional manual update to a continuous part of Google’s algorithm, as reported by Matt Cutts, Google’s Distinguished Engineer. What impact will this have on websites, and how should one guard against the wrath of Panda?
According to Danny Goodwin of Search Engine Watch, Matt Cutts revealed the Panda news at the SMX West 2013 Conference during the panel, The Search Police:
“Rather than having some huge change that happens on a given day, you’re more likely in the future to see Panda deployed gradually as we’re rebuilding the index, so you’re less likely to see these large scale sorts of changes.”
How will the constant Panda Algorithm impact websites?
Will this be good news or bad news for websites hit by Panda’s fury? Goodwin states that once Panda is integrated into Google’s real-time algorithm, those sites that are hit due to “low-quality” or thin content may find:
“it may be harder to detect in analytics due to this new integration with the main search algorithm, but it could mean a faster recovery.”
What does Google Panda do?
Google Panda is a major Google algorithm update that was rolled out in February of 2011. The objective of this algorithm was to keep “poor quality content” from working its way into Google’s top search results by demoting scraper sites that copy other websites’ content and content farms that publish shallow or low-quality content.
What action plans can you take to avoid the Panda’s wrath?
If you have been putting off spring cleaning on your website or blog, now is the time to take it off the back burner. A comprehensive action plan to take includes:
- Eliminate Duplicate Content
Duplicate content can be both intentional and unintentional. More often than not, it is the unintentional duplicate content that goes unnoticed by the website owner. Google Webmaster Tools is an invaluable resource for determining whether there are duplicate content issues on your website:
- Dashboard > Optimization > HTML Improvements
Once detected, there are several techniques to help eliminate duplicate content, including:
- Preferred domain – Set your preferred version of your domain name – www or non www in Google Webmaster Tools.
- 301 Redirects – Ethically redirecting one page to another.
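On an Apache server, for instance, a 301 redirect can be added to the site’s .htaccess file. A minimal sketch, assuming Apache with mod_alias enabled and placeholder paths:

```apache
# Hypothetical example: permanently (301) redirect a retired page to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page/
```

This passes visitors and search engines from the old URL to the new one, preserving most of the old page’s link equity.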
- Robots.txt – Leaving the duplicate content, but blocking search engine bots from indexing the pages by adding disallow to the robots.txt. For example:
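A minimal robots.txt sketch, with a placeholder directory standing in for the duplicate section:

```text
# Hypothetical robots.txt: block all crawlers from a duplicate printer-friendly section
User-agent: *
Disallow: /print/
```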
- Meta Robots – You can block search engine spiders from indexing duplicate pages with meta robots by adding the following to the head of the page:
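A minimal sketch of the meta robots tag; `noindex, follow` tells spiders not to index the page while still following its links:

```html
<!-- Prevents search engines from indexing this duplicate page -->
<meta name="robots" content="noindex, follow">
```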
- The Rel=Canonical Tag
This tag goes in the head area of a page, similar to the Meta Robots tag, and allows webmasters to specify the canonical URL of a web page. For example:
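A minimal sketch, with a placeholder domain and path:

```html
<!-- Points search engines to the preferred (canonical) version of this page -->
<link rel="canonical" href="http://www.example.com/original-page/">
```

Every duplicate or near-duplicate version of the page carries this same tag, so ranking signals are consolidated on the canonical URL.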
WordPress is an extremely popular Content Management System (CMS) that is used for websites and blogs. Unfortunately, it can also cause duplicate content in the search engines. Using a comprehensive tool like WordPress SEO by Yoast will help manage the duplicate content issues.
- Remove or Improve Low Quality Content
First, one needs to determine whether a page contains low-quality or shallow content. Google has a comprehensive list of 23 questions to ask about content in What counts as a high-quality site? Instead of going through each and every web page on a blog or website, you might evaluate the site’s Google Analytics statistics and review:
- Pages with a high bounce rate
- Pages with low or no visits
If you determine that a page has outdated content or is no longer relevant, you may want to implement a 301 redirect to another relevant page. Note: if you simply delete the page, your visitors will receive the dreaded “404 Page Not Found” error.
If the page does offer value, then you might improve it by adding information that makes the content unique and valuable. Google’s article specifically states:
One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.
- Produce Quality Content
Moving forward, it is clear that Google is looking for high quality. From the Webmaster Central Blog:
Our site quality algorithms are aimed at helping people find “high-quality” sites by reducing the rankings of low-quality content.
Several questions to ask yourself when creating content include:
- Does this article provide a complete or comprehensive description of the topic?
- Does this page or article provide relevant value?
- Does this article contain insightful analysis or interesting information that is beyond obvious?
- For the full list, please review Google’s More guidance on building high-quality sites.
Google Panda will be integrated into the search engine giant’s real-time algorithm, but there are ethical techniques to ensure that your site is not hit. Your best defense against the wrath of the Panda is to follow best practices: clean up duplicate and low-quality content, hold new content to high quality standards moving forward, and follow Google’s guidelines to the letter.