The Google Panda Algorithm
Here is an article I snagged off Search Day that I think you may find interesting.
March 4, 2011
Why Google's Panda Algorithm Update Dropped Sites
Google's Matt Cutts and Amit Singhal have revealed more about what is officially known as the Google Panda update, the company's latest algorithm change. There has been much speculation about why some sites were dropped and others were promoted.
Make Google Trust Your Site
One key element is whether Google trusts your site. How does Google determine this?
It seems Google is using outside human raters, at least in part, as a form of quality assurance. Singhal said they asked raters questions like "Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?"
Cutts said other questions Google asked to establish trust include "Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?"
Some SEOs are interpreting this to mean sites overloaded with AdSense and other advertisements are now in Google's crosshairs.
Google has also previously mentioned that its Chrome site blocker extension could become a ranking signal. In rolling out the Panda update, Google said the data on spammy sites reported by Chrome users wasn't used as a signal, but it was used for comparative purposes: Google reported an 84 percent overlap between sites blocked by users and sites downgraded by the update.
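Purely as an illustration of that kind of comparison (not how Google actually measured it), here is a short Python sketch that computes the overlap between a set of user-blocked domains and a set of downgraded domains; the domain names are made up.

```python
# Illustrative sketch only: compare a hypothetical list of domains blocked by
# Chrome users against a hypothetical list of domains downgraded by the update,
# and report what fraction of the blocked domains were also downgraded.

blocked_by_users = {"spamfarm.example", "thincontent.example", "adwall.example"}
downgraded_by_update = {"spamfarm.example", "adwall.example", "scraper.example"}

overlap = blocked_by_users & downgraded_by_update
overlap_pct = 100 * len(overlap) / len(blocked_by_users)

print(f"{overlap_pct:.0f}% of user-blocked domains were also downgraded")
```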
Some SEOs are speculating that Google may be looking at the ratio of above-the-fold content (word count) to advertising. Many of the sites punished by Google had more ads than useful content.
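To make that speculation concrete, here is a hypothetical sketch of such a ratio; the function name, inputs, and numbers are all invented for illustration and this is not a confirmed Google signal.

```python
# Hypothetical heuristic: words of above-the-fold content per ad unit.
# None of these numbers or thresholds come from Google; they are placeholders.

def content_to_ad_ratio(above_fold_word_count: int, above_fold_ad_count: int) -> float:
    """Return words of visible content per ad unit above the fold."""
    if above_fold_ad_count == 0:
        return float("inf")  # no ads at all
    return above_fold_word_count / above_fold_ad_count

# Example: 120 words of content and 4 ad units above the fold.
ratio = content_to_ad_ratio(120, 4)
print(f"{ratio:.1f} words of content per ad unit")
```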
"Low Quality"
Singhal said the flood of shallow content that had users complaining came about largely because of Google's Caffeine update. But Google is still having trouble defining "low quality," with Singhal saying Google hasn't solved that problem yet.
Cutts said Google "came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons."
"Our classifier that we built this time does a very good job of finding low-quality sites," Singhal added. "It was more cautious with mixed-quality sites, because caution is important."
So big brands seem safe in Google's eyes, which is potentially more bad news for the little guy on Google. Cutts also mentioned government sites ranking higher for medical searches.
Another hint if you want to figure out what not to do, according to Cutts: Look at Suite 101. Go there, look around, figure out what they're doing, and make sure you're doing the opposite.
Even though this update hasn't solved the problem of low-quality content, Cutts and Singhal both think it did what it was supposed to do. Even so, Google is accepting feedback.
Panda?
For those curious, the Panda update is named after a Google engineer. You can read the full Wired Q&A here.
Posted by Danny Goodwin on March 4, 2011 9:51 AM
Recent Comments
Thanks for throwing this up in your blog, Wes. I appreciate it.