Google algorithm change could hit legit e-commerce sites

11 Mar 2011

Following a US algorithm update targeting content farms, international e-commerce vendors are being urged to update the content on their sites, because Google’s algorithm change, code-named ‘Panda’, could affect their page rankings when it rolls out beyond the US.

‘Panda’ is aimed at tackling content farms but could also destroy the rankings of sites that Google users are sick and tired of seeing in search engine results pages, says search marketing firm Greenlight.

Panda is currently alive and kicking only in the US but, going by the trend of previous Google algorithm rollouts, it could hit UK sites at any time within the next three months and swiftly move beyond.

So that businesses are not slammed with little or no warning, leading search marketing specialist and technology firm Greenlight is urging them to take the necessary steps now to ensure their sites’ rankings, and thus visibility, are unaffected when Panda strikes.

What should businesses do to prepare?

To avoid any negative impact, the content on websites should be well written. Businesses should aim to attract as many clicks as possible when ranking in Google, by optimising the message put across to users in the page title, meta description and URL.

And once users land on the site, they should be kept happy through a rich experience, with as much supporting multimedia as possible and clear options for where to go elsewhere on the site if the first landing page does not “do it” for them.

“Regardless of what Google is doing, these are all the basic requirements for almost any online business, which get at the heart of what Google algorithm updates, and indeed SEO (search engine optimisation), are all about,” said Adam Bunn, director of SEO at Greenlight.

According to Greenlight, the most likely explanation is that Panda is a combination of more emphasis on user click data and a revised document level classifier.

User click data concerns the behaviour of real users during and immediately after their engagement with the SERPs (search engine results pages). Google can easily track click-through rates (CTRs) on natural search results.

It can also track the length of time a user spends on a site, either by picking up users who immediately hit the back button and return to the SERPs, or by collating data from the Google Toolbar or any third-party toolbar that contains a PageRank meter. Taken together, these sources in all probability provide enough data to draw conclusions about user behaviour.
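To make the dwell-time idea concrete, here is a minimal illustrative sketch of how such a measurement could work in principle: it treats the gap between a click-out from the SERP and the user’s return to it as the time spent on the clicked site. Every name, event label and value here is a hypothetical assumption for illustration; it does not reflect Google’s actual systems.

```python
from datetime import datetime

# Hypothetical sketch only: estimates "dwell time" as the gap between a
# user's click-out from the SERP and their return to it. Event names and
# data are illustrative assumptions, not Google's implementation.

def estimate_dwell_times(events):
    """events: chronological list of (timestamp, action, url) tuples,
    where action is 'serp_click' or 'serp_return' (url is None on return)."""
    dwell_times = {}
    last_click = None
    for timestamp, action, url in events:
        if action == 'serp_click':
            last_click = (timestamp, url)
        elif action == 'serp_return' and last_click is not None:
            click_time, clicked_url = last_click
            dwell_times[clicked_url] = timestamp - click_time
            last_click = None
    return dwell_times

events = [
    (datetime(2011, 3, 11, 10, 0, 0), 'serp_click', 'http://example.com/page-a'),
    (datetime(2011, 3, 11, 10, 0, 4), 'serp_return', None),   # back after 4 seconds
    (datetime(2011, 3, 11, 10, 0, 10), 'serp_click', 'http://example.com/page-b'),
    (datetime(2011, 3, 11, 10, 3, 30), 'serp_return', None),  # back after 3m 20s
]

for url, dwell in estimate_dwell_times(events).items():
    print(url, dwell.total_seconds(), 'seconds')
```

A real system would of course work from aggregated logs across many users rather than a single session, but the core measurement is the same.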

Using it, Google might conclude that pages are more likely to contain low-value content if a significant proportion of users display any of the following behaviours:

  • Rarely clicking on the suspect page, despite the page ranking in a position that would ordinarily generate a significant number of clicks
  • Clicking on the suspect page, then returning to the SERPs and clicking a different result instead
  • Clicking on the suspect page, then returning to the SERPs and revising their query (using a similar but different search term)
  • Clicking on the suspect page, then immediately or quickly leaving the site entirely

What might constitute “quickly” in this context? According to Greenlight, Google probably compares engagement time against that of other pages of similar type, length and topic.
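The sketch below shows one hypothetical way the four behaviours listed above could be combined into a simple low-value-content score: a CTR well below what the ranking position would normally earn, pogo-sticking back to a different result, re-querying, and a short dwell time relative to peer pages. The signal names, baseline CTRs, weights and thresholds are all invented for illustration and are not Google’s actual code.

```python
# Hypothetical sketch: combine the four listed click behaviours into a
# crude low-value score. All figures below are illustrative assumptions.

EXPECTED_CTR_BY_POSITION = {1: 0.35, 2: 0.18, 3: 0.11, 4: 0.08, 5: 0.06}

def low_value_score(position, ctr, pogo_stick_rate, requery_rate,
                    median_dwell_seconds, peer_median_dwell_seconds):
    score = 0.0
    # Signal 1: far fewer clicks than the ranking position would predict.
    expected = EXPECTED_CTR_BY_POSITION.get(position, 0.03)
    if ctr < 0.5 * expected:
        score += 1.0
    # Signals 2 and 3: users bounce back and pick another result, or re-query.
    score += pogo_stick_rate + requery_rate
    # Signal 4: "quickly" judged relative to peer pages of similar type/length.
    if median_dwell_seconds < 0.5 * peer_median_dwell_seconds:
        score += 1.0
    return score

# A page ranking 2nd with a 4pc CTR, heavy pogo-sticking and short visits:
print(low_value_score(position=2, ctr=0.04, pogo_stick_rate=0.4,
                      requery_rate=0.2, median_dwell_seconds=12,
                      peer_median_dwell_seconds=60))  # -> 2.6, likely flagged
```

Judging dwell time against peer pages, rather than a fixed cut-off, matches Greenlight’s point that a “quick” exit from a long article means something different from a quick exit from a short product page.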

“We know Google has strongly considered using user click data in this way. It filed, and was granted, a patent called ‘Method and apparatus for classifying documents based on user inputs’ describing just this. It is likely Google only uses this data heavily in combination with other signals, as user click data is highly susceptible to manipulation as a quality signal.

“Hence it has historically been such a minor part of search engine algorithms,” said Bunn.

John Kennedy is a journalist who served as editor of Silicon Republic for 17 years.

editorial@siliconrepublic.com