
People are always interested to know how many more clicks they are likely to receive if they appear at position 1 of the search engine results pages for their chosen key-phrases. There are many different studies and statistics about organic click-through rates available on the internet, many of which are contradictory. This report details the findings of some of the better-known studies and how much you can actually learn from them.

Study 1

Below is a chart detailing the data from the famous AOL data leak in 2006. Although old, this data is still often quoted as gospel.

[Chart: organic click-through rate by position, from the 2006 AOL data leak]

Study 2

This study was conducted by Neil Walker, a UK-based SEO expert. Some blog posts suggest Walker’s study is based on Webmaster Tools data across 2,700 keywords. Walker himself claims that the data comes from a study of Webmaster Tools in 2010, the AOL data of 2006 and an eye-tracking study conducted in 2004.

[Chart: organic click-through rate by position, Study 2]

Study 3

Another well-known study, conducted in 2010, was by Chitika, a data analytics company in the advertising business. For their study, they looked at a sample of traffic coming into their advertising network from Google and broke it down by Google results placement.

[Chart: traffic by Google result position, Study 3]

What can we actually learn from this?

Well, it is clear that if you are at position 1 in the search engine results pages, you are very likely to receive substantially more clicks. However, there are always exceptions to this rule.

A famous example:

For a long time, if you searched for ‘mail’ on Google, Gmail would come up at position 1 and Yahoo at position 2. Still, Yahoo received in excess of 50% of the click-throughs. Studies indicated that this was because people searching for ‘mail’ were looking to log in to their Yahoo Mail account.

This example illustrates that if people are looking for something specific, they will not always click on position 1 if it doesn’t seem to offer what they are looking for. Another example is Wikipedia: they are often displayed in high positions for a wide range of phrases, but won’t always receive a high click-through rate because people aren’t always looking for general information.

In summary, at position 1 of the search engine results pages you are extremely likely to receive the most clicks, but exactly how many more than the lower positions is impossible to say. However, the figures in the studies detailed above can give a good indication of what to expect. Search engine results positions and click-through rates will always be dependent on high-quality SEO, your choice of key-phrases, and the area of business in which you operate.
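As a rough rule of thumb, you can turn a click-through rate into an expected traffic figure simply by multiplying search volume by the CTR for a given position. The short sketch below does exactly that; the search volume and CTR values in it are illustrative placeholders only, not figures taken from the studies above, so substitute whichever study’s numbers you trust.

    # Illustrative only: the figures below are placeholders, not numbers
    # taken from the studies above. Swap in whichever figures you prefer.
    monthly_searches = 10_000  # hypothetical monthly search volume for a key-phrase

    ctr_by_position = {  # hypothetical click-through rates per ranking position
        1: 0.35,
        2: 0.15,
        3: 0.10,
    }

    for position, ctr in ctr_by_position.items():
        expected_clicks = monthly_searches * ctr
        print(f"Position {position}: roughly {expected_clicks:.0f} clicks per month")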


Most companies with a website are probably guilty of claiming on their website and in their internet marketing activities that they are ‘the best’ in their field. However, this will be a risky claim to make from 1st March 2011, when the Advertising Standards Authority (ASA) extends its advertising rules around such claims to cover online advertising.


2010 was a busy year for search, and particularly for Google, who have upped their game yet again. In this post we’ll review some of the major SEO highlights of the last year.

New Ranking Factor Announced: Page Load Time

In April 2010 Google announced that the speed at which your page loads for a visitor is a factor considered in their ranking algorithm. Basically, if your page loads slowly then you’ll drop down the rankings. Don’t panic though: it’s only likely to affect really slow-loading sites. Keep an eye on this factor, however, and if you are in a very competitive market and your rankings are slipping, this could be an area to improve.
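If you want a rough feel for how quickly your pages respond, a minimal sketch along these lines can help. It only times the HTTP response for a couple of placeholder URLs using Python’s standard library; it does not measure the full render in a browser, so treat it as a crude proxy rather than a measure of exactly what Google sees.

    # Rough response-time check; a crude proxy for page speed, not a full
    # browser render. The URLs are placeholders; swap in your own pages.
    import time
    import urllib.request

    URLS = [
        "https://www.example.com/",          # placeholder homepage
        "https://www.example.com/products",  # placeholder landing page
    ]

    for url in URLS:
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                response.read()  # download the body so the timing includes it
            elapsed = time.perf_counter() - start
            print(f"{url}: {elapsed:.2f}s")
        except Exception as exc:
            print(f"{url}: failed ({exc})")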

The Bing-Yahoo Integration

In the USA the integration was completed and Yahoo’s independent index was retired, giving Bing a leap in market share. Of course, Bing is still so far behind Google that the two cannot really be compared, but it is now a competitor worthy of attention.

Google’s May Day (or Brand) Update

Google applies an extremely complex algorithm which takes into consideration hundreds and hundreds of different factors, and they are constantly tweaking and improving it to ensure that the results presented are the best they can be. Sometimes there are more significant tweaks, or algorithm updates, that have a more dramatic effect. The May Day update resulted in many websites losing long-tail traffic (up to 10 percent or more). The sites that suffered seemed to have a low number of deep links. The winners were “high quality” sites and big brands.

Google Caffeine

This was not an algorithm update, although it’s often confused with one, especially as it was introduced close to the May Day update. Caffeine was an infrastructure change related purely to speeding up the indexing system. It allowed new content to be indexed almost instantly rather than in slower batches.

Google Instant

In September, Google Instant changed the user experience in a very noticeable way by showing suggested search phrases as you typed into the search box. The idea was that if you saw what you wanted appearing in the list, you wouldn’t need to finish typing. It wasn’t just the suggestions popping up that was so noticeable: all the results changed as you typed too, which could be quite distracting. Google Instant also affected the long tail, as more people gave up typing longer phrases and went with the suggestions.


Around a year ago, Google introduced their new database architecture, Caffeine. This changeover was done for several reasons, the first of which was to allow Google to continue to index all of the web in years to come, and the second of which was revealed last week: Instant.

The main issue with Instant, from Google’s perspective, is that it generates between 7 and 10 times the volume of searches per second compared with the previous version, as Google loads search result pages constantly while people are typing. With the expected rollout of this into browser-bar-based searches (like the Chrome bar, the Google toolbar and so on), it will almost certainly expand steadily from appearing only to logged-in users to being the default state for Google.

Ten Blue Links?

So, the main upshot of the changeover to the Caffeine system is that it allows for vast amounts of real-time data to be added to the index almost as fast as it’s created. But what does this mean in terms of rankings?

Well, in short, it allows fresh data to be displayed to users much more rapidly. As a result, we’ve seen greater emphasis on results featuring video, location-based services, news items, personalised results and the like over the last year. This has had the effect of changing the strategy for SEO in certain industries, as it has created new avenues for search marketers to reach their intended audiences.

Instant Coffee Anyone?

A lot has been written about Instant over the last couple of days, some of it accurate, some of it less so. To save time, I’ve compiled some basic takeaway points as to the nature of Instant, what it brings to the table, and how it affects SEO and PPC.

  • Does Google Instant kill SEO? No, but it does change keyword research slightly, as marketers need to pay greater attention to the suggested keyword searches
  • Negative keywords need closer attention in PPC, as a search for “U2 new” will return results for “U2 new album” while a user might still be typing their full query, such as “U2 new zealand tour dates”
  • PPC ad impressions will only count when:
    • the user clicks anywhere on the page after beginning to type a search query
    • the user chooses one of the predicted queries from Google Instant
    • the user stops typing and search results are shown for at least three seconds
  • The nuts and bolts of how SEO is conducted on-site and in linkbuilding hasn’t changed
  • The nuts and bolts of how PPC is conducted hasn’t changed either, although it is now pretty much the only good way of getting impression data for search volume numbers for keywords. Keyword tools will soon be relegated to being only useful for generating keyword ideas, not for estimating volume

Increasing traffic to your website is of course a major part of an online marketing strategy, but don’t overlook what your visitors experience when they get there. After all, high volumes of traffic are useless to you if those visitors are not converting. There can be many factors to consider when looking at low or falling conversion rates.

Little glitches or errors will frustrate users and more than likely turn them away: a search function that doesn’t work properly, a slow-loading page, or, worse still, a shopping cart that fails to add products or forms that throw errors.

One tool you may find very helpful for seeing how visitors interact with your site, and in particular how they handle your forms, is ClickTale. Have a look at www.clicktale.com

Broken links between pages, or image links that go to the wrong place, will not inspire confidence in a potential customer.
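A quick way to catch obviously broken links before a visitor does is to fetch a page and check the status code of every link on it. The sketch below is a minimal, standard-library-only example; the start URL is a placeholder, and a fuller check would also follow internal pages and verify image sources.

    # Minimal broken-link check for a single page, standard library only.
    # START_URL is a placeholder; a fuller crawler would also follow internal
    # pages and check image src attributes.
    import urllib.error
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    START_URL = "https://www.example.com/"  # placeholder: use your own site

    class LinkCollector(HTMLParser):
        """Collects the href of every <a> tag found on the page."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(urljoin(START_URL, value))

    html = urllib.request.urlopen(START_URL, timeout=10).read().decode("utf-8", "ignore")
    collector = LinkCollector()
    collector.feed(html)

    for link in sorted(set(collector.links)):
        if not link.startswith("http"):
            continue  # skip mailto:, javascript: and fragment-only links
        try:
            status = urllib.request.urlopen(link, timeout=10).status
        except urllib.error.HTTPError as err:
            status = err.code
        except Exception:
            status = "unreachable"
        if status != 200:
            print(f"Check this link: {link} ({status})")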

Poor-quality product images will not show items to their best advantage or entice people to buy.

These are just some of the reasons visitors may exit your site and seek out a competitor. It is important to have your usability testing done by someone who is not already familiar with the site, as they will pick up issues such as confusing navigation. A new user will really tell you how intuitive and easy it is to buy from your site.

Remember too that web users can be fickle and unforgiving as they scan for the results they want. They also have increasingly high expectations, making them harder to satisfy with a substandard website. Don’t make them search too hard or fill in endless forms, or they will quickly lose interest. Website usability testing finds out how long users take to find a particular product or item on your site, what difficulties they face while searching, and whether they have any problems with your website’s design and features, such as navigating from one page to another, opening links, or downloading images and content.

In a nutshell, website usability testing is an approach to calculate the ease-of-use quotient of your site. Here at SIM we have perfected a system of usability testing that has been invaluable to our clients. Give us your target audience demographic and we can arrange for comprehensive testing of your site with a full report.

Based on the website usability testing analysis, you can then make changes and correct glitches and errors in the knowledge that you are improving your customers’ experience.

Get in touch