
We speak to a lot of clients who don’t realise how important it is for their site navigation (commonly referred to as internal links or information architecture) to be well considered, so that the right pages get indexed easily and regularly by the search engine spiders. Connected to the site architecture is the preference that no single page contains more than 100 links; this keeps the quality score assigned to each link at a respectable level and helps the spiders move through the site properly.

Crawl Priority

To start, it helps to understand how the spiders prioritise pages and then crawl the site.

Spiders visit popular pages more often. Popularity is largely defined by the number of back-links a page has, and your site architecture should reflect this. For example:

  • Your homepage and chosen landing pages should be the most popular, with the most back-links
  • First and second level category pages should be fairly popular, but with fewer back-links than the homepage
  • At the bottom of the priority list are the deepest pages: news pages, product pages, service price lists and so on

The spiders enter the site via a landing page, which doesn’t need to be the homepage, and then follow links through each page, looking to index the whole site. They don’t like being sent in circles and they don’t like getting lost in too many links, so it’s important that your site architecture makes it as easy as possible for the spiders to do their job, while ensuring that every page which needs indexing gets indexed. Ideally, the spiders should be able to index everything within three clicks of arriving on the site, regardless of whether they land on your homepage or your deepest category page.

XML Site Maps

XML site maps are often seen as the quick fix for architecture issues, and that is exactly what they are. They do not resolve problems in the site architecture and internal navigation; they merely hide the problems so that you are unaware of them.

In an ideal world, you would not add an XML site map until you know the website architecture is sound and, most importantly, that the site is indexing on its own. Below are some basic architecture tips to get you started.

Keep Architecture Flat

You want to keep your architecture as flat and easy to navigate as possible, whilst retaining the three-click rule (if a spider lands on one of your deeper pages, can it reach the other pages within three links?).

For a brand new website the following structure is a common one, with 100 links being the absolute maximum you should have on any one page:

At the top: Homepage (no more than 100 links)
First level: Categories – no more than 100 pages (each page has no more than 100 links)
Second level: Sub-categories – no more than 10,000 pages (each page has no more than 100 links)
At the bottom: Detail/product pages – no more than 1,000,000 pages
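
To see how those numbers fit together: 100 links on the homepage gives up to 100 category pages; 100 links on each category page gives up to 100 × 100 = 10,000 sub-category pages; and 100 links on each sub-category page gives up to 10,000 × 100 = 1,000,000 product pages, all within three clicks of the homepage.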

Indexing and rankings are determined by how much authority each page has: the higher the domain authority of your site, the more links you can realistically get away with including on each page. As a rough guide, if your website already holds some domain authority (DA) you can increase the links on each page as follows:

DA 7-10 = 250 links
DA 5-7 = 175 links
DA 3-5 = 125 links
DA 0-3 = 100 links

So, the fewer links the spiders have to follow to index the whole site, the happier they are and the more weight each page will hold.

Faceted Navigation

Faceted navigation is a common and useful feature of ecommerce sites which allows shoppers to pick the facets of a product that are important to them. For example, you could pick the category T-shirts, the colour black and the size medium; the results you are shown then correspond directly with what you specifically want. In essence, the website has ignored anything which doesn’t match the facets you have chosen.

Setting up faceted navigation can be tricky. Keep in mind that the primary facet pages won’t rank; you want the deeper facet pages to rank, as these are the ones that will help the spiders discover all of the product pages.

When setting up faceted navigation, some of the things to keep in mind are:

URL

You must have a unique URL for each facet level. The URLs should be clear, not complicated and hard to follow:

Clear URL: www.tshirtdomain.co.uk/tshirts/black/medium
Unclear URL: www.tshirtdomain.co.uk/all/tshirts/all/black/all/medium

You also want to ensure that whatever route somebody takes to reach a given facet level, the same URL is shown. For example:

If somebody clicks on T-shirts, then Medium, then Black, the URL they end up on should still be www.tshirtdomain.co.uk/tshirts/black/medium and not www.tshirtdomain.co.uk/tshirts/medium/black, which would create unnecessary duplicate content issues!
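
One simple way to achieve this (a rough sketch only, not tied to any particular ecommerce platform; the fixed facet order and the function name are our own, for illustration) is to hold a set order of facet types and always build the URL in that order, whatever order the visitor clicked them in:

    // Fixed precedence for facet types: category first, then colour, then size.
    const FACET_ORDER = ['category', 'colour', 'size'];

    function buildFacetUrl(selected) {
      // selected might be { size: 'medium', category: 'tshirts', colour: 'black' }
      const parts = FACET_ORDER
        .filter((key) => selected[key])
        .map((key) => encodeURIComponent(selected[key]));
      return 'http://www.tshirtdomain.co.uk/' + parts.join('/');
    }

    // Both click orders produce the same canonical URL:
    buildFacetUrl({ size: 'medium', category: 'tshirts', colour: 'black' });
    buildFacetUrl({ colour: 'black', size: 'medium', category: 'tshirts' });
    // each returns "http://www.tshirtdomain.co.uk/tshirts/black/medium"

Whichever order the visitor picks the facets in, the page is then served at (or redirected to) that single URL.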

Adding & Removing Facets

You should make it easy for your customers to add or remove additional facets as they see fit.

As they add facets to their search, these should be displayed as follows so that any or all facets can be removed by the user:

Tshirts [remove]
black [remove]
medium [remove]

The facets available to choose from can be generated automatically from the metadata of the current results, which also makes it easy to display the number of results within each facet, for example:

Blue [35]
Green [23]
Yellow [1]
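
Those counts don’t need to be maintained by hand; they can be worked out from the product data behind the current results. A rough sketch (the product fields and function name here are our own, for illustration):

    // Count how many of the current results carry each value of a facet,
    // e.g. facetCounts(results, 'colour') -> { Blue: 35, Green: 23, Yellow: 1 }
    function facetCounts(results, facetName) {
      const counts = {};
      for (const product of results) {
        const value = product[facetName];
        if (value) {
          counts[value] = (counts[value] || 0) + 1;
        }
      }
      return counts;
    }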

No Index

Any pages which could be considered duplicate content should be no-indexed; the spiders will still visit these pages but they won’t index them. To keep a page out of the index, add code to the page as follows:

<meta name="robots" content="noindex"> – this tells the spiders not to index the page
<link rel="canonical" href="http://domainname.co.uk/tshirts/black"> – this points the spiders back to the correct page

Filtering & Pagination

Another common aspect of ecommerce sites is filtering results. This is where you can choose a filter which will sort the products in a certain way, for example only showing 10 items per page (creating pagination or multiple pages), or showing lowest priced items first.

The ideal way to deal with pagination in category results is to programme the page to show all results rather than writing each page of results as page 1, page 2, etc.

Once the main category page has been created, you can then use JavaScript to create the pagination. Search engine spiders don’t follow JavaScript, so you don’t risk duplicate content from having multiple pages under each category, yet all of the products are indexed.
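
As a rough illustration of that approach (a sketch only; the page size and element IDs are assumptions, and the exact wiring will depend on your platform), the category page is delivered with every product already in the HTML, and a small script then splits the full list into pages for the visitor:

    // Assumes the page contains every product as <li> items inside
    // <ul id="product-list">, plus an empty <div id="pager"> for the page links.
    const PAGE_SIZE = 10; // products shown per "page" (assumption for this example)
    const items = Array.from(document.querySelectorAll('#product-list li'));
    const pager = document.getElementById('pager');

    function showPage(page) {
      items.forEach((item, i) => {
        const onThisPage = i >= page * PAGE_SIZE && i < (page + 1) * PAGE_SIZE;
        item.style.display = onThisPage ? '' : 'none';
      });
    }

    const pageCount = Math.ceil(items.length / PAGE_SIZE);
    for (let p = 0; p < pageCount; p++) {
      const link = document.createElement('a');
      link.href = '#';
      link.textContent = String(p + 1);
      link.addEventListener('click', (event) => {
        event.preventDefault();
        showPage(p);
      });
      pager.appendChild(link);
    }
    showPage(0); // start on the first page

Because the full product list is already in the HTML, the spiders can index every product while visitors still see manageable pages.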

Plan, Plan, and Plan Again

Don’t under-value the benefit of properly planning your website. Most of our examples have referred to ecommerce sites, but the same principle applies to brochure sites. Plan to succeed and your website will be a spider’s navigational dream, and you will be rewarded with good search results and no duplicate content issues.

In summary, the number one rule to keep in mind when you are planning your navigation is that you want as few pages as possible in the index, whilst still allowing each and every product page to be indexed.


People are always interested to know how many more clicks they are likely to receive if they appear at position 1 of the search engine results pages for their chosen key-phrases. There are many different studies and statistics available about organic click-through rates on the internet, many of which are contradictory. This report details findings from some of the more well known studies and how much you can actually learn from them.

Study 1

Below is a chart detailing the data from the famous AOL data leak in 2006. Although old, this data is still often quoted as gospel.

[Chart: click-through data from the 2006 AOL leak]

Study 2

This study was conducted by Neil Walker, a UK based SEO expert. Some blog posts suggest Walker’s study is based on Webmaster Tools data across 2,700 keywords. Walker himself claims that the data comes from a study of Webmaster Tools in 2010, the AOL data of 2006 and an eye tracking study conducted in 2004.

[Chart: organic click-through rates (Study 2)]

Study 3

Another well-known study, conducted in 2010, was by Chitika, a data analytics company in the advertising business. For their study, they looked at a sample of traffic coming into their advertising network from Google and broke it down by Google results placement.

[Chart: share of traffic by Google result placement (Study 3)]

What can we actually learn from this?

Well, it is clear that if you are at position 1 in the search engine results pages, you are very likely to receive substantially more clicks. However, there are always exceptions to this rule.

A famous example:

For a long time, if you searched for ‘mail’ on Google, Gmail would come up at position 1 and Yahoo would come up at position 2. Still, Yahoo received in excess of 50% of the click-throughs. Studies indicated that this was because people searching for ‘mail’ were looking to login to their Yahoo mail account.

This example illustrates that if people are looking for something specific, they will not always click on position 1 if it doesn’t seem to offer what they are looking for. Another example is Wikipedia: they are often displayed in high positions for a wide range of phrases, but won’t always receive a high click-through rate because people aren’t always looking for general information.

In summary, at position 1 of the search engine results pages you are extremely likely to receive the most clicks, but exactly how many more than the lower positions is impossible to say. However, the figures in the studies detailed above give a good indication of what to expect. Search engine results positions and click-through rates will always be dependent on high quality SEO, your choice of key-phrases, and the area of business in which you operate.


Most companies with a website are probably guilty of claiming on their website and in their internet marketing activities that they are ‘the best’ in their field. However, this will be a risky claim to make from 1st March 2011, when the Advertising Standards Authority (ASA) extends the advertising rules around making such claims to include online advertising.


2010 was a busy year for search, and particularly for Google, who upped their game yet again. In this post we’ll review some of the major highlights for SEO over the last year.

New Ranking Factor Announced: Page Load Time

In April 2010 Google announced that the speed at which your page loads for a visitor is a factor considered in their ranking algorithm. Basically, if your page loads slowly you’ll drop down the rankings. Don’t panic, though: it’s only likely to affect really slow-loading sites. Keep your eye on this factor, and if you are in a very competitive market and your rankings are slipping, this could be an area to improve.

The Bing-Yahoo Integration

In the USA the integration was completed and Yahoo’s independent index was retired, giving Bing a leap in market share. Of course, they are still so far behind Google that the two cannot really be compared, but Bing is now a competitor worthy of attention.

Google’s May Day (or Brand) Update

Google applies an extremely complex algorithm which takes hundreds of different factors into consideration, and they are constantly tweaking and improving it to ensure that the results presented are the best they can be. Sometimes there are more significant tweaks, or algorithm updates, that have a more dramatic effect. The May Day update resulted in many websites losing long tail traffic (up to 10 per cent or more). The sites that suffered seemed to have a low number of deep links; the winners were “high quality” sites and big brands.

Google Caffeine

This was not an algorithm update, although it’s often confused with one, especially as it was introduced close to the May Day update. Caffeine was an infrastructure change relating purely to speeding up the indexing system: it allowed new content to be indexed almost instantly rather than in slower batches.

Google Instant

In September, Google Instant changed the user experience in a very noticeable way by showing suggested search phrases as you typed into the search box, the idea being that if you saw what you wanted appearing in the list you wouldn’t need to finish typing. It wasn’t just the suggestions popping up that was so noticeable; the results themselves also changed as you typed, which could be quite distracting. Google Instant also impacted the long tail, as more people gave up typing longer phrases and went with the suggestions.


Around a year ago, Google introduced their new database architecture, Caffeine. This changeover was done for several reasons, the first of which was to allow Google to continue to index all of the web in years to come, and the second of which was revealed last week: Instant.

The main issue with Instant, from Google’s perspective, is that it generates between seven and ten times the volume of searches per second compared with the previous version, as Google loads search result pages constantly while people are typing. With the expected rollout of this into browser bar-based searches (the Chrome address bar, the Google toolbar and so on), Instant will almost certainly expand steadily from only appearing for logged-in users to being the default state for Google.

Ten Blue Links?

So, the main upshot of the changeover to the Caffeine system is that it allows for vast amounts of real-time data to be added to the index almost as fast as it’s created. But what does this mean in terms of rankings?

Well, in short, it allows fresh data to be displayed to users much more rapidly. As a result, we’ve seen greater emphasis on results featuring video, location-based services, news items, personalised results and the like over the last year. This has had the effect of changing the strategy for SEO in certain industries, as it has created new avenues for search marketers to reach their intended audiences.

Instant Coffee Anyone?

A lot has been written about Instant over the last couple of days, some of it accurate, some of it less so. To save time, I’ve compiled some basic takeaway points on what Instant is, what it brings to the table, and how it affects SEO and PPC.

  • Does Google Instant kill SEO? No, but it does change keyword research slightly, as marketers need to pay greater attention to the suggested searches
  • Negative keywords need closer attention in PPC, as a partially typed search such as “U2 new” will show results for “U2 new album” even when the user’s full query would have been “U2 new zealand tour dates”
  • PPC ad impressions will only count when:
    • the user clicks anywhere on the page after beginning to type a search query
    • the user chooses one of the predicted queries from Google Instant
    • the user stops typing and search results are shown for at least three seconds
  • The nuts and bolts of how SEO is conducted on-site and in linkbuilding hasn’t changed
  • The nuts and bolts of how PPC is conducted hasn’t changed either, although it is now pretty much the only good way of getting impression data for search volume numbers for keywords. Keyword tools will soon be relegated to being only useful for generating keyword ideas, not for estimating volume

The credit crunch isn’t going to dent the volume of sales made online this Christmas; in fact, it’s more likely to bring increased sales as people search for bargains. The season starts early on the internet, so now is the time to ensure your site is ready for all those lovely visitors.

Here are some tips to get your website ready for Christmas:

  • Have Special Offers on your home page – not so many that you overwhelm people, but enough for them to see that these are great deals. Think of offering loss-leaders and products that will encourage sales of additional products.
  • Have Featured Products on your home page – these get indexed by the search engines quickly and attract more attention from browsing visitors.
  • Make your Delivery Policy very clear – state any free delivery threshold and how many days delivery will take. At this sensitive time of year, make it very clear what your last ordering date is to ensure delivery by Christmas.
  • Can your Images be improved? Great images sell product, it’s a fact!
  • Make sure shoppers can easily find your Returns Policy – it will give them confidence that you have their interests at heart.
  • Make your shop window Seasonal – in the same way that high street shop windows are decorated for Christmas, your website shop front can be decorated to inspire that seasonal spirit.
  • Customer Service is essential, especially as the big day approaches – make sure people feel they can get the support they need.  A phone number is ideal and online support is also effective for demonstrating that your website is attended.  We recommend Provide Support for this.
  • It may sound obvious, but do your visitors know what you sell? I have seen many ecommerce websites that sell such an eclectic mix of products that it’s really not clear where they are positioned. You have only 3-5 seconds to make a good impression and grab your customer’s attention, so don’t confuse them with anything – keep it really simple.

Good luck for the coming season – CapGemini recently predicted that online sales in the UK will increase by 60% in the three months up to Christmas so this is definitely the time to be paying attention to your online marketing strategy.  Give us a call on 0845 838 0936 if you are interested in driving more traffic to your website this season – there is a very good chance we can help you quickly.


To gain top listings in the search engines it helps if you understand how the search engines work with your website and how they determine which websites get to appear at the top.

Search engines use robots (also known as spiders or crawlers), which are programs that visit websites by following links from one page to another. When a robot visits a page it takes a copy of that page and puts it in the search engine’s own database (this process is known as caching a page). Once in the database, the search engine applies its algorithm to tag or index the page so that its position in the SERPs for any given search term can be determined quickly.
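
As a very rough illustration of the link-following part of that process (a toy sketch only; real search engine crawlers are vastly more sophisticated, and the start URL below is just an example), a robot starts from one address, takes a copy of each page it fetches, pulls the links out of the HTML and queues them up to visit next:

    // Toy crawler sketch (Node.js 18+ for the built-in fetch).
    const cache = new Map();                      // url -> copy of the page (the "cache")
    const queue = ['https://www.example.com/'];   // example start URL only

    async function crawl(limit) {
      while (queue.length > 0 && cache.size < limit) {
        const url = queue.shift();
        if (cache.has(url)) continue;             // already visited this page
        const html = await (await fetch(url)).text();
        cache.set(url, html);                     // take a copy of the page
        // Very naive link extraction: find absolute href="..." values and queue them.
        for (const match of html.matchAll(/href="(http[^"]+)"/g)) {
          queue.push(match[1]);
        }
      }
    }

    crawl(50); // follow links until 50 pages have been copied

Once the copies are in the database, the tagging and indexing described above happen against the stored copy rather than against your live page.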

Website pages need to be indexed regularly in order to stand any chance of performing. Each indexed page can appear in the SERPs, so every indexed page should be considered a potential landing page. You can easily block any page that you do not want indexed and concentrate your efforts on optimising the important landing pages.

The algorithms used by the search engines take into consideration many different factors (Google has about 200 factors) when they evaluate a page and decide where it should appear in their results pages. SEO considers each page on a website and applies the best fit to get the best ranking for that page amongst its competition. However, competition for different phrases is not equal and so the same thing that is successful for one page is not necessarily effective for another page.

Search engines want to provide the best and most relevant results for their users, so our efforts are concentrated on meeting the demands of the algorithms through white hat methods. With Google holding up to 85% of the UK search market, we tend to favour Google’s algorithm, which in turn favours good quality content combined with quality, relevant inbound links. These clean strategies feed Yahoo’s and MSN’s algorithms too.

For competitive market places it is harder to get top rankings for generic product and service related search terms because other websites have fought hard to get to the top and will defend their position against newcomers. With the right strategy, over time, top listings can be achieved but we would argue that the value is in the conversions achieved and conversions can come from a much wider source than the immediately obvious generic phrases.

Approximately 80% of searches performed every day are unique, and many use 3, 4 or 5 word phrases. These are quite specific phrases and so they tend to convert well; this is commonly referred to as the “long tail” of search, and its value should not be underestimated.

Our approach to optimising our clients’ websites is to develop a two-pronged strategy that aims for top listings for competitive search terms as well as top listings for a wide variety of long tail search phrases. We know that the bottom line for our clients is to make more money, so that is what we help them do.

If you are interested to know more about how the search engines are working with your own website, you can conduct your own mini SEO audit. Follow the simple steps and you will find the answers to the questions below; you can also do the same with your competitors’ sites and see what you can learn to improve your own.

Is my site indexed by the search engines?
How many links do I have pointing to my site?
What are my Search Engine Ranking Positions (SERPs)?
What do my meta tags look like?
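
For the first question, a quick check is to type site:yourdomain.co.uk (swapping in your own domain) into Google: the results returned are the pages Google currently holds in its index for your site, and the approximate total gives you a feel for how much of the site has been indexed. The remaining questions can largely be answered with the search engines’ free webmaster tools and by viewing the source code of your pages.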

You can do SEO on your website yourself, and we encourage clients to have an understanding of how we work, but doing your own optimisation is much like fixing your own roof: you can buy the tools, read a book and have a go. It will cost you time and energy and it may get the result you want, or you can leave it to the professionals and go and do something else that is a better use of your time.


With millions and millions of searches performed each day, it’s perhaps surprising to find that 85% of the search terms entered into the search engines are unique. These unique search terms are gold for ecommerce websites, and good optimisation will target the broad range of unique searches as well as the shorter, more popular terms. The aim of a long tail search engine optimisation strategy is to increase conversions by being in front of visitors at the point they are ready to buy.

How Consumer Search Behaviour Works

Let’s take the example of Jason, who wants to buy a lawn mower, and consider his search behaviour as an average online consumer. First, Jason might search for “lawn mowers” to research the options available. “Lawn mowers” is a popular search term and it would be great to hold the number 1 position for it, but it’s most likely that Jason is not ready to buy yet; he is just researching. He won’t look at just one website, he will look at two or three sites, gathering information in order to refine his search.

His next search might add another word, such as “electric lawn mower” or “petrol lawn mower”, and he may be looking for reviews at this point. More and more consumers are turning to user-generated content in the form of reviews and recommendations in order to inform their decisions. Jason’s research leads him to favour a particular brand, so his search term becomes “Qualcast electric lawn mower”, allowing him to consider the different models and which would best suit his needs.

The final stage of Jason’s search is when he has decided exactly which model he wants, and this is the most important search because now he is ready to buy. His search term is “Qualcast Suffolk Punch electric lawn mower”. This is a long tail search because it contains more than 3 words and is not a popular, frequently used term; it is gold to the supplier who comes top of the search engines for this term at the point Jason wants to buy.

Think about your own search behaviour, isn’t this fairly typical of what you do yourself?

Steps you can take to optimise for the long tail:

As a site owner, whether you have an ecommerce website or a brochure site, you can benefit from your long tail visitors. An ecommerce site should have every level of pages optimised, including the deep product pages: ensure you have a mix of specific and generic search terms and that your titles and descriptions are unique. On-page descriptions should also be meaty and interesting, with lots of relevant information and benefits, and of course a competitive price.

At Strategy we research the “money phrases” for your site because this is where you will make most money. We want to get you increased traffic at every level of your site and above all increased conversions – check out our Pay-Per-Results Search Engine Optimisation Services which include usability studies to increase sales.


There are some exceptionally cool strategic tools available that can give your organisation the edge over your competition.

Imagine you have a business-to-business website…

How would you like to know exactly which organisations are visiting your site? What they searched for? Which pages they visited, and whether they have visited before? Yes, it’s very much the kind of general web stats information that you can get from most free or cheap website packages, but let me repeat this because it’s quite important: you can see exactly which organisations visited your site!

If you were a mainframe computer software company, would you want to know that a major European bank was searching for repository solutions? How would your sales staff feel if they could receive an instant email telling them of the visit? Would they not jump all over that potentially red-hot lead?

Maybe your business supplies security products and a major player is checking you out – wouldn’t you want to know immediately?

This kind of strategic business intelligence can give you the edge over your competitors as well as providing valuable feedback about the visitors you are reaching so you can improve your site to make it more attractive.

It’s not necessarily valuable for every business-to-business company, as only the bigger players can be identified as specific organisations. However, if these are your target audience then you might want to check it out; it could be the strategic advantage you are looking for.

Get in touch