To gain top listings in the search engines, it helps to understand how they interact with your website and how they determine which sites appear at the top.
Search engines use robots (also called spiders or crawlers), which are programs that visit websites by following links from one page to another. When a robot visits a page it takes a copy of that page and stores it in the search engine’s own database (a process known as caching the page). Once the page is in the database, the search engine applies its algorithm to tag or index it, so that its position in the SERPs for any given search term can be determined quickly.
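The crawl-and-index process described above can be sketched in miniature. The snippet below is a simplified illustration, not a real crawler: it "crawls" a set of in-memory pages (stand-ins for live fetches) by extracting links with Python's standard `html.parser`, mimicking how a robot discovers and caches pages by following links.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Simulated website: URL -> HTML content (hypothetical pages for illustration)
site = {
    "/": '<a href="/about">About</a> <a href="/services">Services</a>',
    "/about": '<a href="/">Home</a>',
    "/services": '<a href="/contact">Contact</a>',
    "/contact": "",
}

def crawl(start):
    """Breadth-first crawl: visit a page, cache it, then queue its links."""
    index = {}  # stand-in for the search engine's database of cached pages
    queue = [start]
    while queue:
        url = queue.pop(0)
        if url in index or url not in site:
            continue
        parser = LinkExtractor()
        parser.feed(site[url])
        index[url] = site[url]     # "cache" a copy of the page
        queue.extend(parser.links)  # follow the links to discover more pages
    return index

index = crawl("/")
print(sorted(index))  # every page reachable by links ends up indexed
```

Note that `/contact` is only found because `/services` links to it: a page no other page links to would never be discovered, which is why internal linking matters for getting pages indexed.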
Website pages need to be indexed regularly to stand any chance of performing. Every indexed page can appear in the SERPs, so each one should be treated as a potential landing page. You can easily block any page that you do not want indexed and concentrate your efforts on optimising the important landing pages.
The algorithms used by the search engines take many different factors into consideration (Google uses around 200) when evaluating a page and deciding where it should appear in the results pages. SEO considers each page on a website and applies the approach best suited to ranking that page against its competition. However, competition varies from phrase to phrase, so what succeeds for one page is not necessarily effective for another.
Search engines want to provide the best and most relevant results for their users, so our efforts concentrate on meeting the demands of the algorithms through white hat methods. With Google holding up to 85% of the UK search market we tend to favour Google’s algorithm, which in turn favours good quality content combined with quality, relevant inbound links. These clean strategies also feed Yahoo’s and MSN’s algorithms.
In competitive marketplaces it is harder to win top rankings for generic product and service search terms, because other websites have fought hard to get to the top and will defend their position against newcomers. With the right strategy, top listings can be achieved over time, but we would argue that the real value lies in the conversions achieved, and conversions can come from a much wider source than the immediately obvious generic phrases.
Approximately 80% of searches performed every day are unique, and many use three-, four- or five-word phrases. Because these phrases are quite specific, they tend to convert well. This is commonly referred to as the “long tail” of search, and its value should not be underestimated.
Our approach to optimising our clients’ websites is to develop a two-pronged strategy that aims for top listings on competitive search terms and also top listings for a wide variety of long tail search phrases. We know that the bottom line for our clients is to make more money, so that is what we help them do.
If you are interested in knowing more about how the search engines are working with your own website, you can conduct your own mini SEO audit. Follow the simple steps and you will find the answers to the questions below. You can also do the same with your competitors’ sites and see what you can learn to improve your own.
Is my site indexed by the search engines?
How many links do I have pointing to my site?
Where do my pages rank in the SERPs (Search Engine Results Pages)?
What do my meta tags look like?
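For the last question, you can inspect a page's meta tags programmatically rather than reading the source by eye. The sketch below uses Python's standard `html.parser` to pull the title and meta description out of a page's HTML; it is shown with a hypothetical sample page, but in practice you would fetch your live page (for example with `urllib.request`) and feed that HTML in instead.

```python
from html.parser import HTMLParser

class MetaInspector(HTMLParser):
    """Extracts the <title> text and <meta name="description"> content."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical sample page; replace with the HTML of your own page
html = """<html><head>
<title>Widgets Ltd - Quality Widgets in the UK</title>
<meta name="description" content="Buy quality widgets online.">
</head><body></body></html>"""

inspector = MetaInspector()
inspector.feed(html)
print(inspector.title)        # the headline searchers see in the results
print(inspector.description)  # often shown as the snippet under the headline
```

Running this against your own pages and your competitors' makes it easy to compare, at a glance, how each site presents itself in the results pages.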
You can do SEO on your website yourself and we encourage clients to have an understanding of how we work but doing your own optimisation is much like fixing your own roof – you can buy the tools, read a book and have a go… it will cost you time and energy and it may get the result you want, or you can leave it to the professionals and go do something else that is a better use of your own time.