While paid search listings and pay-per-click advertising have become more popular, organic search listings still outperform their paid counterparts in terms of generating clicks, writes Ciara O'Brien, Sunday, November 04, 2007 - Sunday Business Post
By their very nature, most businesses prefer to be leading the pack, and when it comes to being found online, there's no exception to the rule. With search engines still accounting for a large proportion of internet traffic, appearing at the top of the search results is a goal for which most businesses are striving.
According to Jupiter Research, 82 per cent of internet traffic arrives at a website via a search engine, with both paid search and organic listings used.
The remaining 18 per cent is made up of clicks on e-mail links, website ads, direct URL entry and links from other websites. “The search engine should be the focal point of any net marketing,” said John Coburn of Praxis Now.
Deciding which search engines to target is also important. There are three or four main ones out there and, of these, Google is undoubtedly the most popular: getting on to the front page of Google is the online equivalent of the Holy Grail. While paid search listings and pay-per-click advertising have certainly become more popular in recent times, organic search listings still outperform their paid counterparts in terms of generating clicks.
Unfortunately, when it comes to improving your organic search listings, it's not always easy to work out what actually works and what is simply wasted effort. The exact formula used to determine where your website appears in the listings is a closely guarded secret.
“Search engine optimisation is roughly 70 per cent science and 30 per cent second guessing,” said Coburn. “Google and the other search engines do not declare their placement algorithms. If they did, they would be abused by search index spammers.”
As a result, firms should choose their SEO partners carefully. “No one can promise you first page placement,” said Patrick Bates, managing director of Webtrade. “There is no absolute guarantee, but there are an awful lot of good practices that can improve your ranking.”
Page level optimisation
Content is still the number one factor in determining your site's search engine ranking. How that content is laid out matters, along with how visible it is, where certain keywords are placed - if at all - and the different tags used throughout the webpage.
Many firms optimise their homepage but seem to forget about the other pages that make up their site. “Google indexes pages rather than sites,” said Michael Heraghty of Heraghty.net. This means that visitors can arrive at your site through any page, so every page should be designed with both Google and the user in mind.
“I think the key thing is to understand the language of your customers,” said Laurence Veale, senior analyst at IQ Content. “Mine information from your website log files to see what phrases people are using to find you, rather than internal jargon. This will go a long way to improving your search engine rankings.”
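For example, a referrer entry in a web server log might look something like this (a purely hypothetical entry; the search phrase appears after the ‘q=' in the Google address):

    66.249.66.1 - - [04/Nov/2007:10:12:03 +0000] "GET /courses.html HTTP/1.1" 200 5120
      "http://www.google.ie/search?q=safety+training+dublin"

Here, the visitor found the page by searching for ‘safety training dublin' - exactly the kind of customer language worth working into your page copy.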
Firms may also have to rethink how they write their content, Bates said, as what works offline may not necessarily set the online world alight. This could mean, for example, including the firm's geographic location on the homepage to drive home the point that it operates in Ireland - something that would probably not be a consideration in a print brochure.
“The key thing is relevancy that is consistent through the site,” said Bates. The use of certain key phrases in text, page titles, HTML links and other areas of the site will all help boost your Google search engine standing.
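As a rough sketch, a page built around the hypothetical phrase ‘office safety training' might carry it consistently through the page title, headings and link text:

    <html>
    <head>
      <!-- the page title: one of the strongest on-page signals -->
      <title>Office Safety Training Courses in Ireland - Acme Safety</title>
    </head>
    <body>
      <h1>Office Safety Training</h1>
      <p>Our office safety training courses cover workplaces of all sizes.</p>
      <!-- descriptive link text rather than a bare 'click here' -->
      <a href="/courses/office-safety.html">Book an office safety training course</a>
    </body>
    </html>

The company name, file names and wording here are invented for illustration; the point is the consistent use of the target phrase across the elements Google reads.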
Search engines are already offering webmasters tools to help them achieve a higher page ranking. These include Google Webmaster Tools, which is free and allows you to see how your site is performing and make it more Google-friendly. Google Analytics, meanwhile, will help site managers learn more about where their visitors come from and how they interact with the site.
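At the time of writing, installing Google Analytics simply means pasting its standard tracking snippet into each page, typically just before the closing </body> tag (the account code below is a placeholder to be replaced with your own):

    <script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
    <script type="text/javascript">
      _uacct = "UA-XXXXXX-X";  // your Google Analytics account code
      urchinTracker();         // records the page view
    </script>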
KISS - keep it simple, stupid - is one piece of advice that web designers would do well to heed. “Complex GUIs (graphical user interfaces) hinder the crawler's ability to access the site and index the pages,” said Martin Murray, chief executive of Interactive Return.
Websites that are graphic-heavy may look pretty, but unfortunately, web crawlers for the search engines can't actually read images - only text. So all those pretty logos and toolbars may come to nothing if your site can't be found because of them.
Languages such as JavaScript are also prone to causing problems for automated crawling programs. Because the bots cannot ‘see' what the JavaScript generates, they tend to ignore it altogether - not a good result if all your navigation is contained in this code.
“If you can't cut and paste the site content into a Word document, chances are a search engine can't read it,” said Heraghty.
There are ways around this, however. You can use ‘alt' tags to place text that the web crawlers can read in the navigation bars. You can also use HTML links throughout the site's content to create as many cross-links between site pages as possible.
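A minimal sketch of both techniques - alt text on an image-based navigation button, plus a plain text link elsewhere on the page (file names invented for illustration):

    <!-- the crawler cannot read the button image, but it can read the alt text -->
    <a href="/products.html"><img src="nav-products.gif" alt="Our products"></a>

    <!-- an ordinary HTML text link the crawler can follow and index -->
    <p>See the full range on our <a href="/products.html">products page</a>.</p>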
Removing code bloat is another tactic that could help. Once again, the simpler the site is for the web crawler to read, the better, and unnecessary code is one thing that could delay the process. Even something as simple as using external CSS (Cascading Style Sheets) can help your site climb up the rankings.
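As a simple illustration, moving presentation rules into one external stylesheet replaces repeated inline styling with a single line in the page head (the file name is hypothetical):

    <!-- instead of long style="..." attributes scattered through every page -->
    <head>
      <link rel="stylesheet" type="text/css" href="/styles/site.css">
    </head>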
You can also cut down on the volume of code the search engine spiders need to wade through by putting all JavaScript code in external include files.
“By removing actual code from the source code of your pages you will again cut down on code bloat and make your pages easier to navigate for the search engines,” said Murray.
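In practice, that means replacing blocks of script in the page itself with one reference to an external file (the file name here is invented):

    <!-- hundreds of lines of menu code move out of the page source -->
    <script type="text/javascript" src="/scripts/menu.js"></script>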
Keywords
Once considered the only way to get on to search engine listings, keywords have been abused by spammers and, while they still have an impact on search index placement, they are no longer as important as they once were. Google never relied solely on keywords to gauge a website's value, as they were so open to abuse.
Avoid cramming your metadata with keywords that have no relevance to your site's content - this is where a lot of firms make a mistake.
“Aim for approximately 200-300 words of relevant content per page,” said Murray. Equally important, he said, is that the text contains keyword-rich terms. These should be placed in the page header, ie the title that appears in the blue bar at the top of the browser window; in text links and alt text; and within the navigation, in order for the Google crawlers to recognise the phrases.
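Pulling those placements together, a page targeting the hypothetical phrase ‘forklift training' might place it like this:

    <head>
      <!-- the page header: what appears in the blue title bar -->
      <title>Forklift Training Courses - Acme Safety, Dublin</title>
      <meta name="description" content="Certified forklift training courses in Dublin.">
    </head>
    <body>
      <!-- keyword-rich alt text on a navigation button -->
      <a href="/forklift-training.html"><img src="btn-forklift.gif" alt="Forklift training courses"></a>
    </body>

Again, the names and phrases are invented for the example; what matters is that the same customer-facing term appears in each of these locations.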
Links
“Your links score is a function of a number of things,” said Coburn. “It used to be heavily weighted to the volume of inbound links. Now there is substantial emphasis placed on the quality of the inbound links.”
Getting trusted inbound links to your website from a quality source will push your search engine ranking up the page. “Each link is a vote for your website,” said Heraghty.
Getting these links may call for less skill than optimising your page's content for search engines while still retaining the interest of your customers, but it is much more difficult than you would think. After all, it would be quite hard to convince your competitors to place a link to your website on theirs.
Relevance is the most important consideration when it comes to those links. If the sites linking to yours are in an industry with no connection to your own, chances are the tactic will only confuse Google's automated page ranking programs even further.
If, on the other hand, you work in health and safety and get a top-ranked website such as that of the Health and Safety Authority to carry a link to your site, this will be assessed as a better-quality link than one from a local web design firm.
Some firms persuade relevant sites to carry links by promising to place one on their site in return. “It's better to have an incoming link from a site you don't reciprocate,” said Heraghty.
“But a reciprocal link is better than no link at all.”
Link farms - sites that link to hundreds of random websites with no apparent connection or relevance - are a definite no-no and can actually end up damaging your search engine score. Coburn pointed out that these may even dilute your relevancy score, undoing any other good work you had already put in.
Link farms are considered a “black hat” technique. “Black hat search engine marketing techniques will work, but only in the short term - if your site isn't banned first - so my advice is focus your energies on creating a content-rich site with a clear hierarchy and an abundance of text links,” said Murray.
There are certain things that websites should avoid at all costs, such as hidden text, multiple URLs and domains with duplicate content and deceptive redirects.
These will be considered spam by the search engines and could result in your site being bumped back down the listings for employing underhand tactics.
“The key thing I would avoid is keyword loading and cloaking,” said Veale.
“This is where the site hides the keywords somehow. It's not for the customer, it's for the search engine. If the search engine figures out you are doing it, they will blacklist you and you will be taken from the listings.”
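For illustration only, hidden text can be as crude as styling keywords so that visitors never see them - exactly the sort of pattern the search engines look for and punish:

    <!-- white text on a white background: invisible to users, visible to crawlers -->
    <!-- a technique to avoid: it risks having the site blacklisted -->
    <div style="color: #ffffff; background-color: #ffffff;">
      cheap flights cheap hotels cheap insurance cheap loans
    </div>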
It's not an idle threat - blacklisting certainly happens. The most high-profile case was that of car manufacturer BMW, which found itself booted off Google's search index last year for using ‘doorway pages' to boost search rankings.
These ‘doorways' display one page to Google's search bots and another to users, and target numerous keywords.
Firms also have to keep in mind that building up an organic search listing is a lengthy process, and taking shortcuts is probably ill advised.
“Trying to get your page found organically can take time,” said Bates. Rome wasn't built in a day, and, it appears, neither was Google's search index. If you are seeking instant gratification and visibility, the organic listings are not the place to find it.
Of course, it has its advantages. “Organic search engine optimisation is a more popular way to attract visitors to websites than other online programs such as pay-per-click, online advertising, e-mail marketing, sponsorships, etc, due to the opportunity to gain longer search engine visibility thus providing a greater return on investment,” said Murray.
And it seems that users are more likely to pay attention to organic results than their paid-for counterparts.
“Web users are six times more likely to click on organic search listings than paid listings,” said Bates.
“They are more likely to trust the organic results.” However, pay-per-click also has its uses. For example, if the key phrases you want your website to be found under are highly popular, you may have little or no chance of getting on the first page of results any time soon. One way around this is to pay for the listing on Google, through Google AdWords.
Buying space on Google is also pretty immediate. The downside is that, should you decide to ditch the paid-for campaign, your premium listing space will also disappear, and you may be back where you started in the search rankings - invisible.