The purpose of a search engine:
Search engines are effectively information retrieval systems. A search engine is designed to assist the searcher in finding what they are looking for as quickly as possible. The search engine's primary objective is therefore to satisfy the searcher's information need above all, by providing the searcher with the most relevant information possible, ideally contextualised by the actual search query.
How do search engines work:
Search engines use software called web crawlers to traverse the Internet and discover web pages. The pages visited by the crawlers are tagged and sorted by content; this process is referred to as indexing. Once a searcher queries the search engine, the query phrase is applied to the organic ranking algorithm (formulas and programs that consider over 200 elements), which in turn organises the results on the search engine results pages (SERPs) in order of perceived relevancy in the context of the query.
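The crawl → index → query pipeline described above can be sketched with a toy inverted index. This is a hedged illustration only, not any real engine's implementation: the pages, URLs, and term-count scoring below are all invented for the example.

```python
from collections import defaultdict

# Toy corpus standing in for crawled pages (invented for illustration).
pages = {
    "example.com/coffee": "how to brew coffee at home",
    "example.com/tea": "how to brew tea",
    "example.com/seo": "search engine optimisation basics",
}

# Indexing: map each term to the set of pages containing it (an inverted index).
index = defaultdict(set)
for url, text in pages.items():
    for term in text.split():
        index[term].add(url)

def search(query):
    """Rank pages by how many query terms they contain (a crude relevancy score)."""
    scores = defaultdict(int)
    for term in query.split():
        for url in index.get(term, set()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("brew coffee"))  # pages matching the most query terms come first
```

Real ranking algorithms weigh hundreds of signals rather than raw term counts, but the structural point survives: results come from the index, not from the live web, which is why indexing must happen before ranking is possible.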
The following two important aspects are derived from the above:
1) If a website cannot be indexed by the crawler, then the website cannot possibly rank.
2) Web pages rank, not websites.
Search engine optimisation (SEO):
It is prudent to realise that SEO follows a life cycle and should not be perceived as a project with a beginning and an end. The reason is that organic relevancy ranking is relative to contextual competitiveness (irrespective of industry) at any particular point in time. This has resulted in the general perception that “the internet is constantly changing”. SEO should ideally be segmented into two categories:
1) Crawlability: the crawlability of any given web page is essential for the website to become visible to the search engine.
2) Optimisation: the optimisation process contextualises the information presented on the website in order to align the context of the presented content with the context of the search query. This elevates the listing of the web page (containing the contextual content) in the SERPs.
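The crawlability precondition can be checked programmatically. As a minimal sketch using Python's standard-library `urllib.robotparser`, the `robots.txt` rules and URLs below are invented for illustration; a blocked page can never be crawled, therefore never indexed, and therefore never ranked.

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt (invented for illustration) that blocks one directory.
robots_txt = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# can_fetch() answers: may this user agent crawl this URL at all?
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

Note that `robots.txt` is only one gate; a page can also be excluded from the index by a `noindex` robots meta tag even when crawling is allowed.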
The industry defines two search engine optimisation philosophies:
1) Black Hat
2) White Hat
Although many more permutations of these philosophies exist, the truth of the matter is that every permutation is in some way associated with one or the other.
The black hat philosophy is, in most cases, the isolated optimisation of a few critical elements of the organic ranking algorithm (ideally as many as possible) in order to dominate the top ranking positions for targeted keywords. Doing so will, more often than not, violate the search engine's best practice guidelines.
The white hat philosophy, by contrast, aligns the search engine optimisation process with the search engine's best practices and, more importantly, with the overarching vision of the search engine.
Ironically, it would appear that the majority of the SEO industry creates strategies that use “white hat components within black hat philosophies”, thus rationalising the use of black hat strategies.
The above explains why search engines are forced to place so much emphasis on fighting spam as an essential component of improving the relevancy of search results.
A white hat philosophy is by far the more difficult strategy to apply, as positioning is earned, not manipulated.
Spamdexing refers to websites that attempt to deceive the search engines, so that the results provided to the searcher are not relevant to the search query. Search engines consider this behaviour unscrupulous and unsolicited and are, as a result, more than willing to punish a domain that participates in “spam” (by removing it from the SERPs or demoting its ranking positions).
As mentioned, owing to the direct association between search engine rankings and searcher behaviour in viewing SERPs, it becomes apparent that website owners may attempt to manipulate the rankings (black hat philosophies) in order to increase their business's exposure; there is money to be made. Unfortunately, search quality cannot be easily measured, because the results presented are relative to the searcher's perspective on the search query. Although organic ranking algorithm updates / penalties are applied to the SERPs in order to better manage “spam”, it may one day come down to search engines changing their algorithm structure and even their philosophy.
It has always been difficult to measure the performance of the SEO digital marketing channel. In the past, SERP “rankings” were the ultimate “be all and end all” performance metric of SEO, not to mention the marketing driver of any SEO campaign. This has changed; in fact, ranking has not been considered the primary KPI metric since 2008 (in particular for Google).
In principle, the web is built on the HTTP protocol. As with any language, there are rules. Because search engines must function within the restrictions of these rules, there are only so many things they can consider as part of the organic ranking algorithm. In summary, search engines (Google in particular) have identified, and constantly tweak, the components in the organic ranking algorithm within the protocol's limitations. In order to expand the organic ranking algorithm beyond these restrictions, Google looks at individuals by means of the Google verticals (Gmail, G+, etc.) to apply context to any given search query. As a result, “rankings” take on a secondary KPI position in measuring SEO performance, for the following reasons:
1) Making use of Google verticals will customise SERP results to the search engine’s contextual interpretation of the searcher’s needs.
2) Search queries are customised to consider the context of localisation and IP address ranges of any given search query.
3) Google has stated that one out of every five searches conducted on the internet is based on an experiment.
4) Google makes organic ranking algorithm updates daily, indicating that rankings will in all likelihood change on a daily basis (apart from the major updates and penalties that Google rolls out).
5) Just because a particular website ranks very well for a targeted keyword does not mean that the traffic is contextually aligned with the need of the searcher. Google will always be improving on this, but in most instances search queries are refined after the initial search is conducted, because the SERP results do not align with the searcher's information requirements.
6) It is also very important to note that rankings do not equal leads / conversions.
It is the author’s view that SEO clients should not only understand the principles behind SEO, but also the philosophy and methodology. The reasoning is twofold:
- SEO cannot possibly be achieved without the website business owner's insight.
- SEO can no longer be applied in isolation, meaning integrating all marketing channels to achieve the website business objectives is of paramount importance.
Finally, it is important to realise that if a white hat SEO methodology / philosophy is followed, then there is no need for concern over search engine penalties / spam updates.