The panacea of SEO

Consider the size of the web for a second, just how mind-bogglingly big it is. By current estimates it's somewhere in the neighborhood of 45 billion web pages. These are the pages that are indexed by Google at the moment.

Google's search algorithm keeps getting better at ranking pages. There's really no other way to put it. Looking at it from a negative perspective is counterproductive. Instead of thinking "This used to work on Google to improve ranking and now it doesn't," it's better to say "now Google ranks this better". Ever since Google began naming its updates you can read tons of posts describing how thousands or millions of sites got penalized by Panda, Penguin or Hummingbird, which is understandable from the SEO point of view. But, at the risk of sounding naive… isn't it better to look at the bright side? How about saying "Millions of sites improved their rankings thanks to the Panda, Hummingbird or Penguin update"? Isn't it better to work with Google instead of against it? Let's see how this point of view is more profitable in the long run:

The Panda update, from 2011, takes into consideration the quality of the content posted on a webpage. Panda is a sitewide penalty, so if some pages on a website fall below Panda's quality threshold, the whole site suffers. Panda's raison d'être was, in the beginning, to penalize sites that provided utterly useless, generic, low-quality content but, through keywords and a little SEO, could still rank well and get users clicking ads.

These sites can be profitable for the webmasters making them, who are in it for a quick buck, but they are a nuisance for users who land on them hoping to gain knowledge and end up wasting precious minutes of their time when, seized by curiosity, they click one of the curiously well-crafted titles on the ads.

Panda has since evolved further to tackle scraping. Given that scraping has many different connotations in the SEO world, let us clarify that the kind of scraping Panda was meant to tackle was the kind used for "content creation": scraping one site completely and duplicating it with little change beyond cosmetics. I think we can all agree that these sites were (and still are) also a nuisance, particularly for the user who, in the hope of finding different points of view on a given subject, ends up loading page after page of the same text, sometimes with small changes to disguise the same essence and sometimes a word-for-word copy with only a different URL and minor changes in fonts and graphics.

Penguin was released in 2012 to penalize spammy links. When the algorithm wasn't as sophisticated, one of the factors it considered to reward or penalize ranking was the presence of backlinks. The idea was to reward sites that had more backlinks, because these are an indication of a site's authority and good reputation. When some people saw this they thought they could game the algorithm by adding any kind of backlinks, so they began accumulating backlinks from spammy sites, and they did fool the algorithm, for a while… Then Penguin came and their rankings were massacred. Now, sites that want to use backlinks to improve their rankings need quality links from reputable websites, and these have to be earned, because linking to a website is practically an endorsement of its content.

But the importance of backlinks will dwindle over time. They were one of many ways of trying to make sense of it all in the quest to provide the user with reliable answers: rank highly the sites with a good reputation, or the sites endorsed by those, outsourcing the evaluation of a site's content to something very much like a peer-review system, because the algorithm wasn't clever enough to understand the different shades of meaning that a single word can have as part of a complete sentence, let alone decide whether the content on a site was good or bad. Google, though, as of 2012 and alongside Penguin, began using a knowledge base of semantic information called the Knowledge Graph, in their own words making "search" about "things, not strings". Among other things, it began using the user's search history to better understand what the user wanted, and it gathered knowledge from reputable sources like the CIA World Factbook, Wikipedia, etc.

In the summer of 2013 Google launched a major update to its search algorithm called "Hummingbird" that pays attention to each word in a sentence, ensuring that the meaning of the whole sentence is taken into account instead of just individual words. This is when Google went Siri.

The algorithm is getting better and better at understanding both speech and text.

How do you game such an algorithm? The answer is: you don't. Not if you're smart, anyway. Consider the really smart and successful… Do you think they got to be where they are by taking advantage of present conditions? From Wall Street to Palo Alto, the really successful, the really smart, were able to succeed by staying ahead of the game, anticipating trends, needs and wants. If you want to know Google's goal in perfecting its search algorithm, what they are looking to achieve is the Star Trek computer, and if you ask me, we're not even 10 years away from it. Quality content that is relevant is the panacea of SEO. Content is king. Get used to hearing that, accept it as a motto and make it the foundation of everything you do online. Embrace a sincere desire to help your fellow man and further mankind. Take a sincere interest in your audience. If your content has quality, your foundation is strong and won't be seriously affected when the algorithm evolves. The trick is simply staying ahead by guessing how Google will evolve and acting accordingly.

Staying ahead of the game – Creating Content that is more than just useful

When defining what quality content is, we have to look at several things. Let's go from the simplest to the most complex:

Language: One of the things the algorithm is currently able to ascertain is the quality of the language used on whatever page is being ranked. Good grammar is a necessity, and spelling mistakes in this day and age are unforgivable, given all the help modern text editors give us.

Good writing: It is a pleasure to read text that has been masterfully crafted by a good writer. The goal should be for the reader to become absorbed in the reading, so that the text itself doesn't get in the way. If you can afford them, professional writers are worth it, but you can learn to write engaging text if you set your mind to it. There's plenty of advice online, but you should know that the best writers read a lot of books. If books aren't your thing, the best newspapers are always very well written. While some writers are clearly more gifted than others, the capacity to write well is within the realm of possibility for anyone who completed high school. The best writing uses the strongest words to convey meaning. Being vague about things will diminish the quality of your text. Craft strong meaning clearly without taking extremist points of view. The algorithm is currently capable of understanding the meaning of the text.

Reputable sources: "Originality is nothing but judicious imitation. The most judicious writers borrowed one from another" (Voltaire). In whatever field you might be in, there are experts from whom you can and should learn. Do not copy them, but imitate them. Try to reach the same conclusions they reached, using their sources when possible, and infuse them with your own spirit and touch. Try to contribute yourself by adding your own wisdom and experience to the mix.

Text, images and videos to your advantage: Even the most gifted writers need to rely on images and videos these days to help establish empathy. Attention is the scarcest commodity on the planet. If you want to grab your readers' attention you need to tune your page with a mixture of text, images and videos that should ideally synergize, providing a richer experience for the user. Images and videos are considered by the algorithm when ranking pages, and good images will help your ranking, especially if they're original. Bear in mind, though, that text embedded in the images you publish will not be read by the crawlers when Google indexes your site. The page where your image sits, its captions and the text providing its context are all considered by the algorithm. It is a good idea to place each image as near as possible to relevant text, giving it a strong context. Ideally you should use original images, i.e. images that you have taken yourself and that are therefore your property. Since users have a tendency to copy images they like, the best practice is to offer them under a Creative Commons licence and ask users who copy them to link back to your site and give you credit. The same applies to videos. Consider that everything you use to enhance your text has the ultimate purpose of providing the user with a better experience. So it's not a good idea to post an image or a video that will hamper the load time of a page, because you run the risk of defeating the purpose of providing your readers with a good experience, and they may just skip your content in favor of something better optimised.


Eyes on the target: Gain clear insight into your audience and laser-focus your writing on it. Write the information they need today with a focus on what they will need tomorrow. Nothing is permanent in life. Every situation evolves, either slowly or rapidly, and finding out where things are headed is what the game of staying ahead is all about. Animals know when it's going to be sunny or when it's going to rain; for them it's a matter of survival. The same applies here… Dress your writing according to the current weather, but decide the garments you'll wear next season before it's imminent. When addressing your audience, this laser focus should guide you towards keyword research. Remember, the reason a keyword ranks so highly is that, in its purest essence, it represents what the user wants. While the importance of keywords will dwindle over time, I don't believe they'll ever become irrelevant, for that simple reason.

Tuning the text for Google = SEO

Now that we've established that content is king, let's see how we can tune it to better please Google. There are currently about 200 factors that Google's algorithm takes into consideration to rank pages. Brian Dean, an SEO expert and founder of Backlinko, one of the most popular SEO sites on the web, has done an excellent job with his "Google's 200 Ranking Factors: The Complete List", which happens to be the most up-to-date and thorough list I've come across so far.

There are a number of "off-site" factors the algorithm takes into consideration that we can use to our advantage. Let's consider them in detail:


Off-site SEO


Keywords in the domain:


When a query is issued to the search engine, a domain that contains the queried keyword has an edge over a domain that doesn't. The edge is bigger if the keyword is present in the top-level domain, and also if it's the first word of the domain. A keyword present in a subdomain can also boost rankings, so whenever a keyword is present in the domain there's a boost. However, even in the case of an exact-match domain (EMD), low-quality content will kill any edge gained. The algorithm subordinates any boost to good-quality content.


Public Whois:


A domain with a clear owner will rank higher than a domain whose owner is not clear. Private domains aren't looked upon favorably by the algorithm. Privacy concerns aside, they're viewed as domains whose owners have something to hide, either in their character or in the content of their site. The owners of penalized spamming sites endanger the other domains they own, so privacy-protection services make sense for them; for everyone else, suffice it to say that it's a good sign for the algorithm when it sees that the owner of a domain is not afraid to say who he or she is.

Domain ethos:

A domain's registration length is a strong indicator of its legitimacy and is therefore, as it should be, a factor in its ranking. When the ownership of a site seems to change often, it's generally frowned upon by Google, and they might just reset its history, effectively breaking the links pointing to it.

Let’s look into some on-site SEO.  

On-site SEO

As I have argued before, any sort of SEO aimed at boosting rankings artificially is facing an uphill battle in the best of cases. Breakthroughs in A.I. have been so huge as to motivate the likes of Stephen Hawking and Elon Musk to ask for it to be regulated. Only hours ago I learned that Facebook's iPhone app can now describe pictures to the blind. I mean, come on! Where there's a will there's a way, but let's not even go there. In any case, if you're curious, I'd recommend having a look at "50 shades of SEO" for theoretical purposes. But let's not deviate from the path…


SEO should focus on improving the user experience, making Google's job (or Yandex's, or Bing's) easier, and making the web and the world a better place.

With that in mind we’ll have a look at some factors that should boost rankings deservedly with a focus on content.


The relevance of Keywords:


Keywords are still very relevant, and their presence as an integral part of the content is regarded by the algorithm as a sign of its usefulness, resulting in better rankings. Too many keywords in the text, though, and it may be penalised as keyword stuffing. The art of placing keywords relevantly in the content, without overdoing it, is advantageous for the right reasons and highly recommended. When keywords appear in the first 100 words of the content, it's a very favorable indication of relevancy.
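As a rough illustration, the two checks above (a keyword appearing early in the text, and overall keyword density) can be sketched in a few lines of Python. The 100-word window matches the figure mentioned here, but any density threshold you pick is your own rule of thumb, not a documented Google value:

```python
import re

def keyword_in_first_words(text, keyword, window=100):
    """Check whether a keyword appears within the first `window` words of a text."""
    words = re.findall(r"[\w'-]+", text.lower())
    return keyword.lower() in words[:window]

def keyword_density(text, keyword):
    """Rough keyword density: occurrences of the keyword divided by total word count."""
    words = re.findall(r"[\w'-]+", text.lower())
    if not words:
        return 0.0
    return sum(1 for w in words if w == keyword.lower()) / len(words)
```

A draft where `keyword_density` comes out suspiciously high is a candidate for rewriting before it reads as keyword stuffing, to a human or to the algorithm.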


Title tags, H1 tags and meta tags are all integral parts of the content and have a very important job. When keywords are present both in the content and in the title tag, the algorithm perceives it as a sign of useful content. When a keyword is also the first word in the title tag, it's even more auspicious. In short, knowing your keywords is crucial, and diligent research is warranted. Brian Dean has also done a wonderful job at Backlinko with his guide to keyword research. The order of keywords is also relevant: the exact order of the keywords queried, present in your text, will rank more favorably than the same keywords in a different order.
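As a small sketch of what auditing your own pages for this might look like, the snippet below uses Python's standard `html.parser` to pull out the `<title>` text and test whether a given keyword leads it. The helper names are mine, for illustration only:

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def title_starts_with_keyword(html, keyword):
    """True if the page's <title> begins with the given keyword."""
    parser = TitleExtractor()
    parser.feed(html)
    words = parser.title.lower().split()
    return bool(words) and words[0] == keyword.lower()
```

Run over a batch of your own pages, a check like this quickly shows which titles lead with their target keyword and which bury it.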

Length of content, links and backlinks:

The length of the content is particularly important. The magic number is 2,500 words: posts of that length have a better chance of being considered good quality. Linking to a post or a site with at least that number of words is better regarded than linking to a rather brief text. Solid brands are well regarded by the algorithm. The location of the links within the page is also important: generally, links embedded in the text are more favorable than links at the bottom. The relevance of links is also an issue. Links to sites and posts relevant to the content give a greater boost than links to reputable sources that aren't relevant to the content.
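A quick way to keep an eye on this while drafting is a simple word count against the 2,500-word figure. That threshold is this article's rule of thumb, not an official Google number:

```python
def meets_length_threshold(text, minimum=2500):
    """Return True if the text's word count reaches the assumed quality threshold."""
    return len(text.split()) >= minimum
```

The same check works on a page you're considering linking to, since, as noted above, links to substantial texts are better regarded than links to brief ones.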

Finally, a site's usability is crucial to its rankings. If the site loads quickly, is mobile-friendly, is updated frequently and has very little downtime, it's well optimised, and that's as important as everything stated above.

In SEO as in life, one must choose wisely. The panacea of SEO is content, content is king, and the user is the raison d'être of everything that exists on the web. Consider this thoroughly.
