Google Algorithms For Search

Hi, we all know Google's first search algorithm was PageRank, developed by Larry Page to rank search results. But why does Google keep changing its search algorithms? Currently, Google Search runs the Fred algorithm, launched on 8 March 2017 with the goal of finding and filtering out search results that have low-quality content but have nevertheless been ranked up to generate revenue. Himanshu Punitha has already discussed Google search result poisoning, so there is no need to explain it again, but Google has been working continuously in this domain since the RankBrain algorithm, so that such results are filtered out of the main search results with the help of machine learning and artificial intelligence.

Google Search Algorithms
S.No  Algorithm    Launch Date
1     PageRank     Sep 1998
2     Panda        Feb 2011
3     Penguin      Apr 2012
4     Pirate       Aug 2012
5     Hummingbird  Aug 2013
6     Pigeon       Jul 2014
7     RankBrain    Oct 2015
8     Possum       Sep 2016
9     Fred         Mar 2017

Brief Information
Google launched Panda to improve search result quality, but it was rolled back within a month due to poor performance. Google started using the Panda algorithm again in 2016, though with other algorithms as the dominant ones. Panda was mainly developed to penalize plagiarism, i.e., websites with duplicate content, and keyword stuffing. After that, the Penguin and Pirate algorithms worked on similar content with PageRank as a backbone, but all of these algorithms have more or less the same goal and do not contribute much to search results.
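Since PageRank keeps coming up as the backbone of these later algorithms, here is a minimal sketch of its classic power iteration. The link graph, damping factor, and iteration count are made up for illustration; Google's production system is far more elaborate.

```python
# Minimal PageRank power iteration over a tiny, made-up link graph.
# damping=0.85 is the value from the original PageRank paper.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # share rank among outgoing links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(sorted(ranks, key=ranks.get, reverse=True))  # C gathers the most rank
```

Intuitively, C ranks highest because both A and B link to it, while B ranks lowest because it receives only half of A's outgoing rank.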
The most prominent change in search results came with the Hummingbird launch. With this launch, Google introduced a significant difference in its algorithm: the primary goal is no longer to filter out results but to return the results most relevant to the user's query, by understanding what the user actually wants from some of their past queries and from similar queries asked by others. To return good results, natural language processing techniques such as cosine similarity, tf-idf, and semantic indexing are used. This algorithm can even rank pages in which the query words do not appear exactly, using word vectors.
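To make the tf-idf and cosine similarity idea concrete, here is a toy relevance scorer over three made-up documents. For simplicity it computes idf over the documents and the query together; real search engines precompute statistics over the whole corpus.

```python
import math
from collections import Counter

# Toy tf-idf + cosine similarity relevance scoring. The documents and
# query below are invented for this sketch.

docs = [
    "google launched the hummingbird algorithm in 2013",
    "pagerank scores pages by counting incoming links",
    "hummingbird tries to understand the meaning of a query",
]

def tfidf_vectors(texts):
    tokenized = [t.split() for t in texts]
    df = Counter(word for doc in tokenized for word in set(doc))
    n = len(texts)
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        # weight = term frequency * inverse document frequency
        vectors.append({w: tf[w] * math.log(n / df[w]) for w in tf})
    return vectors

def cosine(u, v):
    dot = sum(u[w] * v.get(w, 0.0) for w in u)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

query = "what does the hummingbird algorithm do"
vectors = tfidf_vectors(docs + [query])   # last vector is the query
query_vec = vectors[-1]
scores = [cosine(query_vec, v) for v in vectors[:-1]]
print(scores)
```

As expected, the two Hummingbird documents score above the PageRank one, which shares no terms with the query at all.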
Pigeon was launched as an improvement over Hummingbird, with the idea that search results may depend on geographical location. Ranking is now based on a local rank and a global rank, where the local page rank differs for every location but the global rank is the same everywhere. Pigeon is a more sophisticated algorithm; it delivers higher accuracy and faster results. Its most significant improvement is NAP consistency. NAP stands for "Name, Address, Phone", and it plays an important role in page ranking: disclosing this information signals that the blogger or website is genuine, while inconsistency in NAP may result in poor ranking for a vendor's website. NAP inconsistency occurs when the same vendor appears under different identification on different sites. Pigeon ruled the internet for over a year. After that, Google tried to improve its performance on mobile and released a mobile-focused algorithm, but it failed to achieve good results or performance.
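The NAP-consistency idea boils down to normalizing name, address, and phone fields before comparing listings, so that pure formatting differences do not look like different businesses. The business data and normalization rules below are assumptions for illustration, not Google's actual pipeline.

```python
import re

# Illustrative NAP (Name, Address, Phone) normalization: two listings
# that differ only in formatting should compare equal. The example
# business and the handful of rewrite rules are made up.

def normalize_nap(name, address, phone):
    name = re.sub(r"[^a-z0-9 ]", "", name.lower()).strip()
    address = re.sub(r"\bstreet\b", "st", address.lower())   # unify abbreviation
    address = re.sub(r"[^a-z0-9 ]", "", address).strip()
    phone = re.sub(r"\D", "", phone)                          # keep digits only
    return (name, address, phone)

listing_a = normalize_nap("Joe's Pizza", "12 Main Street", "(555) 010-2345")
listing_b = normalize_nap("Joes Pizza",  "12 main st.",    "555.010.2345")
print(listing_a == listing_b)  # True: consistent NAP despite formatting
```

A real system would use many more canonicalization rules (suite numbers, country codes, fuzzy name matching), but the principle is the same: inconsistent NAP across sites is a trust signal against the vendor.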
RankBrain is Google's first algorithm based on machine learning and artificial intelligence; it can infer the meaning behind queries and serve the best search results for them. Google still says that RankBrain is the third most important factor in deciding page ranking. After it, Possum was released with the goal of surfacing competing sites or companies for a query, so that users are not biased toward, or reliant on, a single company: they can look at competitors doing similar work and so have a variety of options to choose from. Possum is based mainly on the physical address of vendors and companies.
Fred is the current algorithm with which Google returns its search results. Fred is not an entirely new algorithm; it is in the lineage of Hummingbird and PageRank, or what we can call the evolution of Hummingbird. Fred targets all websites whose main purpose is to generate revenue through advertisements and which consist of poor-quality content.
Hence Fred wipes out many of the shortcomings of previous algorithms. Thin websites, i.e., those with little content, no longer receive the full ranking penalty they got under Panda. Moreover, the addition of geolocation and NAP consistency to search results created a market for trust: everyone wants their website at the top of Google's list, so some try to poison searches with various techniques. Thanks to AI and ML, Google's bots recognize malicious content and sharply decrease the ranking of those websites. Hence, nowadays techniques like keyword stuffing and spam linking (i.e., links coming from spammy sites) no longer affect search results.
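Keyword stuffing in particular is easy to flag automatically, because it shows up as an abnormal keyword density. Here is a crude sketch of that idea; the sample texts are invented, and no documented Google threshold is implied.

```python
# Crude keyword-density check, illustrating why keyword stuffing is
# simple to detect automatically. Sample texts are made up.

def keyword_density(text, keyword):
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

stuffed = "cheap shoes buy cheap shoes best cheap shoes cheap shoes online"
natural = "our store sells a wide range of footwear at fair prices"

print(round(keyword_density(stuffed, "cheap"), 2))   # 0.36
print(keyword_density(natural, "cheap"))             # 0.0
```

A real spam classifier combines many such signals (link profiles, content quality, user behavior), but a density this far above normal prose is a strong red flag on its own.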
It is good to follow Google's guidelines; otherwise your website will suffer a massive penalty from the page-ranking algorithms.

