Important Google Algorithm Updates

Google interacts with users like a personalized assistant, and people all around the world began to trust Google more than any other source of information. Business webpages with high rankings made better profits and connections compared with webpages that ranked low. Webmasters and website owners therefore began to use cheap malpractices to inflate their rankings, which resulted in low-quality or spam-filled websites and caused great inconvenience to users. Google, whose main priority is the trust of its users, formed a team to adopt measures that protect the quality of results and of websites. Hence Google introduced various algorithm updates.


GOOGLE PANDA ALGORITHM



The Google Panda update, also known as the Farmer update, was introduced in 2011. This update targeted content spamming. Common forms of content spamming include:

1) Content Duplication - Content is copied from other websites.

2) Low-Quality Webpages - Websites with poor content, grammatical errors and spelling mistakes.

3) Thin Webpages - Websites with very little content.

4) Keyword Stuffing - Overuse of keywords in a webpage for the sole purpose of ranking.

5) Content Spinning - The same content is rearranged and republished on different pages.

The Panda algorithm was a big success in filtering out spam content. On May 19, 2014, Panda 4.0 was introduced, and Panda eventually became a permanent filter in Google.


PENGUIN ALGORITHM




In 2012, Google introduced the Penguin algorithm update, which targets link spamming. This includes:

1) Paid Links - Links are purchased from link sellers.

2) Link Exchange - Websites exchange links with each other.

3) Link Schemes - Links are created automatically using programs or applications.

4) Comment Spamming - Links are shared through the comment sections of other websites.

5) Wiki Spamming - Logging in to Wikipedia as a volunteer editor and adding links there.

The Penguin algorithm also targets link farming, link automation, low-quality links, overuse of internal links, overuse of anchor text and manipulative guest blogging. In 2016, Penguin 4.0 was introduced, which penalizes websites with link spam in real time.

  

PIGEON ALGORITHM UPDATE


Unlike the updates above, the Pigeon algorithm is used to promote local SEO. For certain categories of business searches, results are listed based on the searcher's location, derived from the IP address. To benefit from it, first submit your business to Google My Business. Add a location page to your website containing the address, phone number and PIN code. Submit the website to local directories such as Justdial, embed a map placemark in the contact or about page, and target local regions in your social media interactions. The Pigeon algorithm was first introduced in the U.S. and was a huge success. It is a very important algorithm since it promotes local business searches.


HUMMINGBIRD ALGORITHM



The Google Hummingbird algorithm was put forward in 2013. It is based on semantic search: rather than matching keywords alone, it tries to understand the meaning behind a query and gives deeper information about a website. Conversational questions are answered by indexing related topics, and semantic search results also use user feedback for ranking purposes. The disadvantage of this algorithm is that even if the feedback is incorrect, Google may still list it in the results.


RANKBRAIN ALGORITHM


In 2015, Google introduced the RankBrain algorithm, which is considered an update or second part of the Hummingbird algorithm. This algorithm uses artificial intelligence to interpret queries and track user interactions, allowing the system to behave more like a human and refine search results automatically.


MOBILEGEDDON


Mobilegeddon, released in 2015, is a mobile-friendly update. Since more people had started using the internet on their smartphones, Google began to concentrate on mobile-friendly websites and adjusted rankings accordingly. A mobile-friendly website can be read without zooming or tapping, has no unplayable content, requires no horizontal scrolling, and has appropriately spaced tap targets. For about a year, Google still allowed content-rich but non-mobile-friendly pages to rank, but after introducing the Mobile Friendly 2.0 version on May 12, 2016, Google enforced this algorithm strictly.


PARKED DOMAIN UPDATE


If a domain is available, we can book it without creating a website. These types of domains are called parked domains. They have no real content, but advertisements can be placed on them, and Google used to index them. Since parked domains cause inconvenience to users, Google introduced the Parked Domain update to filter them out of the results.


EMD or EXACT MATCH DOMAIN UPDATE


On 28th September 2012, the EMD update was introduced. A domain name can be created using the focus keywords, but if that website stays inactive or low quality for a long time, Google will remove it from the rankings.

PIRATE UPDATE


If the content, images or videos of a website are copied to another website without the webmaster's knowledge, this can be reported to Google, and action will be taken against the culprit under the DMCA (Digital Millennium Copyright Act) of 1998.


BERT (BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS)


The BERT update is a natural language processing update that helps Google understand the context of words in a query. It was first rolled out for English searches in October 2019 and was later extended to more than 70 other languages, including Hindi and Arabic.

Search Console and its Verification Methods

Google Search Console (formerly Google Webmaster Tools) is a basic tool used in the process of search engine optimization. It is a free resource provided by Google that acts as a communication channel between the site owner and Google. Using this platform, Google can alert site owners about mistakes and errors in a webpage, and Google treats Search Console as its main point of contact with webmasters.

 



Search Console lets SEO activities be carried out in a controlled way. It helps us target a particular country, audience or language, and it also helps us find issues with device, mobile and browser compatibility.

First, copy the URL and go to Search Console. The Google account used to log in to Search Console must be the same as the Gmail account that owns the site; in that case verification can happen automatically, otherwise verification has to be done manually. In Search Console there are two property options, Domain and URL prefix. Since we are adding a subdomain, paste the URL into URL prefix, press Continue and verify the property. Next, select the URL Inspection option, paste the URL and press Enter. If it shows that the URL is not available on Google, click Test Live URL; once it shows that the URL is available, click Request Indexing and the page will be queued for crawling.

As a part of SEO, we use a sitemap, an XML document that lists all the related URLs of a website. It contains details such as how frequently each URL changes, its last modified date and its priority. The sitemap enables Google to understand and crawl the site easily. The maximum file size is 10 MB; anything larger is treated as invalid. The maximum number of URLs in one sitemap is 50,000; to go beyond that limit, the URLs can be split into multiple categorized sitemaps. Google can parse an XML sitemap in seconds, far faster than discovering the same URLs by crawling the HTML pages, and a customized XML sitemap can also be read easily by users.
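As a minimal sketch, a single sitemap entry looks like this (the URL, date and values below are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/sample-page/</loc>
    <lastmod>2022-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Each additional URL of the site gets its own <url> block inside the same <urlset>.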

Search console has 5 ownership verification methods. They are: 


1) HTML FILE VERIFICATION

HTML file verification is the most recommended method. Changes to meta tags, templates, analytics code or design will not cause any issues and will not break the connection with Search Console. Google provides an HTML file that must be uploaded to the root directory of the website; after that, click the Verify button and verification is done.
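As a sketch, the file Google provides is named after the verification token (the token below is a placeholder) and contains a single line:

google-site-verification: google1234567890abcdef.html

Once uploaded to the root directory, it should open at yoursite.com/google1234567890abcdef.html, and clicking Verify completes the check.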


2) HTML TAG VERIFICATION

Search Console provides an HTML meta tag that must be copied. Go to the settings of your website, open the theme editor, and paste the meta tag into the head section of the HTML, before the opening body tag. Then click the Verify button.
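The tag has the following form, where the content value stands in for the token Search Console gives you:

<meta name="google-site-verification" content="your-verification-token" />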

 

3) GOOGLE ANALYTICS   

Google Analytics is a free resource from Google used in the process of SEO. It provides complete details about website visits, user interaction time and other information about the site's resources. Ownership can be verified using the Analytics tracking code already installed on the site.
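A typical gtag.js tracking snippet pasted into the head section looks roughly like this; the measurement ID G-XXXXXXXXXX is a placeholder for your own property ID:

<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}  // queue commands for Google Analytics
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');              // placeholder measurement ID
</script>

Verification succeeds when Search Console finds this snippet on the homepage and the logged-in account has access to that Analytics property.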





4) GOOGLE TAG

In this method, we use Google Tag Manager. The Google tag is provided as a container snippet, and the Tag Manager container ID is used to verify ownership. The container snippet is JavaScript code pasted into the head and body sections; once it is in place, tags and tracking changes can be added without touching the site code directly.
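The standard container snippet looks like the following, with GTM-XXXXXXX as a placeholder container ID; the script part goes in the head and the noscript part just after the opening body tag:

<!-- Google Tag Manager (head) -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');</script>

<!-- Google Tag Manager (noscript, just after <body>) -->
<noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-XXXXXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>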




  
5) DOMAIN NAME PROVIDER


Purchase a domain name from a domain name provider such as godaddy.com or namecheap.com. Copy the text provided by Search Console into the DNS records section of the provider. Then click the Verify button and verification is completed.
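The entry added at the provider is usually a TXT record of roughly this form, where the token is a placeholder supplied by Search Console:

Type: TXT    Host: @    Value: google-site-verification=abc123placeholdertoken

DNS changes can take some time to propagate, so verification may not succeed immediately.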

On Page Optimization in SEO

On-site or on-page optimization deals with making web pages clear and easy for the bot to access; it is the optimization that takes place inside the web page. Off-page optimization, on the other hand, deals with optimization work done outside the web page in order to increase its ranking.



An HTML page mainly consists of a head part and a body part. The head part holds the title and the meta description (along with the page URL), and the body part contains the content. Google first analyses the URL, which gives it a hint about the web page. It then analyses the title and the meta description, followed by the content, and thus gets a clear idea of the page. Each search result listed in Google is called a snippet, which consists of the URL, title and description; on mobiles, the favicon is also shown in the snippet. The head section is optimized first. On-page optimization methods involve:
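As a rough illustration of this structure, with placeholder text throughout, the head and body sections of a page look like this:

<html>
  <head>
    <title>Focus keyword in a clear, unique page title</title>
    <meta name="description" content="A short, relevant summary of the page content." />
  </head>
  <body>
    <h1>Main heading of the page</h1>
    <p>Page content goes here...</p>
  </body>
</html>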

1) Title Optimization 

During title optimization, the title must have no spelling or grammatical mistakes and must not be written entirely in capital or lowercase letters. The number of characters should be limited to about 55-60; Google's title field holds roughly 70 characters and 512 pixels of width, and if the title exceeds 512 pixels, Google truncates the extra characters with dots, which looks unattractive to viewers. The title must be unique, relevant to the content and contain the focus keywords.



2) Meta description Optimization 

If Google finds our meta description inappropriate or irrelevant, it takes text from the page (such as the h1 or h2 headings) as the description instead. Hence the meta description must be attractive, contain the focus keywords and have no spelling or grammatical mistakes. For a blog post the character limit is about 155 (since Google also includes the published date), whereas for ordinary web pages it is 155-160 characters; the pixel width limit of a meta description is about 1024 pixels. Robots meta tags are used to give instructions to the bot on whether to index and follow a page. Click-through rate (CTR) is the percentage of impressions that result in clicks.

<meta name="robots" content="index, follow"/>

3) Body section Optimization 

The main heading is the most important part of the body section. This heading, the h1 tag, can have 7 to 8 words, must be free from spelling mistakes and grammatical errors, and can include matching keywords. Only one h1 tag should be present: if multiple h1 tags contradict each other, the page becomes confusing for both viewers and the bot, and Google may crawl and hold such pages in a temporary database known as the sandbox.

Usually one h2 tag is enough, though large articles can use more than one. After the h2 tags, Google analyses the h3 and h4 tags. All of these tags must be relevant to the content.

4) Anchor Optimization 

Anchor tags, which add hyperlinks, speed up the crawling of a page. However, our focus keywords should not be used as the anchor text of a link that leads viewers to another website targeting the same keywords: Google treats links to other sites as recommendation votes, so our own website would lose ranking.
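For instance, a descriptive anchor pointing to another page (the URL and text are placeholders) can be written as:

<a href="https://www.example.com/on-page-seo-guide">detailed guide to on-page SEO</a>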



5) Keyword Optimization 

Keyword density is the number of times a keyword is used in a web page. Google does not rank pages based on keyword density, but the keywords must appear at least once for better ranking. According to old-school SEO practice, focus keywords were made bold and placed just before a comma or a full stop in order to increase ranking, but such practices are not approved by Google.

6) Image Optimization  

Image optimization mainly covers the file name, the alt attribute and the surrounding content. The file name should match the alternative (alt) text, and when the name has two or more words, hyphens must be used instead of spaces, e.g. image-name.png.
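Using the same placeholder file name, the image tag with its alt attribute would look like:

<img src="image-name.png" alt="image name" />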



History and Evolution of Search Engine Optimization



HISTORY OF SEO

A search engine is software designed to search for information on the internet. Search Engine Optimization (SEO) is the process of improving a website's results in a search engine. One of the most prominent search engines is Google.

Google was created by two students as part of their university research project in 1998. At first it ran as a subdomain under their university website, where search results were sent to a submitted email id, a process that took more than 24 hours. By 2000, the internet was widely used around the globe. On September 11, 2001, following the Al-Qaeda attack on the World Trade Center in America, people searched Google for news about the event, but the results were disappointing. This troubled Google, since it could not provide details about one of the most famous buildings in the world. Google's engineers pointed out that the problem was that not all of the available web pages were crawlable by Google.

Now, what is crawling? A search engine works through three steps. Crawling is the process by which a program called a bot (also called a robot or spider) scans a new or updated page. A snapshot of that page is then saved into the database according to its category; this saving process is called caching. Finally, search results are retrieved through the process of indexing.

In order to crawl every web page on the internet, the webmasters who control those pages had to be given guidelines on optimization practices. This raised a concern among Google's engineers, since such guidance touched on their trade secrets. But with no other solution left, they published a 32-page document called the "SEO Starter Guide". Thus Search Engine Optimization began.

  EVOLUTION OF SEO


In the beginning, Google's ranking was niche specific or content specific. The keyword meta tag concept was used to rank web pages, so ranking increased with the number of keyword occurrences. This led webmasters to use keyword stuffing, a black hat SEO technique (an unethical practice) in which keywords are overused within a web page. Realizing this, Google changed its algorithm to be link specific: a website that received more links from other websites ranked higher, with each hyperlink counted as a recommendation vote.

The loophole here was that website owners started selling links to webmasters who were looking for them. Google then put forward a quality-link approach, where pages were scored out of 10; the score reflected the trust value of that web page, and about 200 other factors were considered along with the links. Even so, owners of pages with high page rank started selling links to webmasters.

Hence Google introduced a new tactic, 'passing the juice'. If a page with a good rank gives links to other web pages, part of its ranking value is passed on to them, which can dilute its own ranking. This link value, or equity, is what flows from one page to another. Google also put forward the 'nofollow' mechanism, which stops equity from being passed through a link.

<a href="http://link-path.com" rel="nofollow">anchored text</a>

Why did Google hesitate to change its algorithms? During the period from 2003 to 2008, Google gained popularity, grew rapidly and entered the field of advertising. Google AdWords, later renamed Google Ads and based on pay per click (PPC), is Google's advertising service for businesses, where advertisements are displayed on Google. AdSense is a program run by Google through which website owners can display business advertisements on their own websites. A change in the algorithm changes which keywords rank, which affects website visibility and could therefore decrease Google's revenue.


In 2009, personalised search results and auto-suggestions were introduced, and Google began to interact more with the user; web page ranking also started to take user interaction into account. Pogo sticking describes the case where a user clicks the first result, leaves it quickly without spending time on it, then visits the next result and spends much longer there; Google then gives the second page the higher rank. By 2010, social media platforms like Facebook and Orkut had gained popularity among internet users, and ranking also began to consider social media signals and their influence.
