History and Evolution of SEO



What is SEO?

Search Engine Optimization

SEO stands for "search engine optimization": the process of improving a site's visibility to users who search with relevant keywords. Businesses rely on SEO to improve their website's ranking when users search with engines such as Google and Bing.

History of SEO

The history of SEO goes back to 1991, when the world's first website was created. In 1994, big players like AltaVista and Yahoo joined the scene. But SEO became what it is today thanks to two college students whose project grew into the biggest, most recognized search engine to this day. That search engine was called "BackRub"; it eventually became Google, which was registered as a domain in 1997. Initially it was used by the founders' friends and family, and a query did not return instant results like we get today; it could even take 24 hours to produce an answer. During its initial launch, Google faced competition from Yahoo, a search engine everyone knew of, with subsidiaries such as Yahoo Mail, Yahoo News, Yahoo Answers, etc.

Google brought SEO to where it is today after the events of September 2001 (also referred to as 9/11), which shook the entire country and the whole world. At that moment many people searched Google for "World Trade Center" and got no useful results. Google officials took the matter seriously for two reasons. First, this tragedy was one of the biggest ever to hit America, everyone in the world came to know of it, and everyone expected results. Second, the World Trade Center was among the tallest and most recognizable buildings in the world at the time. In a meeting with search engineers, it was revealed that many webpages were simply not crawlable.

Typically, search engines work through three primary functions:

  1. Crawling: Scanning a new webpage via web crawlers or spiders.
  2. Caching: Taking a snapshot of that webpage.
  3. Indexing: Saving the snapshot in the index, which functions like a library.
Unlike today, webpages 25 years ago were displayed as showpieces. They had no relevant information or content, only various animations, so crawlers had no way to detect what category a page belonged to.
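The three functions above can be sketched as a toy pipeline. This is a minimal illustration, not a real crawler: the "web" here is an invented in-memory dictionary standing in for pages fetched over HTTP.

```python
# Toy sketch of the crawl -> cache -> index pipeline (illustrative only).

# Hypothetical mini-web: URL -> page text
web = {
    "https://example.com/a": "freelance seo services for small businesses",
    "https://example.com/b": "animations and showpieces with no real content",
}

cache = {}   # snapshot store: URL -> copy of the page as seen at crawl time
index = {}   # inverted index: word -> set of URLs containing that word

def crawl(url):
    """Step 1: fetch (here, just look up) the page."""
    return web[url]

def cache_page(url, content):
    """Step 2: take a snapshot of the page."""
    cache[url] = content

def index_page(url, content):
    """Step 3: file the snapshot in the index, like a library catalogue."""
    for word in content.split():
        index.setdefault(word, set()).add(url)

for url in web:
    content = crawl(url)
    cache_page(url, content)
    index_page(url, content)

print(sorted(index["seo"]))  # which cached pages mention "seo"
```

A page full of animations with no text, like the showpieces described above, would contribute nothing to the index and so could never be found by a search.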

The main objective was to make as many webpages as possible crawlable, so that more of them could be displayed in search results, which would have a huge impact on Google's future. For this to happen, the webmaster (website custodian) had to decide whether they wanted their website to appear in search results and make changes accordingly. Google made its SEO Starter Guide public to webmasters, who began optimizing their websites to get displayed in search results. Businesses started to get leads as a result, since their optimized websites now showed up in searches.

Even though things were starting to look good, one human tendency, greed, can overtake us all and destroy the sand castle built from scratch. Some webmasters began over-optimizing their websites, which led to low-quality search results in Google and threatened to breach the trust between Google and its users. For example, imagine landing on a website that posts fake film reviews, or whose information is simply wrong. That would inevitably affect your trust in Google. To prevent this, Google changed its algorithm to keep such webpages from being displayed. Wondering why over-optimizing a webpage leads to bad search results, and what changes Google made to prevent it? That is explained when we talk about the Evolution of SEO.

Evolution of SEO

Now that we know the history of SEO, it is time to see how much it has evolved over the years. Just as we grew from a single cell, SEO has evolved from a single-celled organism into a complex multicellular one within two decades, passing through a series of algorithmic changes.

1. Content-Specific

In its initial years, Google's search engine was content-specific. When a website had more of a specific keyword in its content than a competitor, the one with more keywords ranked higher in the search results. For example, assume there are two freelance companies, "A" and "B", and their target keyword is "Freelance SEO". If A has more of that keyword in its content than B, A wins in the search rankings. To take advantage of this algorithm, webmasters used an unethical tactic called keyword stuffing, a black-hat SEO technique for ranking a webpage in Google. They would stuff their website with specific keywords and hide them with matching background colors so users couldn't see them, and their webpage would be displayed as the best result. However, this meant they focused only on rankings and did no on-page optimization (also called on-site optimization), the part of SEO where many other areas of a webpage are improved. In a nutshell, compare Apple's website with some random, unoptimized phone website sitting in the third result of a Google search. Which one would you choose, and which one would you believe? Hopefully that clears up how bad search results impacted users.
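A purely content-specific ranking amounts to counting keyword occurrences, which is exactly what made keyword stuffing profitable. A minimal sketch, with the two sites' texts invented for illustration:

```python
# Sketch of content-specific ranking: more occurrences of the target
# keyword phrase means a higher rank. Site texts are made up.

def keyword_count(text, phrase):
    """Count occurrences of a keyword phrase in a page's text."""
    return text.lower().count(phrase.lower())

site_a = "Freelance SEO tips. Hire our Freelance SEO experts for Freelance SEO work."
site_b = "We offer freelance SEO services and web design."

pages = {"A": site_a, "B": site_b}
ranking = sorted(pages, key=lambda p: keyword_count(pages[p], "freelance seo"),
                 reverse=True)
print(ranking)  # the site repeating the phrase more often ranks first
```

Stuffing hidden keywords simply inflates this count without improving the page, which is why the results users saw got worse.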



2. Link-Specific 

Once the content-specific algorithm started to show its weaknesses, Google moved on to the next stage: link-specific ranking. As the name suggests, it works through recommendations from other websites via hyperlinks. Take companies "A" and "B" again: two websites link to company A's site while five link to company B's. In this contest, company B takes the throne. Basically, more recommendations from other websites mean more votes and higher rankings in the search results. But as time passed, users found loopholes in this algorithm and began exploiting it. Instead of stuffing keywords, they began buying links from other webpages, and the search results filled with low-quality pages again.
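The A-versus-B example reduces to counting inbound links as votes. A sketch with an invented link graph:

```python
# Sketch of link-specific ranking: each hyperlink pointing at a site
# counts as one vote. The linking sites below are made up.

links_to = {
    "A": ["site1.com", "site2.com"],                      # 2 sites link to A
    "B": ["site1.com", "site3.com", "site4.com",
          "site5.com", "site6.com"],                      # 5 sites link to B
}

ranking = sorted(links_to, key=lambda site: len(links_to[site]), reverse=True)
print(ranking)  # B outranks A: more inbound links, more votes
```

Since every link counts the same here, buying cheap links from anywhere games the ranking, which is the loophole described above.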


3. Quality Link-Specific

This is similar to link-specific ranking, except you can't be recommended by just any webpage: the page recommending your link should itself hold a good PageRank. PageRank was another algorithmic change brought in by Google, scoring the quality of a website on a scale of 0-10; the higher the rank, the more trustworthy the website is to users. Now you might think this was the end of Google's troubles, right? Long story short, users started to get links from high-PageRank websites, and this became a never-ending headache that Google couldn't get under control.
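The idea that a link from a well-ranked page is worth more can be sketched with a simplified PageRank iteration. This is a toy version on a tiny made-up link graph; real PageRank applies the same idea at web scale with many refinements.

```python
# Simplified PageRank power iteration on an invented three-page graph.

damping = 0.85
graph = {            # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

ranks = {page: 1.0 / len(graph) for page in graph}
for _ in range(50):  # iterate until the ranks roughly stabilize
    new_ranks = {}
    for page in graph:
        # Sum the rank passed on by every page that links here
        incoming = sum(ranks[p] / len(graph[p]) for p in graph if page in graph[p])
        new_ranks[page] = (1 - damping) / len(graph) + damping * incoming
    ranks = new_ranks

print({p: round(ranks[p], 3) for p in sorted(ranks)})
```

Page C ends up with the highest rank because it collects links from both A and B, while B, linked only once from A, ranks lowest: a link's value depends on who gives it.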

4. Passing the Juice

This is the next algorithmic change Google brought in to fix its never-ending headache of exploitation. Here, a value or equity is passed from one page or site to another when it recommends it. Let me explain with an analogy to make this simpler.

Assume that on Earth, two people are born (value for the website) every second while one person dies (passing of value). This ensures Earth always has a decent population of our species. But what if it were the opposite? If one person were born while two people died, our species would go extinct.

Now link that analogy to websites. When a page-ranked website recommends another website, it gives away part of its equity or value to that website. Think about what happens if the same page-ranked website gives out too many links: just as you guessed, it loses its current PageRank, because it gave away more value to other websites than it could generate.

What if a website wants to post a reference link to another site to give credit, or simply to provide more information about a product, without giving away its value or equity? To achieve this, webmasters add a small attribute, rel="nofollow", to the anchor tag in the HTML. Since we are not deep-diving into HTML here, I will just show the two forms of the anchor tag. The first is the normal hyperlink; the second prevents the passing of value or equity.

<a href="https://google.com"></a>
<a href="https://google.com" rel="nofollow"></a>

Introduction of Google AdWords and AdSense

On October 23, 2000, Google introduced AdWords. It is an online advertising platform that lets businesses bid for advertising space in Google search results to show advertisements, product listings, videos, etc.


On June 18, 2003, Google launched AdSense. Wondering what the difference is? Simply put, AdSense is a system that allows publishers and website owners to sell ad space on their websites. In other words, you pay to use AdWords, but AdSense can earn you money.

After the success of AdSense, Google was more wary than ever about changing its algorithm, because changes could affect its revenue. Suppose Google earns $500 from a website, with a certain percentage going to the owner; if an algorithmic change caused that site to drop in the rankings, Google's income from it would fall. So the years 2003-2007 were very tough for Google.

2008-2010

In 2008, Google introduced personalized search results. Search personalization adds a bias to a user's search queries: if a user has a particular set of interests or browsing history and researches a controversial issue, the search results will reflect that.

In 2009, website ranking began to depend on user interaction. If users spent more time on a website instead of immediately leaving, it earned a better ranking. This pushed webmasters to optimize their websites for every device. User interaction data was collected to score each site.

Bounce rate is single-page sessions divided by all sessions: the percentage of sessions in which users viewed only a single page and triggered only a single request to the analytics server. When user interaction data is collected, bounce rate is part of it. The greater the bounce rate, the lower the website's ranking and the weaker the trust between the website and its users.
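The bounce rate definition above is a simple ratio. A sketch with invented session data:

```python
# Bounce rate = single-page sessions / all sessions.
# The session records below are made up for illustration.

sessions = [
    {"pages_viewed": 1},  # bounce: user left after a single page
    {"pages_viewed": 4},
    {"pages_viewed": 1},  # bounce
    {"pages_viewed": 2},
]

bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # 2 of 4 sessions bounced -> 50%
```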


In 2010, Google began including social media rankings when ranking webpages. This was called the social media signal. If a company or website had social media pages on various platforms, it had a higher chance of getting more user interaction and hence a better ranking overall. That interaction could mean sharing the content, liking it, or commenting on posts from the social media pages.





