Google SEO: Penguin updates and their effects

As general awareness of the internet grew, it evolved from a military application into a global phenomenon. Archie (regarded as the first Internet search engine) came into existence, and it was the first of many that would soon follow. Along the way, major companies like Yahoo bought out competitors. Did you know Yahoo is currently the oldest surviving search engine? Just a couple of years ago this title would have gone to AltaVista, which was shut down in 2013.

With modern web-based search engines, SEO came into existence: being ranked at the top of a search engine's results could improve your overall revenue. Since Google now controls about two-thirds of overall search traffic, we will use Google as our case study.

Rise of Black Hat SEO in the PageRank System

Backdoor and black hat tactics have always been a part of marketing. Some are harmless, but most run counter to consumer interests. This is what happened with Google and other search engines, so it was only natural that a series of updates was released over the next decade to curb the habit. After major manipulation in PageRank's first year, Google strengthened it periodically with new layers, which eventually led to the first Penguin release in 2012. This release shook up the industry quite a bit. Some top-ranking major sites were blacklisted; yes, we are referring to you, J.C. Penney. Many others were also adversely affected.

Purpose of SEO and Keyword Densities

For our new readers, we would like to point out that the major reason for SEO (Search Engine Optimization) is to drive the maximum amount of traffic to your website. This can lead to increased business opportunities or increased advertising revenue. SEO was initially achieved using keyword densities and anchor texts, but so much spamming was involved that keyword densities have since lost their central place in rankings.
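To make the keyword density idea concrete, here is a minimal sketch: density is simply the share of a page's words that match the keyword. The function name and the "bedding" sample text are our own illustrations, not anything Google publishes.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

# Hypothetical keyword-stuffed snippet: 4 of its 10 words are "bedding".
page = "Cheap bedding. Buy bedding online. Best bedding deals on bedding."
print(f"{keyword_density(page, 'bedding'):.0%}")  # 40%
```

Early spam pages pushed numbers like this absurdly high, which is exactly why the signal stopped being trustworthy.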

Many internet users will remember a time when searching for a particular term used to pull up pages that were useless for humans and contained only an unintelligible jumble of words repeated over and over again. Even now, pages sometimes come up that are high in keyword density and have little else of benefit.

Introduction of Link Analysis and Link Quality

This led Google to introduce link analysis in its algorithms. Very simply put, this new criterion ranked websites with the most links back to them from other sites higher. This too was manipulated heavily, and Google subsequently started analyzing the quality of the links. This led to complex updates like Google Panda (2011), aimed at curbing spam tactics, and eventually to the Penguin 1.0 update in April 2012, which was followed by successive updates in the coming months.
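The idea behind link analysis is easiest to see in the textbook PageRank formulation: a page's score is the rank flowing in from the pages that link to it, damped toward a uniform baseline. The sketch below is a simplified model with a hypothetical three-page web, not Google's production algorithm.

```python
def pagerank(links, damping=0.85, iters=50):
    """Textbook PageRank over a {page: [pages it links to]} graph."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            targets = outs or pages  # dangling pages spread rank everywhere
            for t in targets:
                new[t] += damping * rank[page] / len(targets)
        rank = new
    return rank

# Hypothetical three-page web: A and B both link to C, so C ranks highest.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))
```

Because raw link counts drive the score, the obvious manipulation was to manufacture links, which is what the updates described below set out to punish.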

Penguin 1.0 – 2.0

Two main things were hit by Google in these updates: over-optimized anchor texts and irrelevant/bad-neighbor backlinks. Anchor text is the clickable text in a link, which helps Google understand what your site is about. However, these links were being spammed with keywords to contribute to keyword density, and Penguin 1.0 targeted this heavily: over-optimized anchor text profiles started to get penalized. Here's an example: you have a home and decor store selling a range of products. If your main product is bedding and you use it in all your anchor texts, then, chances are, you will get penalized.
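One way to picture an "over-optimized profile" is to look at how lopsided the anchor text distribution is. Here is a minimal sketch; the 60% threshold and the sample profile are our own illustrations, since Google has never published exact cutoffs.

```python
from collections import Counter

def flag_over_optimized(anchors, threshold=0.6):
    """Return anchor texts that exceed `threshold` share of the link profile."""
    counts = Counter(a.lower() for a in anchors)
    return [a for a, n in counts.items() if n / len(anchors) > threshold]

# Hypothetical backlink profile for the bedding store from the example above.
profile = ["bedding"] * 7 + ["Home & Decor Co", "click here", "www.example.com"]
print(flag_over_optimized(profile))  # ['bedding'] -- 70% of all anchors
```

A natural profile tends to mix branded anchors, bare URLs, and generic phrases like "click here"; a wall of one commercial keyword is the giveaway.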

Irrelevant links are much simpler. If you own a specialty store selling fashion products and your links are coming from a niche such as theoretical physics, then your backlinks are considered low quality and will be penalized. The most appropriate links for you would come from fashion blogs and other fashion websites. Bad-neighbor links come from adult websites.

Penguin 2.0 – 3.0

Penguin 2.0 went much deeper into link quality analysis. It affected the inner pages of your website as well, whereas 1.0 had analyzed only your home page. All these measures were taken to stop link spamming, which would in turn lead to high-quality content websites being ranked higher.

Black hat techniques like article spinning and tiered link building became very difficult to maintain. Article spinning is when an original article is taken and then passed through software that uses spintax: a syntax for embedding lists of synonyms and alternative phrasings. The software uses these to replace the original words so that the new article is plagiarism-free. Although new articles are produced this way, the content is of very low quality and most of the time not even readable by humans. It does, however, allow SEOs to use them for links.
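For readers curious what spintax looks like in practice, here is a minimal sketch of a spinner (our own illustration; real spinning tools are far more elaborate). Alternatives are wrapped in braces and separated by pipes, and the spinner picks one option per group:

```python
import random
import re

GROUP = re.compile(r"\{([^{}]*)\}")  # innermost {option|option|...} group

def spin(template):
    """Resolve spintax by picking one option from each {a|b|c} group."""
    while (m := GROUP.search(template)):
        choice = random.choice(m.group(1).split("|"))
        template = template[:m.start()] + choice + template[m.end():]
    return template

tpl = "{Cheap|Affordable|Budget} bedding {deals|offers} for your {home|house}."
print(spin(tpl))  # e.g. "Affordable bedding offers for your house."
```

Run on a full article, a template like this can churn out dozens of "unique" variants, which is exactly the mass-produced filler Penguin was built to devalue.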

Tiered link building is building multiple sites around your own site just to provide backlinks. These engineered sites contain nothing else and serve only the purpose of providing backlinks via poor-quality, barely human-readable content. Link networks were also hit very badly in these updates. Other spam methods that became difficult include comment-link spam, forum spam, and blog spam.

But what about sites that were hit by these updates and have subsequently removed the bad links? This is where data refreshes come in. Each update also refreshes the data and then gradually restores a site's ranking according to the new filters.

Penguin 4.0 and the Future

Penguin 4.0 was to be launched in late October 2015; however, it could not be rolled out in time and is now scheduled for next year. It will continue to crack down on spam and promote quality content.

There are many other factors that affect page ranking in search results, but links are still a very important one. These updates have already reduced successful manipulation by SEOs using black hat tactics, and they will continue to do so in the future. So, if you are an SEO or a company CEO, it would pay off in the long run to focus on quality content and hire SEOs who do not use these shady practices. Too many spammy websites in the top results would mean fewer people using Google, and Google is definitely not going to hurt its own interests.

Thank you, Kayla Ethan, for this guest post! She works at Rebates zone.



About the Author

Spencer helps you save time by teaching digital marketing and social media strategies in plain English, after proving they actually work for himself and his company AmpliPhi first. He is also an instructor at the University of Wisconsin and Rutgers University.

Spencer X Smith
