After Matt Cutts recently hinted that overly optimized sites could draw a penalty, everybody started wondering what exactly that would mean. Where do you draw the line between optimization and over-optimization? How exactly would Google decide which site is optimized and which is over-optimized?
Let’s get into the depth of the situation and analyse what Google could consider over-optimization:
Everybody knows Google loves unique content, specifically readable unique content that adds value for the user. We also know that Google doesn’t love thin content or content written specifically for search engines. With Google able to detect synonyms and judge uniqueness, there is little scope left for on-site optimization beyond increasing the value of the content on the website. This means the content on your site no longer needs a particular keyword repeated X number of times for optimization. If the content is related, Google will identify it through relevance and synonym signals.
Having said that, what would really trigger the over-optimization alarm is the offsite link building factors. We know that Google already has several checks in place to identify link farms and masses of low-quality links. But over-optimization can also be identified by the speed at which a site gains backlinks. For years, search engine optimizers slowed down their optimization work to avoid penalties; now they not only need to slow down but also need to use a combination of link building tactics to escape this over-optimization filter.
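To make the idea concrete, here is a minimal sketch of how a link-velocity check might look. The monthly counts, the niche average, and the `spike_factor` threshold are all illustrative assumptions, not known values from Google:

```python
def velocity_flags(monthly_links, niche_avg, spike_factor=3.0):
    """Flag months whose new-backlink count far exceeds the niche norm.

    `spike_factor` is an illustrative threshold, not a value Google
    has ever published.
    """
    return [count > spike_factor * niche_avg for count in monthly_links]

# Hypothetical monthly counts of newly acquired backlinks for a site,
# and an assumed average monthly link velocity for its niche.
site_monthly_links = [40, 55, 480, 620]   # sudden spike in months 3 and 4
niche_avg_monthly = 60

print(velocity_flags(site_monthly_links, niche_avg_monthly))
# → [False, False, True, True]
```

The point is simply that a sudden burst of links, far above what is normal for the niche, is the kind of pattern an over-optimization filter could pick up on.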
While planning a link building strategy for a particular site, note the following:
1. The niche of the site and the average link building speed of sites in the same niche. For example, an entertainment niche with a big social presence can attract huge numbers of backlinks naturally, far more than a site selling heavy machinery. If a site selling heavy machinery is getting lots of links through social networks and appears to be going viral, that would trigger the over-optimization filter.
2. A huge number of backlinks to every page of a website can also trigger over-optimization. A website with natural links will not have the same number of backlinks on every page; the counts will vary a lot. If the backlinks are inconsistent with the content quality on a page, or are focused on just a few pages, that’s a red flag too.
3. Heavy internal linking: internal linking was an important strategy for SEOs over the last decade. However, internal links that don’t invite the user to read on, and look purely like an SEO tactic, will trip the over-optimization trigger. Examples include linking to the same page repeatedly with keyword anchor text, or using keyword anchor text instead of a HOME button. Internal links should be placed where they add value for the reader; as a rule of thumb, at least one in ten readers should follow them.
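Points 2 and 3 above can be sketched as simple heuristics. Everything here is assumed for illustration: the per-page counts, the coefficient-of-variation threshold for "suspiciously uniform" backlinks, and the 10% follow rate taken from the one-in-ten rule of thumb:

```python
from statistics import mean, pstdev

def looks_uniform(backlinks_per_page, cv_threshold=0.15):
    """True if backlink counts barely vary across pages.

    A natural link profile varies a lot; near-identical counts on every
    page look manufactured. Uses the coefficient of variation with an
    illustrative 0.15 threshold.
    """
    values = list(backlinks_per_page.values())
    return pstdev(values) / mean(values) < cv_threshold

def internal_link_useful(clicks, pageviews, min_rate=0.10):
    """Rough proxy for the 'one in ten readers' rule of thumb."""
    return clicks / pageviews >= min_rate

# Hypothetical backlink counts per page: almost identical everywhere.
backlinks_per_page = {
    "/": 120,
    "/blog/post-a": 118,
    "/blog/post-b": 121,
    "/contact": 119,
}

print(looks_uniform(backlinks_per_page))               # → True (red flag)
print(internal_link_useful(clicks=35, pageviews=500))  # → False (0.07 < 0.10)
```

These are toy checks, not anything Google has documented, but they show how "unnatural distribution" and "useful internal link" can be turned into measurable signals.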
Conclusion: Googlebots are getting smarter, and search engine optimizers will have to work harder, with the same motive Google has: focusing more on relevance, content quality, and natural links, along with social media presence and marketing. This over-optimization filter will add yet another headache for webmasters who are already worried about the Google Panda update.