Why Doesn’t Google Devalue Spam Links?

I recently read this post over at HungryPiranha – An Open Letter to Google’s Matt Cutts: On Penalties & the New Link Disavow Tool. That, coupled with some of the comments I have been reading, has led me to write this post.

DISCLAIMER: What you read here are my thoughts on what I have seen, analyzed and observed.

It seems so simple: devalue instead of penalize.  To think that Google doesn’t do this, or that this is some profound revelation, is really naive.

What I think instead is that Google has been doing this on a smaller scale and this approach worked fine for Google until recently.  

ONCE UPON A TIME…

Rewind a couple of years – the SERPs world was in a so-called state of equilibrium.  You had the White hats, who were following all the guidelines and ranking well, and then you had the Black hats, ranking with spam.  But there was a level of sophistication involved; the barrier to entry was high, either in terms of knowledge or money.

Then things changed.  In the past 2 to 3 years there was a huge upsurge in SERP manipulation, and this was because of a new class of people: the Wannabe SEOs.  It was fueled by the extremely low barrier to entry for creating a site and getting it ranked on Google using cheap, mass spam techniques, and by so-called gurus selling easily accessible and cheap courses on how to do it.

Hell, that’s how I got into it.  I still remember I would manually create profile links and submit articles, and the sites would kick ass.  I was like, wow, it’s easy to make money online.  And I admit, I didn’t use the best content, used templated websites, and mass-produced them.

Now this worked extremely well for some time, and I would see other sites doing the same thing to rank all over the place – similar templated theme, thin content, no or crappy header, low-quality links, etc.  I could spot these sites from a mile away.  Of course they evolved.  But the bottom line was that those crappy, easy-to-create spam links worked.

And I knew that if I saw them, Google HAD to have already noticed them.  You also had the White hats who had started noticing and complaining: “I followed all your guidelines to build this amazing site about ear hair removal and you rank this garbage over mine?  Are you F***ING kidding me?”

CRITICAL MASS

This continued for a while and then it reached critical mass in 2011, when we got our first dose of real penalties and not just shuffling of positions.  

Panda dealt with on-page quality, and it would downgrade sites that were found to be low quality, mainly because the user experience sucked.

Then the Penguin update came, and on-page quality was not even a factor?!  What?  To me, in the beginning it was unfathomable: WHY would you penalize a site just because it had some spam links pointing to it, when all the other signals were good and, more importantly, the user experience was fine?  Why not just devalue the spam links and let those sites fall below the sites with a better link profile?

With Penguin, it seemed that they were willing to sacrifice “quality” or diversity of the search results.  But why?

At Pubcon, Danny Sullivan asked Matt Cutts this question straight up, and it seems he didn’t provide an answer:

(Image: Why Matt Why)

THE ANSWER

Here’s my answer: it’s because they cannot FULLY and ACCURATELY identify and devalue spam links without it making their results go to crap.

Hear me out.  I have read this countless times: people saying, oh, if they only devalued comment links, and footer links, and irrelevant links, and link networks, and article directories, etc., etc., it would be so much better.

Do you really think Google has not tried this out?  When we make such assumptions, we are only looking at an EXTREMELY small sample of sites, and that sample is not even random.  We have NO IDEA how the SERPs would look if Google actually implemented these types of mass devaluations.

For me the simplest explanation is that Google has tested this and found that the results are worse, or not really better, under that model.  You have to give Google some credit; they have some of the most brilliant minds working for them, and you really think they have not thought about and tested this?!

If spam could be identified and devalued as easily as people think, then why are there so many inconsistencies in the results?  You still see sites ranking with the same crappy methods that got a bunch of other sites hit.

All the updates done so far look for spam signals and blatant manipulation (like heavy commercial anchor text), but they don’t actually identify and devalue spam links completely.  Now, obviously, some very blatant ones with easily identifiable footprints are removed (BMR, co.cc, sitewide widget links, profile links? etc.), either manually or algorithmically, but the vast majority survive.

So if they couldn’t outright devalue spam, what did they accomplish with Penguin and the other recent penalty-type updates?  Frickin’ FUD.

Google was willing to sacrifice short-term “quality” or diversity for better results in the long term.  Now it’s important to note that this is our viewpoint.  Search quality is relative; outside of our community, the general public doesn’t notice these intricacies in the results.  As long as they see the big brands and the usual suspects ranking, they are happy.  Yes, they may not see the very enthusiastic small guy who created his site about repairing kayaks, but as long as they get that information in some form, they couldn’t care less.  And these users make up the majority of the audience.

SHOCK AND AWE

This is what Google accomplished:

White Hats: “I told you so.  Be as pure and virgin as us and thou shalt survive.”

Wannabe SEOs: “OMGZorrz!? I lost all my sites, I have learned my lesson.  I will be true and pure from now on, oh venerable Google.  I will give away my first child to create quality content.”

Gray Hats: “Meh, adjust or die.”

Black Hats: “Thank you Google for the reduced competition from the Wannabes.”

You see, the point of the updates was to shake up the so-called Wannabe SEOs, who have been the major source of the MASS spam that has littered the SERPs recently.  Of course there was collateral damage, but that was within acceptable limits; no war has been fought without casualties.

Now, the article above mentions that Google is so focused on fighting spam that they are not looking at the collateral damage.  I would again give Google the benefit of the doubt.  It’s not that they don’t see it; of course they do (when we have tools like MozCast that show domain diversity, do you think Google doesn’t have something much more sophisticated to monitor it?).  It’s that they don’t care as much, because:

  1. Search quality is very subjective and there is no right and wrong. 
  2. They are willing to sacrifice sites / some diversity (short term) to reach that golden equilibrium.
  3. Most importantly, the user metrics and whatever internal KPIs they use to monitor user interaction with their SERPs remain the same or improve after these updates.
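To make the “diversity” idea concrete, here is a minimal sketch of the kind of metric a tool like MozCast might track: counting how many distinct domains hold a query’s top result slots.  The `domain_diversity` helper and the sample URLs are hypothetical illustrations, not anything Google or MozCast actually uses.

```python
from urllib.parse import urlparse

def domain_diversity(result_urls):
    """Count distinct domains across a list of search-result URLs."""
    domains = set()
    for url in result_urls:
        host = urlparse(url).netloc
        # Treat www.example.com and example.com as the same domain.
        if host.startswith("www."):
            host = host[len("www."):]
        domains.add(host)
    return len(domains)

# Hypothetical top-5 results for a query; bigbrand.com holds two slots.
serp = [
    "http://www.bigbrand.com/kayaks",
    "http://bigbrand.com/kayaks/repair-guide",
    "http://www.example.com/kayak-repair",
    "http://smallguy.net/repairing-kayaks",
    "http://somedirectory.org/outdoors/kayaks",
]
print(domain_diversity(serp))  # 4 distinct domains across 5 slots
```

A falling number over time would mean fewer distinct sites sharing the top slots – exactly the kind of post-update shift a monitoring tool could flag.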

For the casualties, Google may make some appeasement attempts, like that huge Panda thread, the link confessions, or the recently implemented Disavow tool.

CONCLUSION

If you think spam doesn’t work, or that relevancy is the new PR, or that EMDs are done for, or that footer links are completely useless, or that directories are the devil’s spawn, etc., etc., well then Google has been successful in what they set out to accomplish, no matter the means they used.

They are trying to reach that equilibrium again – the one where only a small percentage of people are able to manipulate the results at mass scale by breaking their guidelines.

Now, I am not saying that Google will never totally devalue spam or never be able to conquer it.  I just feel that they are not at that point today, and that it may not even be worth their while as long as they can keep the barrier to entry high.

TAKEAWAY

So what’s the takeaway?  Sadly, it’s nothing profound or new, but it bears repeating: BE A SKEPTIC BASTARD.  Don’t believe all you read, and please DON’T only read what you believe.  Think for yourself and Test, Test and Test.

I am not saying you should find new ways to spam or manipulate the results; just don’t blindly believe everything, and don’t think that Google is that clear-cut or black and white.

I will end with: Meh, adjust or die.

Links in Ad Copy – New AdWords Experiment?

Google has been fairly aggressive in the past few months in rolling out changes to the layout of their ads.  Some major ones include changing “Sponsored Links” to “Ad”, displaying URLs in lowercase, adding the description to the title, and adding the URL to the title for top ads.  They seem to be continuously experimenting with ways to increase CTRs for their advertisers.

Today I noticed an ad format I haven’t seen before.  As you can see in the screenshot below, it seems there is a hyperlink within the ad copy.  This link leads to the same landing page as the title.  The format reminds me of Text Link Ads.

(Screenshot: AdWords-inlinks)

Welcome to SEMMetric.org

I welcome you to the launch of this blog.  It is very bare-bones for now, but I will try to post at least once a week.

I work as an in-house consultant for a small business, handling all of its internet marketing along with all the other extra responsibilities that come with working for a small business.

I will be posting mainly about my experiences with internet marketing, covering topics like pay-per-click, search engine optimization, conversion rate optimization, analytics, etc.  I think in this field, having in-depth knowledge of all the various steps – from researching to acquiring to selling to converting and even reconverting – is vital.

So read on and enjoy your stay.