We Can Disallow Pages, Why Can’t We Disallow Links?

Link building is a critical part of most SEO campaigns. If you have a solid strategy for obtaining legitimate links from relevant websites, you're adding value to the web and, typically, raising your rankings within the search engines in the process. On the other hand, if you're building links on sketchy websites for the sheer purpose of manipulating the search engines, without providing any value, it's easy for the search engines to spot your shady tactics and adjust your rankings accordingly.

However, what happens when you're a legitimate business that unknowingly hires a spammy SEO company? Or what if your competitors decide to sabotage your rankings by throwing money and/or resources at a smear campaign filled with massive amounts of unnatural links? Does a business owner's website deserve the "Google slap" when they had no intention of violating Google's policies?

Personally, I have received an over-optimization penalty from the search engines because of my own stupidity. It was one of those “life lessons” that I deserved because I knew better, but applying the same consequences to all webmasters is borderline unfair.

For example, I currently have a client who previously hired illegitimate overseas SEO companies that did more harm than good. Those companies built him 340,000 external links over the span of 18 months, placed in the footers and sidebars of 90 different domains and anchored to his main keywords. Some of the anchor text even used Chinese and other Asian characters. Oh, I almost forgot: there were porn sites in there too. Did this trusting website owner deserve a penalty from Google because his SEO company was engaging in wrongful strategies?

If you had an established company and you saw a competitor quickly coming up in the SERPs, how far would you go to stop them? After all, you want to stay at the top and dominate your niche. If you were experienced enough to understand that building an unnatural link profile to a website could result in penalties, would you do it? I wouldn't, but that's because I'm an honest person. I'm willing to bet, however, that most people would try to destroy their competitors' rankings given the chance.

Google: Help Us Help You

This isn't the first time somebody has suggested that Google let us disassociate our websites from other sites that link to them. Dr. Pete from SEOmoz already gave his modest proposal in an excellent blog post on 6 ways to recover from bad links, and I'm sure others have covered it as well.

There are a few ways Google could allow us to help them combat webspam. One excellent option is a robots.txt-style file on your website where you can specify which linking websites you want to disallow.
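To make that concrete, here's a rough sketch of what such a file might look like. To be clear, this is purely hypothetical: the file name, the Disallow-links directive, and the syntax are all invented for illustration; no search engine supports anything like it.

```
# Hypothetical /disallow-links.txt, served from the site root like robots.txt.
# Each directive would ask search engines to ignore inbound links from the
# listed domain when evaluating this site. The syntax is invented.
User-agent: *
Disallow-links: spammy-link-farm.example
Disallow-links: overseas-blog-network.example
Disallow-links: porn-directory-we-never-asked-for.example
```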

If that method doesn't work, it could always be built into Google Webmaster Tools instead. This concept isn't complex, and I can't imagine it being difficult for the search engines to execute, but it definitely would protect business owners from shady SEO firms and prevent unethical competitors from sabotaging their websites.
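To show how little machinery the concept actually needs, here's a short Python sketch that parses the hypothetical file above and filters a site's backlinks against it. Everything in it is an assumption carried over from that sketch (the Disallow-links directive, the example domains); it is not anything Google or Bing actually implements.

```python
from urllib.parse import urlparse

# Hypothetical directive from the sketch above; invented for illustration.
DIRECTIVE = "disallow-links:"

def parse_disallowed(file_text):
    """Collect the domains a site owner wants to disassociate from."""
    disallowed = set()
    for line in file_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith(DIRECTIVE):
            value = line[len(DIRECTIVE):].strip()
            # The "//" prefix lets urlparse treat the bare value as a host.
            disallowed.add(urlparse("//" + value).hostname or value)
    return disallowed

def filter_backlinks(backlinks, disallowed):
    """Keep only inbound links whose source domain was not disavowed."""
    return [url for url in backlinks
            if urlparse(url).hostname not in disallowed]

# Toy stand-in for a crawler's view of a site's link profile.
file_text = """
User-agent: *
Disallow-links: spammy-link-farm.example
"""
backlinks = [
    "http://spammy-link-farm.example/footer-links.html",
    "http://legit-industry-blog.example/honest-review",
]
print(filter_backlinks(backlinks, parse_disallowed(file_text)))
# -> ['http://legit-industry-blog.example/honest-review']
```

The parsing is trivial; the hard design question is how much trust to place in these requests, which is exactly the abuse concern raised in the comments below.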

Dr. Pete already did an excellent job with his suggestion, but if this blog post has any purpose, it's to raise awareness of the problem. Being able to disassociate incoming links would, at the very least, raise the standards and practices of SEO companies and prevent oblivious business owners from getting screwed.

To me, it would make sense for Google to allow webmasters to do this. At the very minimum, it would allow inexperienced site owners to quickly recover their losses after being duped by an illegitimate SEO firm. It would also protect new websites from being sabotaged by big competitors. However, what reasons do you think Google has for not doing this?


13 Comments

  1. says

    It really is a great idea. I think a robots.txt-style solution would be best, since that's something all search engines could use, but I wouldn't be surprised if GWT eventually came out with a tool like this.

  2. Ryan Jones says

    I think if Google allowed this, it would be a huge win for spammers everywhere.

    Think about it. What would stop me from firing up XRumer and spamming the hell out of the internet, or launching a giant hidden-link campaign through some widget? Assuming I got penalized, I'd simply take my list of places I spammed, log in to Webmaster Tools and say "hey, these weren't me" and boom – penalty lifted.

    There’d be no more risk associated with spamming/buying links.

    On a side note, I'm not 100% convinced Google actually penalizes for bad links anyway. That would be a mess to deal with algorithmically. It's way more likely that when Google finds shady links to your site, they take a scalpel approach and just ignore them.

  3. says

    Good point Ryan, and when looking at it from that angle I completely agree with you. It would definitely get abused.

    I agree with Kevin in the spirit of the idea, but in this format the potential for abuse is just too high. At this point the best Google can offer is a reconsideration request, and for now that might be one of the best ways to go, even though the process sucks.

    Really, in the end, this whole topic just proves that business owners need to be better educated about what their SEO is doing.

    • Ryan Jones says

      I think that’s the key. If your SEO won’t tell you exactly what he’s doing, he doesn’t deserve to be paid.

      As for the bad links, I’m half tempted to pick a site of mine and say “here, throw your worst links at it” and see what happens. My guess: ranking boost.

      • Greg Shuey says

        You don’t need to… Rand already did as he mentioned in this post: http://www.seomoz.org/blog/8-predictions-for-seo-in-2012

        “One of the major weaknesses of Google (and Bing, to be fair) is their continued over-reliance on links as an overwhelming ranking signal. Just recently, I took up a friend’s offer to point some obviously shady links from sites Google should clearly be discounting at several webpages. We saw dramatic results within 24 hours – #1-5 rankings that have sustained for several weeks (more news on this experiment to come). This shouldn’t be the case and Google’s webspam and search quality teams know it.”

  4. Andrew Walsh says

    I was thinking Google would be reluctant because this same strategy could be used to spam your own sites for rankings and then bail yourself out if you get caught. (I see Ryan beat me to it and answered much more knowledgeably!)

    But I'm curious about the statistics on honest businesses being conned by shady SEOs. I know this happens, but I have no idea how often.

    • Ryan Jones says

      It happens often. It just happened to Google, and it's happened to JC Penney, BMW, etc. It's often a case of companies not understanding the true value of SEO and going with the cheapest offer – or asking an agency that doesn't do SEO to do SEO, and the agency promising what it can't deliver.

  5. Kevin W. Phelps says

    I'm a firm believer that certain triggers go off at Google when there is a clearly unnatural look to your link profile. It's happened to me and it's happened to my clients.

    Google's solution is the reconsideration request, but I've never actually seen one of those pull through and restore any part of a client's rankings.

    I think you're all right: this method of disallowing links would probably get taken advantage of by spammers, but I would hope that Google eventually finds a way to protect business owners from crappy SEO companies. Not everybody can see through the BS that some SEOs sell them; that's where a disallow-links file would be helpful.

    After thinking about it, Andrew Walsh is probably right: although it would suck to get scammed by a shady SEO firm, I wonder how often that actually happens. And I wonder how often shady firms actually cause significant damage and/or penalties. A lack of work from a shady SEO is probably more common than an SEO going crazy with link building.

  6. Ash Buckles says

    Some really great points have been made here. I like the idea of a remedy for penalties caused by third-party tactics that harm a website's rankings. So far Google has largely tried to control this by attaching different weights to links, but because all links still count to some degree, many spammy techniques still increase rankings.

    A couple thoughts:

    1. Webmaster Tools (or somewhere private) would be best for site owners to list URLs they would like to disassociate. I could see very bad things coming from this information going public through a robots.txt-type file. For example: if 1000 websites all requested to disassociate from a domain (URL), Google could incorrectly use this as an indicator that the domain is shady. Just one example.

    2. Most websites are not penalized for link building efforts, though there are exceptions. Pages are often penalized for link building, but usually with a ranking loss of 5-20 or so positions. This can be fixed by building more natural links that balance out the page's backlink profile.

    3. Most penalties come from poor on-page SEO. I've seen far more websites penalized for lack of on-page attention than for lack of link building (off-page) attention. Still, this is a great thread with a lot to consider.

  7. Ryan Jones says

    I think there are so few websites actually penalized for their link graph (if any) that enabling this feature would simply cause the majority of webmasters to do more harm than good.

  8. lim says

    “that most people would try and destroy their competitor’s rankings given the chance.”

    You are right, I did. Never mind the moral issues; it's just competition.
