
With all this discussion about paid links and outing SEOs, I don’t jump on the outcry bandwagon. Instead, I try to anticipate Google’s next move. I want to focus on future-proofing our campaigns, and these are great times to reconsider what Google is going to do next.

In 2005, Matt Cutts wrote about how Google considered paid links to be “outside” their guidelines. He specified that “selling links muddies the quality of link-based reputation and makes it harder for many search engines (not just Google) to return relevant results.” Cutts goes on to explain that buying “links purely for visitor click traffic” is best done with the nofollow value in the rel attribute.

Meme from someecards.com

Fine. So paid links are okay, but only when the rel=nofollow attribute is present. That’s pretty clear. But what about legitimate paid links from the past, where you can’t go back and add nofollow? What about changes to how non-paid links are viewed? Are you getting penalized because of previous efforts? According to Google, yes!
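The mechanics here are simple enough to check for yourself. Below is a minimal sketch, using only Python’s standard library, that flags anchor tags missing rel=nofollow — the kind of audit you might run on your own pages before Google does it for you. The sample HTML and the `LinkAuditor` name are my own illustration, not any official tool.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect hrefs of anchor tags that lack rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.followed_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold multiple space-separated values, e.g. rel="nofollow noopener"
        rel_values = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel_values:
            self.followed_links.append(attrs.get("href"))

# Hypothetical page snippet for illustration:
page = (
    '<p><a href="https://example.com/sponsor" rel="nofollow">Sponsor</a> '
    '<a href="https://example.com/partner">Partner</a></p>'
)

auditor = LinkAuditor()
auditor.feed(page)
print(auditor.followed_links)  # links that still pass link equity
```

Running this against the snippet above reports only the “Partner” link, since the “Sponsor” link already carries nofollow.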

As an aside: a paid link is unacceptable but a sponsored link isn’t. I understand the difference, but it’s a fine line at best.

The Cloak of Invisibility

My greater concern is with the ever-changing invisible line of the organic algorithms. What’s acceptable today isn’t allowable tomorrow and there’s no way to correct 100% of your actions after the fact. Wouldn’t it be great if we could disassociate ourselves from previous links once that invisible line moves? I say, yes! Consider when negative SEO is performed against your site. What then?

The cloak of invisibility
Image Credit: thebuglish.com

Both the Google Panda and Google Penguin updates focus on removing pages that break quality guidelines. If your page is working to score better in the rankings by cutting corners, your strategy is a ticking time bomb.

If you’re marketing your website with a marketer’s mind, looking for every opportunity to bring in new customers while strengthening your brand, you may not realize the next guideline addition will nix all of your efforts retroactively.

Should we stop optimizing then? Do we publish content without any care as to the format or the organization of the content elements? Of course not. Start with quality optimization tips and pivot when necessary.

Some in our industry argue that “you should’ve known this was coming,” when in fact some of today’s unacceptable practices were simply traditional marketing tactics carried over to online marketing. They’re not outside the bounds of building a strong brand or increasing your visibility. They were done with good intentions!

Google’s marketing team practices many of the same things as other marketers when promoting their hundreds of brands, cross-linking sites (laugh now and cry later), sponsoring content, placing ads, etc. Nobody would tell Google’s team not to do this. It’s good for their visibility, credibility, and revenues.

Impossible Labyrinth or Quality Guidelines?

I’m not expecting Google to detail every way to win in online marketing. But there’s still a huge gap between allowable techniques and webmaster guidelines.

Image Credit: Labyrinth

Because the Google algorithm is closed, SEOs have to test against the search engines until they get the results they want. As we do this, we blog about our findings and often share them with the community as “the perfect way” to rank in the search engines. But there are going to be incorrect findings along the way, or things that worked only for a particular website or keyword or index. We’re going to make mistakes. We’re going to misinterpret our findings. We’re going to make some assumptions. Does that make us blackhat marketers? Nay.

Words like “always” and “never” have driven great blog posts into the fabric of our community as much as “I tried” and “you should test.” SEOs have relied upon the best information available at the time to rank websites and to coach business owners on how to rank their own. Most of the time, this information is valuable and improves the user experience.

Don’t misunderstand, I like algorithm changes. They eventually make search results better for users. Smarter algorithms with segmented data make for quicker, more personalized results and greater relevance. Every so often, search engines overhaul their algorithms and knock a portion of the garbage out of the index.

All I ask is that we have a way to remove ourselves from the mistakes of our past!

I’m not alone.

As time goes on, Google continues to measure credibility by tying attributes back to Google properties. Google+ is an example of this: the more connected our content is to our Google+ profiles, the better chance we have of becoming an authority on the subjects we write about.

Google tells us to write great content and the rest will follow. Not true. The distinction between quality content and over-optimized content is vague at best. There are web standards that are good for users, software developers, search engines, and many others. But Google begins to break down those standards when it tells us to avoid using keywords in our titles, headers, and body copy. I’m being a little sensitive here on purpose; I think Google is too vague with statements like these.


The Karate Kid

Working in SEO is very much about balancing the art and science of our efforts. Much like Daniel-san fending for himself, online marketers have been at a disadvantage in organic visibility since Google started earning billions through AdWords.

The scientific mind in SEO believes it’s true until proven untrue. The artistic mind in SEO is where we get creative in finding more customers and drawing them into a site. That may include trading links with vendors, buying advertising, guest blogging, or any other way we build up credibility. Sometimes we push the limit and have to pull back.

It’s the invisible line that gets me.

Social media is Google’s next attempt to fight webspam. It centers around author rank, or the credibility of the individual behind the content. Unfortunately, there are billions of web pages that don’t warrant an author behind them and therefore can’t be held to the same standard. And when a credible author writes the same content, they’ll have the chance to outrank a better source because of the credibility of the author, not the quality of the content.

Penguins & Pandas

Google Algorithm Updates

The latest Google Penguin updates have allowed a considerable number of irrelevant pages onto Google properties (notably Blogspot blogs, now removed), as well as sites from Poland, the UK, and other country-specific domains. Although it’s not technically webspam, it’s still garbage to me.

For now, they’ve introduced less relevant results: a .co.uk site showed up in my search for “SEO” from Bluffdale, Utah. I believe it may have something to do with ICANN’s acceptance of new gTLDs last summer, with Google testing how to bring worldwide domains into the .com index. Or maybe it’s a glitch. Google is still working out how to introduce content from the many different segmented data sets it has available, including the Knowledge Graph.

Outing & Public Executions

I don’t like outing (in most cases) because marketers are often left to fend for themselves online. There are thousands of vague guidelines and only a few very specific ones. Marketers are creative: they find new ways to reach their customers, rely on traditional tactics, test new techniques, expand the channels through which they reach out, improve customer relations, and generally push the limits. Once they find something that strikes a chord with their audience, they maximize it.

Monty Python

There are those who participate in true blackhat tactics, such as exploiting software to load 1,000 links onto your site (true story, bro) or planting malicious code on a website to steal someone’s identity. These are the true exceptions that deserve an outing and/or a report to Google.

Besides, outing others is a tactic that’s as easily abused as the actions being reported. Not only that, but why should we help Google crowdsource this valuable information so they can improve their revenues at your expense?

Did I miss anything? I’d love to hear your thoughts.