
Let’s go back…back to a simpler time. Back to when keyword data was provided in Google Analytics. Back to when we could optimize a page by stuffing the crap out of it with keywords. Back to when we could build, literally, thousands of links in a month (all with exact match anchor text). And back to a time when Google returned search results based on the keywords in the search query.

Yep, those were the days. To be successful, all you had to do, it seemed, was “out keyword stuff” or “out link build” your competition.

Remember how we were able to track the rankings for a particular keyword and see how successful it was based on the number of visits it drove? We could easily analyze and report on the data, allowing us to identify areas of opportunity with ease.

It’s amazing how things have changed!

Let’s now fast forward to the present day. We are currently living in a world of cuddly black & white birds, bears, and (not provided) data. If you have no clue what I’m talking about, allow me to explain.


The Panda algorithm hit hard in February of 2011 and is said to have affected nearly 12% of search results. A lot of people felt it! This algorithm was initially designed to target and devalue sites with thin or low-quality content, a high link-to-content ratio, and other on-site quality issues.

Over the years, there have been several updates to the Panda algorithm, each with a new focus. Google has provided quality guidelines for site owners to follow and has given a lot of insight into what will cause a drop in rankings. Keyword stuffing, link schemes, and other loopholes meant to deceive the crawlers are just a few of the ‘tricks’ called out in this document.


Here is another black & white animal honored by having an algorithm named after it. Ya know… I used to think that pandas and penguins were cute.


Penguin rolled out in April 2012. Now, while Panda targeted low-quality on-site practices, Penguin was designed to target low-quality off-site practices.

Penguin aimed to decrease the rankings of sites that consistently used tactics called out in Google’s Link Schemes document. These strategies include buying and selling links to pass PageRank, link exchanges, building links on low-quality directory sites, and manufacturing links with exact match anchor text.


In August of 2013, Google updated its core algorithm. They called this update Hummingbird. Basically, this new algorithm changed the way Google displays search results. Rather than picking out individual keywords and returning “relevant” results, Google now tries to interpret and fully understand what the searcher is looking for. “Conversational search” is what Google has called it. They want to provide the most relevant answers to the questions their users are asking.

These three algorithms work together: Panda and Penguin are simply add-ons to the core algorithm.

Not Provided Data

Now let’s get to the “not provided” data. In October 2011, Google decided that it was going to stop displaying keyword data in Google Analytics in order to protect its users’ privacy. Essentially, if a user performs a search while logged in to their Google account, Google is not going to tell you the search string that visitor used to find your site. Lame, right?


Okay, so let’s break this down.

  • We can no longer target “exact match” terms in both our on-page optimization and our link building efforts because of Panda and Penguin.
  • With Hummingbird, Google no longer displays results based on keywords within the search. Instead they are now looking at the entire query. AND…
  • We are no longer able to report on the success of a particular keyword because Google is withholding user data from us.

So I ask: what is the point of spending so much of our time and energy targeting and focusing on a list of keywords?

I’m not saying that higher rankings aren’t a logical goal. What I am saying is that we have no clue exactly how successful a particular keyword is, so why not go after long-tail variations of a head keyword? These types of keywords are going to provide more relevant traffic anyway.

Now sure, you can get into your Google Webmaster Tools account and see how many impressions a keyword got, along with its clickthrough rate, then go into Analytics and see how many visits the page got. But that data is still really, really incomplete, and piecing it together is very time-consuming.
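To show what that manual cross-referencing actually involves, here is a minimal sketch in Python. All of the column names and numbers are made up for illustration — real Webmaster Tools and Analytics exports will differ — but the idea is the same: join query-level impressions and clicks with page-level visit counts.

```python
# Hypothetical exported rows from a Webmaster Tools-style report:
# search query, landing page, impressions, and clicks.
search_rows = [
    {"query": "blue widgets", "page": "/widgets", "impressions": 1200, "clicks": 60},
    {"query": "buy blue widgets", "page": "/widgets", "impressions": 300, "clicks": 24},
]

# Hypothetical exported rows from an Analytics-style report:
# landing page and total visits (no keyword attached, thanks to "not provided").
analytics_rows = [
    {"page": "/widgets", "visits": 95},
]

# Index visits by page so each query row can be matched to its landing page.
visits_by_page = {row["page"]: row["visits"] for row in analytics_rows}

report = []
for row in search_rows:
    ctr = row["clicks"] / row["impressions"]  # clickthrough rate per query
    report.append({
        "query": row["query"],
        "page": row["page"],
        "impressions": row["impressions"],
        "ctr": round(ctr, 3),
        "page_visits": visits_by_page.get(row["page"], 0),
    })

for line in report:
    print(line)
```

Note the gap this sketch makes obvious: both queries land on the same page, so the 95 visits can’t be attributed to either keyword — which is exactly why the data stays incomplete no matter how carefully you stitch it together.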

How do we get around this?

Watch for my next post that will cover new tactics and ways we should be measuring success. These will include:

  • Head Terms vs. Long Tail Targeting
  • Direct Value Reporting
  • Identifying wins and successes
  • Analyzing fails or areas of improvement
  • Comparing Webmaster Tools Data with Analytics data

I would love to hear your thoughts on this topic. How do you measure success? How do you target keywords? Tell me and I may use your tactics in my next post!