5 Site Optimization Blunders that Will Kill Your Rankings

Over my career as a professional SEO I have worked with numerous websites, and I have seen many different problems that keep sites from ranking well in the search engines. The five mistakes below are the ones I have seen made over and over again.

Blocking the Search Engine Crawlers

The robots.txt file contains instructions that tell search engine crawlers where they can and can't go on a website. Crawlers look for this file when they first hit the site, and you can use the Robots Exclusion Protocol in it to keep them out of specific pages or folders. The blunder occurs when a webmaster accidentally blocks the root folder of the site in this file, which prevents crawlers from crawling the entire website. If your robots.txt file has a line that looks like Disallow: / then you are blocking your entire website.
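To make the difference concrete, here is a minimal sketch of the two cases. The bad version contains the self-inflicted site-wide block described above; the good version uses an empty Disallow, which allows everything:

```
# BAD — "Disallow: /" blocks crawlers from the entire site:
User-agent: *
Disallow: /

# GOOD — an empty Disallow value means nothing is blocked:
User-agent: *
Disallow:
```

If you only want to keep crawlers out of one folder, name that folder (for example, Disallow: /admin/) rather than the root.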

JavaScript Navigation

Many sites use JavaScript to create drop-down, accordion, and other styles of navigation. These menus can make it very easy for visitors to get around a large website, but to a search engine crawler they can look much different. The problem is that while visitors see a fully functional menu, there may be no links in the actual source code, and crawlers rely on links in the code to navigate the site. Disable JavaScript in your web browser and then look at your website. If you can't see the site navigation, crawlers may not see it either.
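The contrast can be sketched in a few lines of markup. The page URLs and menu labels here are hypothetical; the point is where the links live:

```html
<!-- Problem case: the menu only exists after the script runs,
     so there are no <a> links in the HTML source for crawlers to follow. -->
<div id="menu"></div>
<script>
  document.getElementById('menu').innerHTML =
    '<a href="/products">Products</a> <a href="/about">About</a>';
</script>

<!-- Crawler-friendly case: the links are in the HTML source itself.
     JavaScript can still style or show/hide this list without hiding
     the links from crawlers. -->
<ul id="nav">
  <li><a href="/products">Products</a></li>
  <li><a href="/about">About</a></li>
</ul>
```

This is also why CSS-styled menus built on real HTML lists are a safe alternative: the links stay in the source code regardless of what happens in the browser.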

Too Much Flash

Flash can turn a plain website into an extraordinary one. Complete sites built in Flash can combine video, images, animations, and other features into a fantastic user experience. The downside is the kind of experience it gives crawlers. Because all the elements of a Flash site live inside the compiled Flash (.swf) file, they are invisible to search engine crawlers: content and links inside Flash effectively do not exist to them. If your website is built in Flash, consider moving some elements outside of the Flash movie. Include HTML content and some type of HTML navigation to help the crawlers move through your website and understand what each page is about.
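One way to keep the Flash experience while giving crawlers something to read is to put HTML fallback content inside the embed itself. This is a sketch with hypothetical file names and links:

```html
<!-- Visitors with Flash see the movie; crawlers (and visitors
     without Flash) see the HTML content inside the <object>. -->
<object data="intro.swf" type="application/x-shockwave-flash"
        width="800" height="600">
  <h1>Acme Widgets — Handmade Widgets Since 1999</h1>
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</object>
```

The fallback should describe the same content as the Flash movie, not unrelated text, so crawlers index what visitors actually see.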

301 Redirects and Canonical Tags

301 redirects tell crawlers that a page has been permanently moved to a new URL. Canonical tags tell crawlers that, out of a set of similar pages, one specific page should be the version included in the search results. Canonical tags are not a substitute for 301 redirects, and when misused they can prevent proper indexing of a website. If you are using canonical tags, evaluate what they are being used for. Does your website have multiple category pages generated dynamically by the same script, all nearly identical except for the products displayed? Do you have pages that can be reached at several different URLs? Both of these situations are ideal for canonical tags.
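To illustrate the difference, here are minimal sketches of each mechanism (paths and domain are hypothetical). A 301 is a server-side redirect, for example in an Apache .htaccess file:

```
# .htaccess — a true 301: the old URL permanently moves to the new one
Redirect 301 /old-page.html /new-page.html
```

A canonical tag, by contrast, goes in the <head> of each duplicate page and simply points at the preferred URL while the duplicates keep resolving:

```html
<link rel="canonical" href="http://www.example.com/widgets" />
```

The redirect physically moves visitors and crawlers; the canonical tag only consolidates ranking signals onto one URL. Using a canonical tag where a page has genuinely moved leaves the old page live and can confuse indexing.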

Duplicate Content or No Content

Good quality content is critical to ranking well. Google has stressed the importance of having plenty of good quality content on a website, and its focus on content was made even more apparent with the Farmer/Panda update. Crawlers rely on content to determine what a page should rank for in the search results, and because Google wants to give the best user experience, it is making a big push to show only websites with high quality content. Duplicate, shallow, or poorly written content was the focus of the Farmer/Panda update, and sites with content quality problems saw their rankings drop when the update went live last month. Look at the content on your website. Do you have enough of it? Is it original, or was it copied from another website? Expand pages that have very little content, and write your own content for pages where it was copied from another source.



  1. Paul Salmon says

    For many people that use Javascript menus, they may not be aware of the implications when it comes to SEO. Since they can see the links on the screen, they think that Google and other search engines can also see them, which is not true, as you pointed out. An alternative is using CSS for the menus since the links are in the source code.

    As for Flash, I was never a big fan of those even as a visitor. They may look good, but they can be a pain to navigate and to optimize.

    • Dustin Williams says

      I agree. Many people look at their website and think that if they can see all the links then so can the search engines. Same problem exists with Flash. Only with Flash, every element of the video will be invisible to search engine crawlers.

  2. Maciej says

    I still don’t understand why people want to build all flash websites. I come across conversations from time to time where certain parts of the business community still think a flash website is “cool”.

  3. annie says

I agree with your blog — a great collection of information, and the points are easy to understand. Thank you for this valuable post.

  4. says

    “Disable JavaScript in your web browser and then look at your website. If you can’t see the site navigation then crawlers won’t see it either.”

This is not true. If JavaScript is rendering the nav, search engines cannot crawl those links; however, if JavaScript is only used to show and hide nav items that are already in the HTML, the spiders can certainly crawl them.
