Throughout my career as a professional SEO I have worked with numerous websites, and I have seen many different problems that keep sites from ranking well in the search engines. The following five mistakes are the ones I have seen made over and over again.
Blocking the Search Engine Crawlers
The Robots.txt file contains instructions that tell search engine crawlers where they can and can't go on a website. Crawlers look for this file when they first hit the site. Using the Robots Exclusion Protocol in the Robots.txt file, you can prevent crawlers from visiting certain pages or folders on your website. The error occurs when the webmaster accidentally blocks the root folder of the website in this file, preventing crawlers from crawling the entire site. If your Robots.txt file has a line that looks like Disallow: / then you are blocking your entire website.
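You can see the effect of that one line for yourself with Python's standard urllib.robotparser module. This is only an illustration; the domain and paths below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# A Robots.txt file with a bare "Disallow: /" blocks every crawler
# that honors the Robots Exclusion Protocol from the entire site.
blocking_rules = """
User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(blocking_rules)

# No page on the site may be fetched under these rules.
print(parser.can_fetch("Googlebot", "http://www.example.com/"))            # False
print(parser.can_fetch("Googlebot", "http://www.example.com/about.html"))  # False

# Compare with rules that block only one folder.
folder_rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(folder_rules)
print(parser.can_fetch("Googlebot", "http://www.example.com/about.html"))  # True
print(parser.can_fetch("Googlebot", "http://www.example.com/private/x"))   # False
```

If you are unsure what your own file is doing, a quick check like this is much cheaper than waiting to discover your site has vanished from the index.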
Too Much Flash
Flash can turn a plain website into an extraordinary one. Complete sites built in Flash can combine videos, images, animations and other features into a fantastic user experience. The downside is the experience it gives crawlers. Because every element of a site built in Flash is contained inside the Flash file, those elements are invisible to search engine crawlers: content and links inside Flash simply do not exist to them. If your website is built in Flash, consider moving some elements outside of the Flash file. Include HTML content and some type of HTML navigation so crawlers can move through your website and understand what each page is about.
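One common pattern, sketched below with placeholder file names and links, is to embed the Flash movie in an ordinary HTML page that also carries crawlable text and navigation:

```html
<!-- Hypothetical page: the .swf file name, headings and links are placeholders. -->
<body>
  <!-- The Flash movie itself is opaque to crawlers. -->
  <object type="application/x-shockwave-flash" data="home.swf"
          width="800" height="600"></object>

  <!-- Plain HTML content and navigation that crawlers can read. -->
  <h1>Acme Widgets - Handmade Widgets Since 1999</h1>
  <p>We design and build custom widgets for small businesses.</p>
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/about.html">About Us</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</body>
```

Human visitors get the full Flash experience, while crawlers still find the text and links they need to index each page.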
301 Redirects and Canonical Tags
301 redirects tell crawlers that a page has been permanently moved to a new URL. Canonical tags tell crawlers which one page, out of a set of similar pages, should be included in the search results. Canonical tags should not be used as a substitute for 301 redirects, and when misused they can prevent proper indexing of a website. If you are using canonical tags, evaluate what they are being used for. Does your website have multiple category pages generated dynamically by the same script, all nearly identical except for the products displayed? Do you have pages that can be reached at several different URLs? Both of these situations are ideal for canonical tags.
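To make the distinction concrete, here are minimal sketches of each; the URLs and file names are placeholders, and the redirect syntax shown is for an Apache .htaccess file (other servers have their own equivalents):

```
# .htaccess (Apache): a true 301 redirect - the old URL is permanently gone,
# and both visitors and crawlers are sent to the new one.
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

A canonical tag, by contrast, leaves every variant of the page accessible and simply names the one URL you want indexed:

```html
<!-- Placed in the <head> of each similar or duplicate variant of a page. -->
<link rel="canonical" href="http://www.example.com/products/widgets" />
```

If the old page truly no longer exists, use the 301; if several live pages show essentially the same content, use the canonical tag.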
Duplicate Content or No Content
Good quality content is critical to ranking well. Google has stressed the importance of having plenty of high quality content on a website, and its focus on content became even more apparent with the Farmer / Panda Update. Crawlers rely on content to determine what a page should rank for in the search results. Because Google wants to give users the best experience, it is making a big push to show only websites with high quality content. Duplicate, shallow, or poorly written content was the focus of the Farmer / Panda Update, and sites with content quality problems saw their rankings drop when the update went live last month. Look at the content on your website. Do you have enough of it? Is it original, or was it copied from another website? Expand the content on pages that have very little, and write your own for pages where it was copied from another source.