Common SEO Pitfalls

Unfortunately for the search engine optimization industry, many SEOs have used spammy techniques that will now get you into trouble with the search engines. If you have had SEO work done in the past and your site is no longer ranking well, you may want to determine whether any of these tactics were used unbeknownst to you. We have also included some common issues that site owners introduce inadvertently.

Pitfall #1 – Site-wide text links

We recommend avoiding site-wide text links in all circumstances. If you do place an advertisement on a Web site, obtain a link from only one page, and choose the page that will pass the most link relevance to your landing page.

Even though site-wide links carried some value in the past, these days they will most likely get you filtered for the targeted keyword. You probably won't get banned for the links (otherwise it would be too easy for competitors to sabotage your rankings), but they are generally a bad idea. The risk of being filtered is especially high if your link profile is not very large and diverse. For example, if you have 200 external links and 80 percent of them use the same anchor text, your links look very suspect to the search engines. In those instances a site-wide link will most likely hurt more than it helps.
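
If you want a rough sense of how concentrated your own anchor text is, a short script like the sketch below can help. It assumes you have already exported your backlink anchor texts to a plain-text file, one per line; the file name is purely illustrative and not part of any particular tool.

    # Rough sketch: estimate how concentrated your anchor text is.
    # Assumes "anchor_texts.txt" (one anchor text per line) exported from
    # whatever backlink tool you use -- the file name is hypothetical.
    from collections import Counter

    with open("anchor_texts.txt", encoding="utf-8") as f:
        anchors = [line.strip().lower() for line in f if line.strip()]

    counts = Counter(anchors)
    total = len(anchors)

    for anchor, count in counts.most_common(5):
        share = 100 * count / total
        print(f"{anchor!r}: {count} links ({share:.0f}% of {total} total)")

If one phrase accounts for the bulk of your links, that is the kind of profile that tends to trip keyword filters.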

Pitfall #2 – Duplicate content

Duplicate content is simultaneously an SEO myth and a common pitfall. On our search engine optimization myths page, we clarified that duplicate content will not get you penalized (you simply won't get credit for it). However, that does not mean you should count on getting any benefit from duplicate content. The common mistake people make is to rely on duplicate pages for SEO value. If only a small number of your pages are viewed as unique by the search engines, then you have a significant problem.

We recommend minimizing the number of duplicate pages on your site so that you can get benefit from as many of your pages as possible. You can determine if duplicate content might be a problem by searching for snippets of your content in quotes. If your pages do not appear in the search results, or if they are omitted for being too similar to other pages in the SERP, then your content is probably getting filtered.
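
Beyond spot-checking snippets in quotes, you can compare pages on your own site directly. The sketch below uses only Python's standard library to score how similar two pages are; the URLs are placeholders, and a high score merely suggests the pages may be filtered as duplicates rather than proving it.

    # Rough sketch: compare two of your own pages for near-duplicate content.
    # The URLs are placeholders -- substitute pages you suspect are too similar.
    import urllib.request
    from difflib import SequenceMatcher

    def fetch(url):
        with urllib.request.urlopen(url) as response:
            return response.read().decode("utf-8", errors="ignore")

    page_a = fetch("https://www.example.com/page-a")
    page_b = fetch("https://www.example.com/page-b")

    ratio = SequenceMatcher(None, page_a, page_b).ratio()
    print(f"Similarity: {ratio:.0%}")  # values near 100% suggest duplicate content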

Pitfall #3 – Hidden text

When search engine optimization first began as a marketing strategy, it was common for sites to artificially inflate their keyword relevance by hiding text on their pages, for example by using CSS to mark page elements as hidden or to render white text on a white background. Search engine spam filters have become sophisticated enough that it is virtually impossible to benefit from such content; at best it accomplishes nothing, and at worst it gets your site banned. Most people know this is a bad idea nowadays, but it is worth mentioning here as an SEO tactic to avoid.
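
If you suspect that past SEO work left hidden text on your pages, a quick scan of the HTML for the most common inline-style tricks is a reasonable starting point. This is only a sketch: the URL is a placeholder, it checks inline styles only, and it will miss text hidden through external stylesheets.

    # Rough sketch: flag inline-style patterns commonly used to hide text.
    # Many matches are legitimate (drop-down menus, for example), so treat
    # hits as prompts to look closer rather than proof of spam.
    import re
    import urllib.request

    url = "https://www.example.com/"  # placeholder -- use one of your own pages
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")

    suspicious = [
        r"display\s*:\s*none",
        r"visibility\s*:\s*hidden",
        r"text-indent\s*:\s*-\d{3,}",  # text pushed far off-screen
    ]

    for pattern in suspicious:
        hits = re.findall(pattern, html, flags=re.IGNORECASE)
        if hits:
            print(f"Found {len(hits)} match(es) for {pattern}")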

Pitfall #4 – Cloaking

Another common spam technique that people have used in the past is cloaking. Cloakers determine whether a page is being requested by a human or a search engine, based on the user-agent or IP address, and then serve different content to the search engines in order to inflate the site's keyword relevance. Like hidden text, most forms of cloaking are easily identified by the search engines. If you are cloaking and it is clearly being done for SEO purposes, then you definitely run the risk of being banned.
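
A simple way to check whether your own site is cloaking by user-agent is to request the same page with a browser-style user-agent and a crawler-style user-agent and compare the two responses. The sketch below does exactly that; the URL and user-agent strings are illustrative, and IP-based cloaking cannot be detected this way.

    # Rough sketch: fetch the same page as a "browser" and as a "crawler"
    # and compare the responses. IP-based cloaking won't show up here.
    import urllib.request
    from difflib import SequenceMatcher

    url = "https://www.example.com/"  # placeholder -- use one of your own pages

    def fetch_as(user_agent):
        request = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(request) as response:
            return response.read().decode("utf-8", errors="ignore")

    browser_html = fetch_as("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    crawler_html = fetch_as("Googlebot/2.1 (+http://www.google.com/bot.html)")

    ratio = SequenceMatcher(None, browser_html, crawler_html).ratio()
    print(f"Similarity between the two versions: {ratio:.0%}")
    # Some difference is normal (ads, dynamic content); a very low similarity
    # is worth investigating.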

Pitfall #5 – Disallowing pages on your site

We have seen instances where site owners have inadvertently prevented their pages from being indexed. This generally occurs when developers transfer content from a test domain that is disallowed via robots.txt. If the test site's robots.txt file is copied over to the live site along with the rest of the files, all of your pages can end up being removed from the search engine indexes. Look out for this issue if you transfer files this way.
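
A quick way to confirm that a stray robots.txt is not blocking your live site is to test a few representative URLs against it. Python's standard library includes a robots.txt parser for exactly this; the domain and sample paths below are placeholders.

    # Rough sketch: verify that your live robots.txt still allows crawling.
    # The domain and sample paths are placeholders.
    from urllib import robotparser

    parser = robotparser.RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()

    for path in ("/", "/services/", "/contact/"):
        url = "https://www.example.com" + path
        status = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
        print(f"{path}: {status}")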

We have also seen instances where site owners include a Meta Robots "none" or "noindex" tag on their pages. For example, WordPress blogs are sometimes set up with their pages noindexed by default. If you are wondering why certain pages on your site are not getting indexed, confirm that you are not disallowing or noindexing them.
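
To check whether a page is carrying a stray noindex, fetch it and look at its robots meta tag directly (in WordPress this is typically controlled by the "Discourage search engines from indexing this site" setting under Settings > Reading). The sketch below uses Python's built-in HTML parser; the URL is a placeholder.

    # Rough sketch: report the robots meta tag (if any) on one of your pages.
    # The URL is a placeholder -- point it at a page that isn't getting indexed.
    import urllib.request
    from html.parser import HTMLParser

    class RobotsMetaFinder(HTMLParser):
        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
                print("Meta robots tag found:", attrs.get("content"))

    url = "https://www.example.com/some-page/"
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")

    finder = RobotsMetaFinder()
    finder.feed(html)  # no output means no robots meta tag was found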

In general, it is poor search engine optimization strategy to disallow or noindex any page on your site. This even applies to duplicate pages that might be getting discounted. Remember that any page can potentially obtain external links. If a third party links to a page that is disallowed or noindexed, you will not get any SEO benefit from that link. Additionally, if there are internal links to those pages, you will be wasting your site's PageRank.
