Search Engine Optimization Myths Debunked

As you read through the SEO information that is out there, you will inevitably come across articles claiming that some technique will improve search engine rankings when it will not. Many articles are simply wrong or out of date, while others cover best practices that make for good pages but won't significantly change rankings. We created this page to help you identify some of these uninformative or unimportant articles during your research. Please feel free to contact us if you have another myth to add to this list or if you are unsure about an article you have read.

Myth #1 – Duplicate content will get you penalized

Of all the search engine optimization issues out there, duplicate content may be one of the most misconstrued. You'll often hear that duplicate content will hurt your rankings and possibly get you penalized. It is true that duplicate content could make you rank worse than unique content would, but it's not a black-and-white issue.

Whenever Google finds pages with the same content, it attempts to determine the true source, or "authority," of the content. The authority is probably decided by several factors, but primarily by domain age/trust and PageRank. The page deemed the true source gets the SEO benefit of the content, and most of the "duplicate" pages are put into Google's supplemental index.

In the end, the worst-case scenario from having duplicate content is that you get zero benefit from it. You will not be penalized for the content except in extreme circumstances. For example, if the vast majority of your site consists of duplicate content, your entire site might be discounted. Whether that is truly a penalty or a mass filter is really just semantics.

It is theoretically possible to become the authority for duplicate content by building links and increasing your PageRank. However, it is usually much less work to simply rewrite the content. If you want your duplicate pages to provide any benefit, we suggest writing unique content for them.

Myth #2 – W3C compliance helps with SEO

We do not feel that validating your pages according to World Wide Web Consortium (W3C) standards will improve rankings. Granted, we cannot say this with 100% certainty since we do not work at any of the search engines. However, it does not make sense that Google or any of the other engines would give a significant ranking preference to valid pages. The engines are much more likely to consider on-page factors, link importance and link relevance.

Please note that our discussion of this search engine optimization myth only applies to standard Web searches done by users at their home or work computers. It is possible that mobile Google searches could be skewed to include mobile-friendly pages, which would include valid XHTML documents. Additionally, we do feel that certain compliance issues can have an effect on search engine rankings. For example, using <br> instead of <br /> on a page wouldn't affect rankings, but having multiple <head> and <body> elements could hurt rankings by preventing some of your content from being indexed.

Myth #3 – There is an ideal keyword density

Many SEO tools claim to be able to determine what keyword density you should use on your landing page to target a particular phrase. Their general methodology is to look at the pages that currently rank well for the phrase and average their densities. Unfortunately, these tools fail to account for off-page SEO factors. Considering that a top-ranking page might rank almost entirely due to off-page factors, such an analysis is gimmicky at best. You should maintain a minimum keyword density on your pages (2% is more than enough in most instances), but there is no particular percentage you should target.
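To make the measurement itself concrete, here is a minimal sketch of how density tools typically compute keyword density: occurrences of the keyword divided by the total word count. The `keywordDensity` function and the sample copy are our own illustration, not any particular tool's code:

```javascript
// Minimal sketch of how keyword density is typically measured:
// occurrences of the keyword divided by the total word count.
function keywordDensity(text, keyword) {
  // Lowercase and split on non-word characters to get individual words.
  const words = text.toLowerCase().split(/\W+/).filter(Boolean);
  const target = keyword.toLowerCase();
  const hits = words.filter((word) => word === target).length;
  return words.length ? (hits / words.length) * 100 : 0;
}

const copy = "We offer SEO services. Our SEO audits cover on-page SEO factors.";
console.log(keywordDensity(copy, "SEO").toFixed(1) + "%"); // 3 of 12 words → "25.0%"
```

Even a number like the 25% in this toy snippet says nothing about where the page will rank, which is exactly why averaging the densities of top-ranking pages is misleading.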

Myth #4 – JavaScript links cannot be crawled

Unfortunately for those of us who like to prevent PageRank dilution, Google now goes out of its way to spider JavaScript-based links and pass PR through them. If you are attempting to implement PR sculpting techniques, you will have to format links that you don't want crawled in a much more complicated way. For example, in the past you could have formatted your links as follows and Google might not have passed PR:

<a href="javascript: go('contact.html')">Contact us</a>

Nowadays, this link would definitely be counted by Google and needs to be hidden in a more complex way. If you would like to format certain links in a way that will not be counted for PageRank, you will have to put them in <iframe> elements or generate them with JavaScript that is defined off the page. No matter which method you choose, you should disallow the <iframe> source file or .js file in your robots.txt file.
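As an illustration of the external-JavaScript approach, the link markup can be generated by a script file that crawlers are told not to fetch. The file and element names below (`/js/nav-links.js`, `contact-link`) are hypothetical:

```html
<!-- In the page: an empty placeholder that an external script fills in. -->
<span id="contact-link"></span>
<script src="/js/nav-links.js"></script>

<!-- /js/nav-links.js (blocked in robots.txt) would contain something like:
     document.getElementById('contact-link').innerHTML =
       '<a href="contact.html">Contact us</a>';
-->
```

The corresponding robots.txt entry would be `User-agent: *` followed by `Disallow: /js/nav-links.js`, so the engines never fetch the script that writes the link.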

Myth #5 – XML sitemaps will improve rankings

We recently discussed XML sitemaps on our blog, but we wanted to mention them here as well. Should you create an XML sitemap? Many people would say yes. However, we do not feel they are worth the effort because they will not improve your site's rankings. An XML sitemap might help get your page indexed, but it won't increase relevancy for any keyword and it will not pass PageRank to your page.
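For reference, a minimal sitemap in the sitemaps.org format looks like the following (the URL and date are placeholders). Note that the format is purely a list of URLs for discovery; nothing in it carries relevancy or PageRank signals:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/contact.html</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```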

Myth #6 – 404 pages are important for SEO

Don't get us wrong. Having a 404 page to account for non-existent pages is important for usability. However, creating a 404 page will not help with rankings. It will direct users back to your site, but the search engines stop once they receive the 404 status code, whether or not there is content on the page. PageRank and linking benefits dead-end regardless. From a search engine optimization standpoint, it would be best to 301 redirect non-existent pages to your home page; unfortunately, that could be somewhat confusing for your visitors. We recommend 301 redirecting any non-existent pages that have external links (as described on our external links page), while all other non-existent URLs should serve your custom 404 page.
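On an Apache server, for example, this recommendation could be implemented with a couple of standard directives in .htaccess; the file paths below are hypothetical:

```apache
# 301 redirect a removed page that still has external links pointing at it
Redirect 301 /old-services.html /index.html

# Serve the custom 404 page for every other non-existent URL
ErrorDocument 404 /404.html
```

`Redirect 301` preserves the linking benefit for the specific URLs you list, while `ErrorDocument` handles everything else with your custom 404 page.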


Offering search engine optimization in the Los Angeles Area - Call us at 805.558.3542 for a free consultation or request a free quote.

© 2015 SEO Cipher. All Rights Reserved | Privacy Statement