Three Scary SEO Mistakes to Watch Out For

Posted on October 23, 2018


As Halloween approaches, many people are looking for a good scare. That said, your digital marketing campaign is the last thing you want to get spooked about.

Watch out for these frightening SEO mistakes:

Static URLs

Does your website’s address include “www” before the domain? If so, try loading the site without the “www” in your browser. If not, try adding “www” in front of the domain.

In either scenario, you want all of your URLs to resolve to a single, consistent version. It is not ideal to see both of the following versions of your website’s URLs serving the same page:

yourdomain.com
www.yourdomain.com

If you notice that multiple versions of your website’s URLs exist, you should be frightened!

Why? Because Google’s bots essentially see two unique URLs displaying the exact same content. This can lead to a duplicate content penalty, which can make your website practically invisible in Google.

If that’s not frightening enough, fixing your website’s URLs will not automatically make your website appear in Google searches overnight – it could potentially take months to fully recover from a duplicate content penalty!

Be sure to check all of the following variations of your website’s URLs:

With and without “www”:

yourdomain.com
www.yourdomain.com

HTTP and HTTPS:

http://yourdomain.com
https://yourdomain.com

With and without a trailing slash:

yourdomain.com
yourdomain.com/
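
One quick way to check each of these variations is from the command line, assuming you have curl available; a 301 response with a Location header tells you whether, and where, each version redirects:

# Fetch only the response headers for each variation
curl -I http://yourdomain.com/
curl -I http://www.yourdomain.com/
curl -I https://yourdomain.com/
curl -I https://www.yourdomain.com/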

If you notice that your website has several URL variations displaying the same content, use 301 redirects to point all of them to the single version you want Google to index.
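
As a rough sketch, assuming an Apache server with mod_rewrite enabled and that you have chosen https://www.yourdomain.com as your preferred version, the redirect rules in an .htaccess file could look something like the following (nginx and most CMS platforms offer equivalent settings, and trailing-slash handling is usually configured separately):

# Send every http:// and non-www request to the preferred https://www version
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [R=301,L]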

Image File Names

Redesigning your website can be a great opportunity to improve your web visitors’ navigation experience and boost your SEO performance.

That said, Google’s bots use several hundred factors to determine how visible your website is in organic search. Unless your web developer is working in tandem with an SEO specialist, some of the details that helped your SEO performance before the redesign may well be lost along the way.


One such example is website images – the file names specifically.

If you notice that your image files have been renamed to something generic, such as:

img614536.jpg

This should frighten you. Why? Google looks for keywords in your image file name, and if all of your image files have been renamed such that they no longer include keywords, you could potentially lose a significant amount of SEO authority.

That said, watch out for image file names that have too many keywords. Let’s say you own a hardware store – you may find the following to be a great file name for one of your website images:

top-hardware-store-best-screwdrivers-cheap-power-tools.jpg

This image file name is unlikely to yield any more SEO value than the previous generic example: it is too long and unnaturally stuffed with keywords. Google’s bots will not reward websites for image file names like these.

You should aim for short image file names with one keyword topic. For example:

chicago-hardware-store.jpg

This file name is short and naturally optimized around one keyword topic. File names like these will help boost your website’s SEO performance.
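
For context, the file name is what Google’s bots see when they crawl the image reference in your page’s HTML; the path and alt text below are purely hypothetical examples:

<img src="/images/chicago-hardware-store.jpg" alt="Chicago hardware store">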

Robots.txt Usage

Most websites have a robots.txt file. You can view yours by navigating to yourdomain.com/robots.txt in your browser.

You should be alarmed if your website does not have a robots.txt file in place. Google’s bots use this file to determine what to crawl and what not to crawl on your website. Having one in place is an absolute SEO best practice!
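
If your site is missing one, a minimal robots.txt that lets search engines crawl everything is only two lines; an empty Disallow value means nothing is blocked:

User-agent: *
Disallow: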

That said, a robots.txt file can easily be misused.

Upon viewing your website’s robots.txt file, you never want to see this:

User-agent: *
Disallow: /

The “Disallow: /” line is among the most terrifying SEO mistakes you can find. Why? Because it tells Google’s bots that the entire website is off-limits for crawling.

In other words, “Disallow: /” means your website’s pages will not be crawled and will effectively disappear from Google’s organic search results. As long as that line remains in your robots.txt file, nobody will be able to find your website in Google.

Fortunately, this should be a relatively easy fix: either edit your robots.txt file and delete the “Disallow: /” line, or uncheck the “Discourage search engines from indexing this site” option in your website’s backend (that is the wording WordPress uses under Settings > Reading).
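
Once fixed, a healthy robots.txt usually allows your public pages to be crawled and blocks only private areas. Purely as an illustration for a WordPress site (your paths and sitemap URL will differ), it might look like this:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml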

It is easy to make a mistake on your website that causes lasting SEO damage, and it takes a specialized professional not only to prevent these mistakes, but also to fix them.

Contact us for a website diagnosis today and learn more about the roadblocks standing between you and search engine traffic.