There is a fine balance in SEO between focusing on the user experience and catering to search engines. Once a sweet spot is found, the balancing act has to be ongoing because the two are interdependent. Users understand that a great deal goes on behind the scenes of a website to make it amazing, but a certain amount of respect must be earned before they are willing to engage with it. Broken links, malware, slow loading pages and unresponsive designs are just a fraction of the issues that can badly affect the usability and reputation of a website. The same can be said for brick and mortar stores: if a store is untidy and hazardous, or the staff are rude and ignorant, there is very little chance that a customer will buy anything or come back. One of Google’s own philosophies is to “Focus on the user and all else will follow.” For webmasters it advises a similar ethos, favoring websites that provide engaging, as opposed to thin, content.
Here is a hypothetical scenario. With everything above in mind, a web developer goes away and comes up with the most innovative, jaw-dropping, responsive website that has ever existed. It attracts heaps of links, social shares and citations across the web and has even won awards for being so stupidly great. The website owners search for themselves in Google and are shocked to see that they don’t even rank for their own URL. After closer inspection it turns out that search engine bots are being blocked from crawling the website, there are 40 products, it takes 15 seconds to load and it’s hosted in North Korea. For the sake of this example, that is actually possible. It is unbelievable that such simple issues can completely hold back a website, and without a basic understanding of SEO there is no way a person would know what is wrong.
The problems above can be diagnosed and fixed in the following ways:
Firstly, install Google Webmaster Tools and Bing Webmaster Tools. This is very important unless you have the time and patience to trawl through access logs (although using Splunk to view them does make life easier).
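Both tools ask you to verify that you own the site before they show any data. One common route is to add a verification meta tag to the home page’s <head>; the content values below are placeholders that each tool generates for you:
<meta name="google-site-verification" content="YOUR_GOOGLE_TOKEN" />
<meta name="msvalidate.01" content="YOUR_BING_TOKEN" />
Alternative methods such as uploading a verification file or adding a DNS record are also offered, so use whichever fits your setup.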
Check robots.txt
Visit www.yourdomain.com/robots.txt to see whether anything is blocking search engines from crawling your website. The most common mistake webmasters make is this:
User-agent: *
Disallow: /
This will stop all robots from crawling your website; Google may still list the URL in its search results, but it won’t show any content from the page. To fix the problem, change the robots.txt or add a new one like this:
User-agent: *
Disallow:
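If you do need to keep crawlers out of particular sections, block just those paths rather than the whole site. The directory names below are purely illustrative:
User-agent: *
Disallow: /admin/
Disallow: /checkout/
You can test individual URLs against the file in Google Webmaster Tools before relying on it.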
Check Meta NoIndex
Unlike blocking with robots.txt, the HTML meta robots noindex tag stops a page from appearing in the index completely. This can be useful when applied to certain pages that you don’t want indexed at all. It usually appears in the <head> of a page and looks like this:
<meta name="robots" content="noindex, nofollow">
If the website title is not even showing in the search results, this could be why.
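To let the page back into the index, removing the tag is usually enough, since indexing and following links are the default behavior. If you prefer to be explicit, something like this can be used instead:
<meta name="robots" content="index, follow">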
Check for sitewide rel=canonical
In 2009 Google, Yahoo! and Microsoft announced support for rel=canonical. This tag gives webmasters the option to remedy duplicate content issues by stating which version of a page is the most important, which in turn signals to a search engine that it should disregard the lower priority pages in favor of that one. However, if the tag is inserted into a global header it can cause a major problem: every page on the website will be regarded as a duplicate of whatever page is in the tag. This is how it looks in the <head> of a page’s source code:
<link rel="canonical" href="http://www.example.com/about/" />
Google treats the canonical tag only as a hint, so it should not be used as a substitute for permanent redirects. 301 redirects affect both search engine bots and the user experience, in that a URL automatically switches to the new one specified by the webmaster. They also pass some strength from one page to another, which is another reason why they should be the first option. Problems arise when these redirects become chained, which occurs when a page goes from A to B to C to D to E. The user might appear to go straight from A to E without any indication of the URLs in between, but the chain can cause search engine bots to give up crawling, which stops your content from being indexed. For more information about duplicate content and canonicalisation take a look at this post.
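As a rough sketch, a permanent redirect on an Apache server can be declared in the site’s .htaccess file (assuming mod_alias is enabled); the paths here are made up for illustration:
# Send the old URL straight to its final destination in one hop
Redirect 301 /old-about/ http://www.example.com/about/
The key point is to point old URLs directly at the final page rather than chaining them through several intermediate redirects.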
Check for faceted navigation problems
A common issue with larger websites has to do with faceted navigation, where users are able to filter content based on facets such as color, size, price or language. It mostly occurs on eCommerce websites but can also affect other sites where these parameters change the content served to a user. In some cases this can waste a search engine bot’s time and cause it to leave your website. Every search engine bot has a crawl budget for each website, based on factors related to the underlying strength of that site; once that budget is depleted it finishes and moves on to another website. If bots are being sent on wild goose chases through hundreds of irrelevant variations of one item or product, then the other, more important pages miss out. For example, if a page can be ordered alphabetically there is no need to index it twice, because it is the same content in reverse.
This can be fixed in the Configuration > URL Parameters section of Google Webmaster Tools and the Index > URL Normalization section of Bing Webmaster Tools. Here you can find a video of how to configure them for Google.
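Alongside those settings, filtered pages can also point a canonical tag back at the main, unfiltered version of the page. The URLs below are hypothetical, but they show the kind of parameter variations that all serve essentially the same content:
http://www.example.com/shoes/?color=red&sort=price
http://www.example.com/shoes/?sort=price&color=red
http://www.example.com/shoes/?order=alphabetical-reverse
<link rel="canonical" href="http://www.example.com/shoes/" />
Each variation carrying that tag tells search engines to consolidate their attention on the main /shoes/ page rather than crawling every combination of filters.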
Check page load time
PageSpeed Insights by Google allows you to test the speed of a website. It gives you a score out of 100 and pointers on how to improve that score. In 2010, Google incorporated page load time into its ranking signals, so speeding a site up helps both usability and visibility in Google.
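Two of the most common recommendations, compression and browser caching, can often be switched on from an Apache .htaccess file. This is only a sketch and assumes the mod_deflate and mod_expires modules are available on your server:
# Compress text-based responses before sending them (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript
# Let browsers cache static assets for a month (mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
Hosts running nginx or IIS have equivalent settings, so check your provider’s documentation if these directives don’t apply.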
Check the current hosting provider
Another reason why a website is running slow could be the hosting provider. Search for the domain name in Netcraft to see where in the world it is hosted and which other websites are on the same server. Hosting providers with full servers can slow a website down by making it queue up to serve content to a user, and servers that go down a lot also have a serious impact on rankings and usability. Signing up for an account with Pingdom allows you to set up regular checks that notify you about the health of a server as and when an event occurs. If you run a busy blog it is important to know if it goes down, and Pingdom can alert you by SMS if something does happen.
With these bare bones laid out, you can be search engine friendly and focus on creating great content for your users.