Where to start with technical SEO

5 years ago

SEO is a complex beast, divided across three main areas: content, technical and off-site. Content and keyword research are often the easiest of the three to implement successfully, with technical SEO often left in the ‘too hard’ basket. Before we move on, check where you’re at against our list of SEO best practices.

The technical side of SEO can seem daunting, but a few tweaks in the right areas can make all the difference to the visibility of your content. It’s not enough to write great content; search engines need to be able to find it easily. Here we break down some simple steps to stay on top of technical SEO.

1. Check your speed

A topic you’ll never hear the end of: your site’s page load speed can have a huge impact on whether or not search engines see your content, and it’s a confirmed ranking factor in the Google algorithm. Use the Google PageSpeed Insights tool to see how your site compares and what you can do to improve it.
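
If you’d prefer to script this check, PageSpeed Insights also exposes a public API. Here’s a minimal Python sketch (using the third-party requests package) that queries the v5 endpoint for a mobile performance score – the site URL is a placeholder, and the response fields assume the standard v5 Lighthouse format:

```python
import requests

# Public PageSpeed Insights v5 endpoint; an API key is optional for light use.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.yoursite.com/", "strategy": "mobile"}  # placeholder URL

data = requests.get(API, params=params, timeout=60).json()

# Lighthouse reports the performance category as a 0-1 score.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```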

2. Get to know robots directives

Accidentally blocking crawlers through misuse of robots directives is a common issue that is often simple to fix. Look out for the following (a quick way to check all three programmatically is sketched after this list):

  • Nofollow and noindex robots tags – These HTML meta tags can be added to specific pages to tell search engines not to include them in search results (noindex) or not to follow the links on them (nofollow). These directives are generally used on larger sites to limit crawl time spent on pages that aren’t beneficial to the reader, but can also accidentally remove vital pages from the index if used incorrectly.
  • X-Robots-Tag – Working on the same principle as the tags above, the X-Robots-Tag is sent in the HTTP response header rather than in the page’s HTML, giving you greater flexibility: it can control a page as a whole or specific elements, including non-HTML files such as PDFs and images that can’t carry a meta tag.
  • Robots.txt file – Predominantly used to tell search engines which parts of a site not to crawl, the directives are similar but are applied at a folder level as opposed to a page level. Go to www.yoursite.com/robots.txt to review yours and use the Robots.txt tester tool in your Search Console (only available in the old version).
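
To see how these three directives fit together in practice, here’s a minimal Python sketch (standard library plus the third-party requests package) that checks each signal for a single page – the URLs are placeholders:

```python
import requests
from urllib import robotparser

url = "https://www.yoursite.com/some-page/"  # placeholder page to audit

# 1. Robots.txt: is crawling of this path allowed at all?
rp = robotparser.RobotFileParser("https://www.yoursite.com/robots.txt")
rp.read()
print("robots.txt allows crawling:", rp.can_fetch("*", url))

# 2. X-Robots-Tag: delivered as an HTTP response header.
resp = requests.get(url, timeout=10)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "not set"))

# 3. Meta robots tag: sits in the page's <head>, e.g.
#    <meta name="robots" content="noindex, nofollow">
# A crude string check; a real audit would parse the HTML properly.
if 'name="robots"' in resp.text.lower():
    print("Page carries a meta robots tag: review its content attribute")
else:
    print("No meta robots tag found")
```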

For more on this topic, you can find great resources on which tags to use and how to implement them over at Yoast and Moz.

3. Understand your response codes

It’s important not only to find the page errors on your site but also to understand what these errors mean in order to fix them correctly. Use Google Search Console to uncover crucial site errors that could be affecting your ranking potential. Some of the most common ones include:

  • 4xx: Client errors – e.g. 404 Not Found – The requested page doesn’t exist, and the error sits with the website. Often due to content being moved or a URL being updated without the correct redirect being implemented.
  • 5xx: Server errors – e.g. 503 Service Unavailable – A server issue is preventing access to the site and needs urgent review. It can be caused by too many people trying to access the site at once, or by maintenance.
  • 3xx: Redirects – e.g. 301 – The page has been redirected permanently. This is the correct redirect to use in the majority of cases, as opposed to a 302, which signals the page has only been moved temporarily and won’t pass link authority to the new page. It’s also worth checking whether crawlers are being sent through multiple redirects before reaching their final destination, as this will have a negative impact on ranking. Try a simple tool like Redirect Path, or see the quick check sketched after this list.
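
To spot redirect chains without a browser extension, a few lines of Python (again using the requests package) will follow the hops for you – the URL is a placeholder:

```python
import requests

url = "http://www.yoursite.com/old-page/"  # placeholder URL to audit
resp = requests.get(url, timeout=10)

# resp.history holds each intermediate response in the redirect chain;
# more than one hop means crawlers are being bounced through multiple
# redirects before reaching the final page.
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", resp.status_code, resp.url)
```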

4. Don’t be browser blind

Chrome is currently the most popular desktop browser, owning close to 80% of the market. Safari also remains huge thanks to the iPhone’s dominance in key markets. But what of other, often-forgotten browsers like Firefox or Internet Explorer? It’s easy to assume they’re less important due to their smaller share of users, but failing to ensure your content renders correctly in ALL browsers is a quick way to lose vital customers. Check your site across all browsers and device types to optimise your traffic potential.

5. Crawl, crawl, crawl

Many of the above checks can be quickly and easily carried out with a crawler tool. Options range from the relatively inexpensive Screaming Frog to more in-depth tools that suit larger sites, such as Deepcrawl. You can even trial some of the all-in-one SEO tools out there for a quick overview of your technical SEO.
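
For a feel of what these tools do under the hood, here’s a deliberately stripped-back Python sketch of a crawler: it walks internal links from a placeholder start URL and records each page’s status code. Real crawlers add robots.txt compliance, rendering, rate limiting and reporting on top.

```python
import requests
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser

START = "https://www.yoursite.com/"  # placeholder start URL

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

seen, queue = set(), [START]
while queue and len(seen) < 50:  # cap the crawl for this demo
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    resp = requests.get(url, timeout=10)
    print(resp.status_code, url)
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue  # only parse HTML pages for further links
    parser = LinkParser()
    parser.feed(resp.text)
    for href in parser.links:
        full = urljoin(url, href)
        if urlparse(full).netloc == urlparse(START).netloc:
            queue.append(full)  # stay on the same domain
```

Whichever tool you use, crawl regularly: new errors creep in every time content is moved or templates change.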
