Technical SEO

The Ultimate Guide to SEO


Written by Harry

15th October 2020

Technical SEO is where things can get slightly more complicated. To a degree, technical SEO should be done on every website: you should optimise the site's speed and performance as far as you can in an effort to make it rank highly. There are other parts of technical SEO that I rely on when things aren't going so well. I usually find this to be the case with new clients, as well as clients whose website has recently been redesigned by somebody who didn't stick to SEO best practices when building it. If all the boxes above are ticked, the issue may lie in the areas covered from this point on.

Website Speed


Website speed is very important for SEO. Google doesn't want to deliver a search result that the user then has to wait 10 seconds to load. You may be thinking that 10 seconds doesn't sound long, but the effect page load time has on bounce rate and revenue is huge! Here are some statistics to put it in perspective:
  • 9.6% of visitors bounce when the page speed is 2 seconds.
  • 32.3% of visitors bounce when the page speed is 7 seconds.
  • Users visit an average of 5.6 more pages when a page load time is 2 seconds compared to 8 seconds.
  • + 3% conversions for every second reduced from 15 seconds to 7 seconds.
  • + 2% conversions for every second reduced from 7 seconds to 5 seconds.
  • + 1% conversions for every second reduced from 4 seconds to 2 seconds.
That's around 30% more conversions for bringing your website down to a good loading speed.

Bounce rate – when a user visits your site but leaves shortly afterwards, before navigating to or interacting with other pages.

To optimise the website's load time there are numerous techniques you can try:

Optimising Images

To optimise images I would start by ensuring you have the right format, such as JPEG 2000, JPEG XR or WebP. These often provide better compression!

Once you have the correct format, cut the image down to the right size. An image should not be larger than it needs to be. For example, if you were to place an image in a container with a max width of 300px and a max height of 300px, then your image doesn't need dimensions any larger than 300 x 300.

Once you have set the correct image size, try to compress the image further. There are many tools online that offer image compression for free. Run your image through one of them and hopefully knock a few kilobytes (KB) off it!
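For example, you can serve WebP to browsers that support it while keeping a fallback, using the <picture> element. This is a sketch with placeholder file names:

```html
<!-- Serve WebP where supported, with a JPEG fallback.
     File names here are placeholders for your own images. -->
<picture>
  <source srcset="hero-300x300.webp" type="image/webp">
  <img src="hero-300x300.jpg" width="300" height="300" alt="Hero image">
</picture>
```

Browsers that understand WebP download the smaller file; older browsers fall back to the JPEG automatically.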

Minification

Imagine
If
I
Wrote
My
Blog
Like
This

Reading it in a column like that probably took you a bit longer than if I wrote ‘Imagine if I wrote my blog like this’. So what’s my point?

Well when we code, it’s usually best practice to organise our code using ‘indents’. Here’s an example:
<div class="unordered-list">
    <ul>
        <li>List Point 1</li>
        <li>List Point 2</li>
        <li>List Point 3</li>
        <li>List Point 4</li>
    </ul>
</div>

The code above details a list that will display in viewers' browsers like so:

  • List Point 1
  • List Point 2
  • List Point 3
  • List Point 4

The reason it is best practice to organise code like this is that it keeps things clean, so coders can keep track of where they are, or easily go back and find certain elements they need to edit. We can easily see that the bullet-point list (called an unordered list) is contained in a tag with a class of "unordered-list", with four bullet points contained inside it to be displayed. The same list would be displayed if you wrote it in HTML like so:

<div class="unordered-list"><ul><li>List Point 1</li><li>List Point 2</li><li>List Point 3</li><li>List Point 4</li></ul></div>

This is much harder to read, so coders don’t tend to do this!

The problem with this is that every indent, space and line break is still a character that has to be sent to the browser. A web browser reads the code much like a book: starting from left to right, working down every line until it has fully parsed the page and displayed the content. All of that formatting whitespace adds to the file size without changing what is displayed, so it slows the page down. Our solution to this is minification.

Minification is where we take indented code like the first list example above and collapse it onto a single line, like the second. Just like my column example, a browser will get through the file much faster now that the formatting whitespace is stripped out, rather than the markup being spread over eight lines.
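To show what's actually happening, here's a deliberately naive minifier sketched in Python. Real tools such as Autoptimize handle far more edge cases (like <pre> blocks and inline scripts), so treat this as an illustration of the idea, not a production minifier:

```python
import re

def minify_html(html: str) -> str:
    """Naively minify markup: join adjacent tags and collapse whitespace.

    Illustration only - real minifiers also preserve <pre> content,
    inline scripts and conditional comments, which this does not.
    """
    # Remove the whitespace (newlines, indents) sitting between tags
    html = re.sub(r">\s+<", "><", html)
    # Collapse any remaining runs of whitespace into a single space
    return re.sub(r"\s+", " ", html).strip()

indented = """<div class="unordered-list">
    <ul>
        <li>List Point 1</li>
        <li>List Point 2</li>
    </ul>
</div>"""

print(minify_html(indented))
```

Running this prints the whole list on one line, with the indentation bytes gone.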

How to minify code

Minification is a simple process; you can do it either manually or automatically (using a plugin).

Note: I’d advise taking a back-up of any files you’re planning on minifying. Minification can sometimes break a website depending on how it’s been coded (especially with Javascript).

The Manual Way

To manually minify a website, simply take the code of the pages and run it through a minification site; I would personally use Minify Code. Copy and paste the end product back into the relevant file and re-upload it.

The Automatic Way

The automatic way is to use a plugin (I only know the ones on WordPress). A quick Google search should find the right plugin for you. I've used Autoptimize and never had an issue, so it may be worth checking that one out!

Server Speed

If you have set up a website and worked through my article, hopefully you chose the right hosting provider to give you the fastest server speed. If not, no worries. I'll explain how to test server speed in more detail and uncover whether you have a server issue or an on-site issue.

My first favourite tool for this is Pingdom's Website Speed Test. Run your website through it and navigate to the 'File Requests' table. At the top you should see the initial request sent to the website's server to fetch the page file. Ordinarily, the longest part of this request is the 'wait' section: this is the time the server spends grabbing the file and sending it back to your browser. The faster the server, the shorter the 'wait' time. When I ran my own site through it in August 2018 it came back at just under a 4-second load time, which in my eyes isn't good enough. It's on my list to optimise!

The second way to check is with my other favourite tool: the Lighthouse audit feature in Google Chrome. To get to it, right-click your website and bring up the 'Inspect' tool, then navigate to the 'Audit' tab. Select what you would like tested on the site; for speed tests, ensure that 'Performance' is selected. If your server speed has been flagged, it will appear under the 'Opportunities' section.

Rendering


Rendering is the process of translating code into the visible content that you see on a website. There are numerous types of rendering, but I want to focus on two that are important for SEO:

Bot rendering – how search engine bots think your page looks.

Browser rendering – how a browser actually displays your page to users.

Bot Rendering

When Google crawls a website it attempts to paint a picture of the visuals that a user will see. Provided the site has been set up in a healthy manner and the bot doesn't run into any crawling issues, this picture will closely resemble what a browser sees.

Finding out how the bots are crawling and rendering sites is handy, because it is the crawlers that provide the content of the website for Google to judge and rank accordingly. If these bots miss anything, or don't display the site correctly, it can feed bad signals back to Google, which will take a negative toll on overall rankings. A frustration many SEOs have probably experienced is a website that appears with no issues in a browser but doesn't display properly in a bot render.

There are many reasons why a bot render may display a website differently from a browser. I won't be able to go into all of these issues individually, but I would recommend checking the following:

JavaScript – Of all the search engines, Google's bots are among the best at understanding JavaScript; however, JavaScript is a language made to communicate with browsers. Google's bots are obviously not browsers, so parsing JavaScript can be quite difficult for Google. If JavaScript is used for any aspects of the website's design, you may find those design elements display differently from a browser's take on them.

CSS – I've personally come across a CSS issue where the height of a header was set to 100% with no max-height value set (but it was in a container). This led to Googlebot trying to fill its viewport with an image, which covered the main bulk of the page content. This is one example of how CSS may be misinterpreted when a bot crawls your site. These misinterpretations can cause Google to think there are user-experience issues on the website, resulting in drops in the SERPs.

Robots.txt – When a certain resource type is stopping the crawlers from seeing or finding content, it may be quite tempting to use a robots.txt directive to block the bot from crawling those files. I'm just going to go ahead and bluntly say: don't do it. Blocking bots from parts of your website's code means they won't know what your website actually looks like, and therefore they won't trust it. This will most likely have an effect on your rankings, and I doubt it will be a good one!

Screaming Frog Crawler – Screaming Frog's website crawler gives a whole bunch of useful information that can help identify any crawl issues with a site.

Browser Rendering

The key fact to remember is that a webpage is effectively a file with a bunch of code in it, which web browsers stitch together into the visual webpage that is displayed. Browsers read these files similarly to how we read books – left to right, starting from the top and working down through the page – but I'm jumping ahead a bit. I'll quickly detail how a browser renders a webpage.

Firstly, when you open a webpage you are contacting the server that the domain resolves to (going into this in more detail is a completely different topic). You are sending a request for that server to provide the page of code that the browser needs to read to display the website. This is the same request breakdown used in the 'Server Speed' section of this article: the 'wait' time is how long the browser waits for the server to provide the file it needs to load the page.

Once the server sends back the page of code, the browser begins parsing it and constructing the DOM.

Document Object Model (DOM)

As mentioned before, a webpage is a file. This file can be displayed as a visual webpage with images, styling and functionality, or as raw code; either way, it is the same document. The DOM is a programming interface that represents the construction and positioning of the elements on the page, translating the code into the visuals you see on your screen.

The DOM does this by finding HTML tags and converting them into nodes. When the tags are converted into nodes, each one is listed below its parent tag.

If I haven't confused you enough, then this might. As the browser reads through the lines of HTML it will eventually come across links that pull in other external resources, such as the stylesheet containing the CSS. When such a link is found, the browser requests that linked file from the server and the whole process starts again. This time, the browser creates what is called the CSS Object Model (CSSOM).

The CSSOM is similar to the DOM, but different. The DOM groups together the HTML, then pairs it with the CSSOM to style it and gather an understanding of how it should be rendered. Once the browser has gathered all of the nodes with their corresponding parents and elements, it begins constructing the render tree.

The render tree is a combination of the DOM and CSSOM. This means the browser understands where the elements are, the parent elements they appear under, and their styling (which includes their positioning). The render tree has a similar layout to the DOM, except with all of the nodes combined.

From here we move on to the layout stage. This computes each element's exact position and size within the device's viewport. Once the relevant computations have been made, the 'paint' is carried out: the nodes from the render tree are converted into the pixels displayed on the screen.

Note: I could go into more detail on how this works, but for now this is enough to know from an SEO point of view.
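To make the parent/child idea concrete, here's a minimal sketch using Python's built-in html.parser. It records each element with its depth, mimicking how children sit below their parent in the DOM tree. It's illustrative only: it ignores text nodes and void tags such as <img>.

```python
from html.parser import HTMLParser

class NodeCollector(HTMLParser):
    """Record each element as (depth, tag), the way nodes nest in the DOM.

    Illustration only: text nodes and void tags like <img> are ignored.
    """

    def __init__(self):
        super().__init__()
        self.depth = 0
        self.nodes = []

    def handle_starttag(self, tag, attrs):
        self.nodes.append((self.depth, tag))
        self.depth += 1

    def handle_endtag(self, tag):
        self.depth -= 1

parser = NodeCollector()
parser.feed('<div class="unordered-list"><ul><li>List Point 1</li></ul></div>')
for depth, tag in parser.nodes:
    # Each child prints indented one level below its parent
    print("  " * depth + tag)
```

This prints div, then ul indented beneath it, then li beneath that – the same parent/child shape you see when inspecting the DOM in a browser's dev tools.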

Rendering Recap


The reason I've gone into detail on browser and bot rendering is that an issue with either one can affect SEO. Usually an issue with browser rendering also means the same issue for bots. In the rare circumstance where this is not the case, you should still fix it as soon as possible, because a browser rendering issue can cause an increased bounce rate or other bad user-behaviour signals, all of which can affect SEO.

Render Blocking CSS


When calling for the CSS to build the CSSOM, the browser stops rendering any processed content until the CSSOM construction has finished. This means that while the CSSOM is being built the render is completely blocked, so the website will not appear properly until it is done. Sometimes when loading a website (especially on slow connections) you will witness what is called a 'Flash of Unstyled Content' (FOUC). This is when the browser renders the HTML elements before it has constructed the CSSOM and sprinkled that into the mix.

Optimising the Critical Rendering Path

Now you have an idea of how a browser works, you'll understand the number of jobs that need completing in a very small space of time to display a webpage. In a world where our attention famously doesn't last long, it is important that the render completes as soon as possible. A slow render will also have a negative impact on SEO, so if you don't optimise for the users, at least do it for the rankings!

Measuring the Critical Rendering Path

As with anything in digital marketing, to improve performance you need to measure it. I have two favourite methods for measuring the Critical Rendering Path.

Chrome 41
  1. Download Chrome 41 – at the time of writing, Googlebot uses the same Web Rendering Service (WRS) as Chrome 41.
  2. Right-click on the web page, click 'Inspect', navigate to 'Network' and refresh the page – this will flag up any issues retrieving elements on the page.
  3. Do the same with 'Timeline' – this will show how the DOM was formed and give details of the page's render path.
Chrome 41 is handy because any rendering issues it detects in the browser will most likely apply to Googlebot too.

Lighthouse Audit

In newer versions of Chrome, under the 'Audit' tab, there's a Lighthouse integration where you can run an audit of the website. This audit can display many different metrics, but to ensure that you're measuring rendering, have the 'Performance' tickbox ticked. When the audit is complete, you should be able to see anything flagged that affects the website's load or render time. The changes you make can be instantly re-tested, which is always refreshing to hear in the SEO world.

Common issues that affect render times

Lazy Loading

Lazy loading is a technique that defers the loading of non-critical resources while the webpage is loading. These non-critical resources are usually images that appear below the fold. This is beneficial because, without lazy loading, if you loaded an image-heavy homepage but instantly used it to navigate to an inner page, the browser would load a whole bunch of images that you never even looked at. This uses bandwidth, increases the browser's load time, and uses up additional battery, data and other system resources.

The solution is to enable lazy loading for off-screen images: an image won't begin loading until the bottom of your viewport is within a set number of pixels of it. To lazy load images on WordPress you can either use a plugin or do it the manual way.
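Modern browsers also support native lazy loading via the loading attribute. A minimal sketch, with a placeholder file name:

```html
<!-- The browser defers fetching this image until the user scrolls near it.
     The file name is a placeholder for your own below-the-fold image. -->
<img src="below-the-fold.jpg" loading="lazy"
     width="600" height="400" alt="Gallery photo">
```

Setting explicit width and height alongside it also stops the page layout jumping around when the image eventually loads.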

Defer Unused CSS

A browser has to find and read all external CSS stylesheets before it can display any content. If a browser displayed the HTML before understanding the CSS, you would find that almost every website displayed a FOUC before snapping into place!

The CSS needed to display the initial content of the page is called 'critical CSS'. The optimal approach is to inline the critical CSS into the <head> of the HTML. You can do this by adding <style> tags and placing the relevant CSS within them. You can also look at splitting your stylesheets into different files organised by media queries, then adding a media attribute to each stylesheet link. When loading a page, the browser will only block the first paint to retrieve the stylesheets that match the user's device.

Uncritical CSS is the CSS the page might need later. For example, clicking a button might cause a modal to appear; the CSS for this is uncritical because the button can't be clicked until the initial render is complete, so the modal will never be displayed when the page first loads.

Detecting critical CSS – Bring up the Inspect tool in Google Chrome and open the 'Coverage' tracker (if you can't find it, use the options menu at the bottom to bring it up). Click the reload-style icon to run a refresh and log the requests. The metrics at the bottom will tell you how much of each file is used and unused.
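A sketch of what that looks like in practice – the styles and file names here are placeholders:

```html
<head>
  <!-- Critical CSS inlined so the first paint is not blocked -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .hero { min-height: 60vh; }
  </style>
  <!-- Stylesheets split by media query: only the one matching the
       user's device blocks the first paint -->
  <link rel="stylesheet" href="print.css" media="print">
  <link rel="stylesheet" href="mobile.css" media="(max-width: 600px)">
</head>
```

The browser still downloads the non-matching stylesheets, but at a low priority and without blocking the render.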

Render Blocking Scripts

Just like CSS, if JavaScript is needed to render certain resources on the page then it too can block the render. Google recommends the following to optimise the render time:
  • Consider inlining critical scripts in your HTML.
  • For non-critical scripts, consider marking them with the async or defer attributes.
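A sketch of what that markup looks like, with placeholder script names: async scripts download in parallel and run as soon as they're ready (order not guaranteed), while defer scripts download in parallel but run in order once the HTML has been parsed.

```html
<!-- async: run as soon as downloaded; fine for independent scripts -->
<script src="analytics.js" async></script>
<!-- defer: run in document order after HTML parsing; safer for scripts
     that touch the DOM or depend on each other -->
<script src="app.js" defer></script>
```

Either attribute stops the script from blocking the parser, which is what makes the render faster.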
A big thank you to https://developers.google.com for helping me put this content into simpler terms.

HTTP/2


HTTP stands for Hypertext Transfer Protocol. It is the protocol that defines the actions web servers and browsers should take in response to certain commands. When you enter a URL into your browser, this sends a command to the website's web server directing it to provide the web file for the browser to read and display the page.

Nowadays the web standard is a transfer protocol called HTTP/2, which supersedes its HTTP/1.1 predecessor. HTTP/2 is designed to make sites faster, simpler and more robust. The best news is that HTTP/2 is backwards compatible with HTTP/1.1.

How to switch to HTTP/2

The switch has to be made server-side. If your server doesn't currently serve the website over HTTP/2, I'd advise transferring your website to another provider that can offer it, or getting in touch with your current host to discuss whether the switch is going to be made.

Status Codes


Whenever you load a webpage, the HTTP request sends back a response code. This response code gives insight into the status of the response from the server. The main classes are listed below:

2XX’s

A 2XX status means the request has gone through with no issues or redirects.

200 – Standard response for successful HTTP requests. The actual response will depend on the request method used: in a GET request, the response will contain an entity corresponding to the requested resource; in a POST request, it will contain an entity describing or containing the result of the action.
201 – The request has been fulfilled, resulting in the creation of a new resource.
202 – The request has been accepted for processing, but the processing has not been completed. The request might or might not eventually be acted upon, and may be disallowed when processing occurs.
203 – The server is a transforming proxy (e.g. a web accelerator) that received a 200 OK from its origin, but is returning a modified version of the origin's response.
204 – The server successfully processed the request and is not returning any content.
205 – The server successfully processed the request, but is not returning any content. Unlike a 204 response, this response requires that the requester reset the document view.
206 – The server is delivering only part of the resource due to a range header sent by the client. The range header is used by HTTP clients to resume interrupted downloads, or to split a download into multiple simultaneous streams.
207 – The message body that follows is by default an XML message and can contain a number of separate response codes, depending on how many sub-requests were made.
208 – The members of a DAV binding have already been enumerated in a preceding part of the (multistatus) response, and are not being included again.
226 – The server has fulfilled a request for the resource, and the response is a representation of the result of one or more instance-manipulations applied to the current instance.

(Source: https://en.wikipedia.org/wiki/List_of_HTTP_status_codes)

3XX’s

Whenever a response code begins with a 3 it usually indicates a redirection. The main ones are:

301 – A 301 redirect indicates that a webpage has been permanently moved to a new URL. It sends any user trying to access the redirected page to the new page, and also indicates to Google that it should transfer any SEO equity over to the new page – massively helping with rankings.
302 – A 302 indicates that the resource has moved temporarily; it tells the client to browse to another URL, while search engines keep the original URL as the one to index.
307 – A 307 is the HTTP/1.1 successor to the temporary 302 redirect. It sends the user to the new URL when they request the old one, but because it is explicitly temporary, the SEO equity stays with the original URL.

4XX’s

4XX error codes usually indicate a client-side error with the request for the page.

400 – Bad request. This is normally due to bad syntax or the inclusion of characters that the server can't understand.

401 – Unauthorised. Similar to a 403, except the response includes a WWW-Authenticate header allowing you to authenticate to access the page.
402 – Payment required. This code is reserved and rarely used in practice.
403 – Forbidden. The resource is restricted and the server is refusing to serve it over HTTP.
404 – Not found. Either the page has been removed or the URL has been typed incorrectly.

5XX’s

5XX errors usually indicate an issue with the server. The server pings back a 5XX error when it is aware that it is incapable of carrying out the command.

500 – Internal server error, where the server can't carry out the client's request.
502 – The server responds with '502 Bad Gateway' when it receives an invalid response while attempting to process the request.
503 – Service unavailable, meaning the server is unable to handle the request due to a temporary overload or server maintenance. If the issue hasn't resolved after some time, treat it as a 500 error.
504 – The gateway has timed out. While the server is acting as a gateway it did not receive a timely response from the requested URL. This can be down to a DNS issue.

.htaccess file


The .htaccess file is a configuration file that tells the web server (typically Apache) how to handle requests from browsers and bots loading or crawling a website. The file sits in the site's root directory on the server. Developers can edit this file to perform a large variety of different functions.

Apache Redirects

I touched earlier in this article on the ability to set up forwarding to HTTPS using the .htaccess file. I personally also use the .htaccess file to set up redirects. This can be done by simply inputting the following into the .htaccess file:

Redirect 301 /slug-of-url-to-redirect https://www.example.com/full-address-to-redirect-to

So you simply put the path that follows the domain at the beginning of the redirect, followed by the full address you want to redirect to. For a temporary redirect, simply change '301' to '302'.
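For reference, the HTTPS forwarding mentioned above is commonly done with mod_rewrite rules like the sketch below. This assumes an Apache server with mod_rewrite enabled – check with your host before relying on it:

```apache
# Force HTTPS for all requests (assumes mod_rewrite is enabled)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The R=301 flag makes it a permanent redirect, so search engines transfer equity to the HTTPS version.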

GZIP Compression

GZIP compresses your webpages and stylesheets before sending them over to a browser. This reduces the transfer time, allowing the page to load and render much faster. It works by locating similar strings in a text file and replacing them to make the file size smaller. This works well because CSS and HTML files use a lot of repeated text (such as the tags that make up the nodes in the DOM) and also contain a lot of white space. GZIP compresses these common strings, which can drastically reduce the size of pages.

Enabling GZIP

How you enable GZIP depends on the sort of server your website runs on. I am not a server master, but I find that a lot of my clients run on an Apache server, Linux or IIS, and adding GZIP compression rules to the .htaccess file seems to work just fine for them.

Testing if GZIP is present

Numerous online 'SEO checkers' will flag whether GZIP compression is present on a site. If that fails, simply type 'GZIP compression checker' into Google and you'll find plenty of tools to run your site through.
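As a sketch of what those .htaccess rules can look like on an Apache server (this assumes mod_deflate is available – confirm with your host first):

```apache
# Compress text-based responses before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```

Only text-based types are listed because already-compressed formats like JPEG and WebP gain little from GZIP.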

What else?

There are many other things you can do with a .htaccess file, such as blocking access, custom 404 pages and much more. The two listed above are the ones I personally use for SEO reasons.

Schema Markup


Schema markup (or just Schema) is a markup language that makes it easier for search engines to understand content, and it can directly contribute to how a result appears in the SERPs. It is commonly referred to as 'structured data' because that's exactly what it does: it structures data. Let me show you a quick example.

Lost and Founder is an awesome book by Rand Fishkin (of whom I am a big fan – could you tell?). When conducting a Google search for this book I am greeted by the following results:

Check out the last one, with the star rating next to it. Google has to pull that star rating from somewhere, so let's take a look at how it's done that.

Clearly, Google is pulling the review rating from here, so there's a good chance there's some Schema behind this! And here it is:

The highlighted code tells search engines that this is a star rating, and it's one of the many types of Schema that help search engines understand page content and potentially display it in the SERPs. Obviously there are more functions for Schema, which can be found elsewhere in the website's source code:

In this case we can see that the Schema is also helping search engines pick up the author's name, the fact that the page is referring to a book, and the book's name. Handy stuff!

The Schema website provides a good example of how Schema helps search engines, using the example 'Avatar'. When a website contains text referring to the word 'Avatar' in title tags and page content, search engines can find it difficult to work out whether the page is discussing the movie or a form of profile picture. In this case, the Schema markup explains to search engines that the article is actually about the movie. See the full example here: https://schema.org/docs/gs.html

When playing around with Schema I use Google's Structured Data Markup Helper. This renders a page for you, allowing you to highlight the content you want to mark up and automatically generating the code for you to add to your existing website files.
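As an illustration, a book result with a star rating is often marked up with JSON-LD like the hypothetical snippet below. The figures and counts here are made up for this example, not taken from any real listing:

```html
<!-- Hypothetical JSON-LD structured data for a book with a star rating -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Book",
  "name": "Lost and Founder",
  "author": { "@type": "Person", "name": "Rand Fishkin" },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
</script>
```

JSON-LD sits in one self-contained block in the page, which is why it's generally easier to maintain than marking up individual HTML elements.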
