Complete Guide to Technical SEO in 2021

Name three things you did this year that pertain to search engine optimization (SEO). Are these tactics based on keywords, meta descriptions, and backlinks?

If so, you are not the only one. When it comes to SEO, these techniques are the first ones usually added to marketers’ arsenal. Although these strategies do improve the visibility of your site in organic searches, they are not the only ones that you should implement, as there is another group of tactics within the vast world of SEO.

Technical SEO

Technical SEO refers to the behind-the-scenes elements that boost your organic growth engines, such as site architecture, mobile optimization, and page speed. These aspects of SEO may not be the most attractive, but they are incredibly important.

The first step to improving your technical SEO is knowing where you stand by conducting an audit of your site. The next step is to create a plan that targets your weakest areas. We will cover both topics in depth in what follows.

What is technical SEO?

Technical SEO refers to everything you do to make your site easier for search engines to index and crawl. Technical SEO, content strategy, and link building strategies work together to help your pages rank higher in search.

Differences between technical SEO, on-page SEO, and off-page SEO

On-page SEO

On-page SEO includes the content that tells search engines and users what your page is about, including image alt text, keyword usage, meta descriptions, H1 tags, URL naming, and internal linking. You have almost complete control over on-page SEO because, well, everything lives on your site.

Off-page SEO

Off-page SEO tells search engines how popular and useful your page is through votes of confidence, most notably backlinks, or links from other sites that point to yours. The quantity and quality of backlinks increase a page's PageRank. All other things being equal, a page with 100 relevant links from trusted sites will outperform another with 50 relevant links from trusted sites (or 100 irrelevant links from trusted sites).

Technical SEO

You can also be in control of technical SEO, but it’s a bit more difficult to master because it’s less intuitive.

What is technical SEO for?

You may be tempted to ignore this SEO component; however, it plays an important role in your organic traffic. Your content may be the most thorough, useful, and best-written, but very few people will see it unless a search engine can crawl it.

It is like a tree falling in the forest when no one is around to hear it: does it make a sound? Without a solid foundation of technical SEO, your content won't make a sound to any search engine.

Let’s talk about how you can make your content resonate all over the internet.

Technical SEO Audit Fundamentals

There are a few fundamentals you need to understand before you begin your technical SEO audit. Let's go over them before we move on to the rest of your website audit.

Audit your preferred domain


Your domain is the URL that people type to get to your site, like hubspot.com. Your website domain influences whether people find you through a search and provides a consistent way to identify your site.

When you select a preferred domain, you tell search engines whether you want the www or non-www version to show up in the results. For example, you can choose www.yoursite.com over yoursite.com. This tells search engines to prioritize the www version and redirects all users to that URL. Otherwise, search engines will treat the two versions as two separate sites, splitting your SEO value between them.

Previously, Google asked you to identify your preferred version of your URL. Now, Google identifies and selects a version to show searchers for you. However, if you do prefer one version of your domain, you can signal it through canonical tags (which we will talk about shortly). Either way, once you set your preferred domain, make sure all variants (www, non-www, http, and index.html) permanently redirect to that version.

Implement SSL

You may have heard this term before, because it matters. SSL, or Secure Sockets Layer, creates a layer of protection between your web server (the software responsible for fulfilling an online request) and a browser, making your site secure. When a user sends information to your site, such as payment or contact details, it is far less likely to be intercepted because SSL protects it. You can spot an SSL-protected domain because it starts with "https://" and shows a lock symbol in the address bar, as opposed to "http://".


Search engines prioritize secure sites. In fact, Google announced back in 2014 that SSL would be considered a ranking factor. For precisely this reason, make sure you set the SSL variant of your home page as your preferred domain.

After you set up SSL, you will need to migrate every non-SSL page from http to https. It's a tall order, but the effort is worth the ranking improvement. These are the steps you should take:

  • Redirect all pages from http://yourwebsite.com to https://yourwebsite.com (a server-level sketch follows this list).
  • Update all canonical and hreflang tags.
  • Update the URLs in your sitemap (located at yourwebsite.com/sitemap.xml) and your robots.txt (located at yourwebsite.com/robots.txt).
  • Set up a new instance of Google Search Console and Bing Webmaster Tools for your HTTPS site and track it to make sure all traffic has migrated.
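
To illustrate the first step, here is a minimal sketch of a server-level redirect, assuming an nginx server and the placeholder domain yourwebsite.com; Apache, IIS, and most CMSs offer an equivalent setting:

    server {
        listen 80;                                        # plain HTTP traffic
        server_name yourwebsite.com www.yourwebsite.com;
        return 301 https://yourwebsite.com$request_uri;   # permanent redirect to the HTTPS version
    }

A single rule like this catches every old http URL and forwards it, with its path intact, to the secure version.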

Optimize page speed

Do you know how long a visitor will wait for your website to load? Six seconds, and that's being generous. Some data shows that the bounce rate increases by 90% when a page's load time stretches from one second to five. You don't have a second to waste, so improving your site's load time should be a priority.

Site speed isn't only important for user experience and conversion; it's also a ranking factor.

Use these tips to improve the average load time of your pages:

  • Compress all your files. Compression reduces the size of your images, as well as your CSS, HTML, and JavaScript files, so they take up less space and load faster.
  • Audit redirects regularly. A 301 redirect adds processing time. Multiply that across several pages or layers of redirects and you will see a major impact on your site's speed.
  • Clean up your code. Messy code can negatively impact your site's speed, and messy code usually means lazy code. It's like writing: maybe in the first draft you make your point in six sentences; in the second, you make it in three. The more efficient the code, the faster the page will load (generally). Once you've tidied things up, you can minify and compress your code.
  • Consider a content delivery network (CDN). CDNs are distributed web servers that store copies of your website in different geographic locations and serve your site based on the visitor's location. Because the information has a shorter distance to travel between servers, your site loads faster for whoever requests it.
  • Don't go overboard with plugins. Outdated plugins often carry security vulnerabilities that make your website susceptible to malicious hackers, which can damage your ranking. Make sure you always use the most up-to-date versions of plugins and keep your usage to the essentials. Along the same lines, consider using custom themes, as pre-made ones often come loaded with unnecessary code.
  • Take advantage of cache plugins. Cache plugins store a static version of your site to send to returning users, decreasing load time on repeat visits.
  • Use asynchronous loading (async). Scripts are instructions that browsers need to read before they can process the HTML, or body, of your web page, i.e. the things visitors actually want to see. Normally, scripts are placed in the <head> of a website (think of your Google Tag Manager script), where they are prioritized over the content on the rest of the page. Using async code means the HTML and the script can be processed simultaneously, reducing lag and improving page load time.
  • This is what an async script looks like: <script async src="script.js"></script> (a fuller sketch follows this list).
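
For context, here is a minimal, hypothetical <head> that contrasts a render-blocking script with an async one (the file names are placeholders):

    <head>
      <title>Your page</title>
      <!-- Blocking: HTML parsing pauses until this script downloads and runs -->
      <script src="analytics.js"></script>
      <!-- Async: the script downloads in parallel and runs when ready, so parsing continues -->
      <script async src="script.js"></script>
    </head>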

Find out where your site falls short in the speed department with the help of this Google tool.

Once you have your technical SEO basics in place, everything is ready to go to the next stage: crawling.

Crawlability

Crawlability is the foundation of your technical SEO strategy. Search bots crawl your pages to collect information about your site.

If these bots are blocked in any way, they will not be able to index or rank your pages. The first step in implementing technical SEO is ensuring that your important pages are accessible and easy to navigate.

Below we will discuss some items that you can add to your list, as well as some aspects that you can audit about your website to ensure that your pages are crawl-ready.

How to improve the crawlability of your website

  1. Create an XML sitemap.
  2. Maximize your crawl budget.
  3. Optimize the architecture of your site.
  4. Establish a URL structure.
  5. Use robots.txt.
  6. Add breadcrumb menus.
  7. Use pagination.
  8. Check your SEO log files.

Create an XML sitemap

Remember the site structure we mentioned? It belongs in something called an XML sitemap, which helps search bots understand and crawl your web pages. Think of it as a map of your site. Once it's complete, you'll submit your sitemap to Google Search Console and Bing Webmaster Tools. Remember to keep it updated every time you add or remove web pages.
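
For reference, a bare-bones sitemap with a single entry looks something like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.yoursite.com/blog/care-dog</loc>
        <lastmod>2021-01-15</lastmod>
      </url>
    </urlset>

Each <url> block lists a page you want crawled; update <lastmod> whenever that page changes.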

Maximize your crawl budget

Your crawl budget refers to the pages and resources on your site that bots will crawl.

Since your crawl budget is not infinite, make sure you prioritize your most important pages for crawling. Here are some tips to make the most of it:

  • Remove or integrate duplicate pages.
  • Fix or redirect any broken links.
  • Make sure your CSS and JavaScript files are crawlable.
  • Review your crawl stats regularly and keep an eye out for sudden spikes and drops.
  • Verify that any bot or page you've disallowed from crawling is actually blocked.
  • Keep your sitemap up to date and submit it to the appropriate webmaster tools.
  • Remove unnecessary or outdated content.
  • Beware of dynamically generated URLs, which can balloon the number of pages on your site.

Optimize your site architecture

Your website has multiple pages. Those pages should be organized to allow search engines to easily find and crawl them. This is where the structure of your site (also known as the information architecture of your site) comes into play.

In the same way that a building is constructed from an architectural design, the architecture of your site is the way you organize its pages.

Related pages are grouped together. For example, your blog’s home page links to individual blog posts, which in turn link to the pages of their respective authors. This structure allows search bots to understand the relationship between them.

Your site architecture should also reflect the importance of individual pages. The closer Page A is to your home page, and the more pages link to Page A, and the more link equity those pages have, the more importance search bots will give to Page A.

For example, a link from your home page to Page A is more relevant than a link from a blog post. The more links there are to Page A, the more “relevant” it becomes to search engines.

Conceptually, a site's architecture forms a hierarchy, with pages like About, Product, and News positioned at the top of the page relevance hierarchy, just below the home page.

Make sure the pages most important to your business sit at the top of that hierarchy with the greatest number of (relevant) internal links.

Establish a URL structure

URL structure refers to the way you structure your URLs, which can be determined by your site architecture. We will explain the connection in a moment. First, let's clarify that URLs can use subdomains, like blog.hubspot.com, and/or subdirectories, like hubspot.com/blog, which indicate where a URL leads.

For example, a blog post titled "How to Take Care of Your Dog" could fall under a blog subdomain or subdirectory. The URL might be www.yoursite.com/blog/care-dog, whereas a product page on the same site would be www.yoursite.com/product/care-dog.

Whether you use subdomains or subdirectories, or "products" instead of "store," is entirely up to you. The beauty of creating your own website is that you set the rules. What matters is that those rules follow a consistent structure, so you shouldn't switch between blog.yoursite.com and yoursite.com/blog on different pages. Create a naming convention, apply it to your URLs, and stick to it.

Here are other tips on how to write your URLs:

  1. Use lowercase.
  2. Use hyphens to separate words.
  3. Keep them short and descriptive.
  4. Avoid using unnecessary characters or words (including prepositions).
  5. Include your target keywords.

Once you have your URL structure well established, submit a list of your important page URLs to search engines in the form of an XML sitemap. Doing so gives search bots additional context about your site, so they don't have to guess while crawling.

Use robots.txt

When a web robot crawls your site, it will first check your /robots.txt file, also known as the Robots Exclusion Protocol. This standard, or protocol, allows or prohibits certain robots from crawling your site, including specific sections or even entire pages. If you want to prevent bots from indexing your site, you will need to use a noindex robots meta tag. Let's talk about both scenarios.

You may want to block certain bots from crawling your site altogether. Unfortunately, some bots have malicious intentions: bots that will scrape your content or flood your community forums with spam. If you notice this bad behavior, use your robots.txt to keep them off your site. In this scenario, think of robots.txt as a force field against the internet's bad bots.

In addition to indexing, search bots crawl your site to gather clues and keywords so they can match your web pages with relevant search queries. But, as we mentioned above, you have a crawl budget that you don't want to spend on unnecessary data. So you may want to exclude pages that don't help search bots understand what your website is about, for example a thank-you page from an offer or a login page.

Your robots.txt file will be unique, depending on what you want it to accomplish.
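
As an illustration, a simple robots.txt might block a hypothetical bad bot entirely and keep well-behaved crawlers away from low-value pages (the bot name and paths below are placeholders):

    # Block a hypothetical misbehaving bot from the whole site
    User-agent: BadBot
    Disallow: /

    # All other crawlers: skip pages that add no search value
    User-agent: *
    Disallow: /thank-you/
    Disallow: /login/

    Sitemap: https://www.yoursite.com/sitemap.xml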

Add breadcrumb menus

Do you remember the story of "Hansel and Gretel", in which two children leave a trail of breadcrumbs on the ground to find their way back home? Well, they were visionaries.

Breadcrumbs are exactly what they sound like: a trail that takes users back to the beginning of their journey on your website. It's a menu that tells visitors how the current page relates to the rest of the site.

Breadcrumbs should be two things:

  1. Visible, so that users can easily navigate your pages without using the "back" button, and
  2. marked up with structured data, so that they give precise context to the search bots that crawl your site (see the sketch after this list).
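
For the second point, one common approach (an option, not the only one) is BreadcrumbList markup in JSON-LD; the names and URLs below are placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.yoursite.com/" },
        { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.yoursite.com/blog/" },
        { "@type": "ListItem", "position": 3, "name": "How to Take Care of Your Dog" }
      ]
    }
    </script>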

Use pagination

Remember when teachers asked you to number the pages of your essays? That's pagination. In the world of technical SEO, pagination plays a slightly different role, but you can still think of it as a form of organization.

Pagination uses code to tell search engines that pages with different URLs are related to each other. For example, you might have a content series that you split into chapters or across several web pages. Use pagination to make it easy for search bots to discover and crawl these pages.

The way it works is fairly straightforward. In the <head> of page one of the series, use rel="next" to tell the bot which page to crawl next. Then, in the <head> of page two, use rel="prev" to point to the previous page and rel="next" for the following one, and so on. A sketch follows below.
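
As a rough sketch (URLs hypothetical), page two of a three-part series would carry both hints in its <head>:

    <head>
      <link rel="prev" href="https://www.yoursite.com/guide/page-1/">
      <link rel="next" href="https://www.yoursite.com/guide/page-3/">
    </head>

Keep in mind that search engines' treatment of these hints has changed over time, so check current documentation before relying on them.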

Check your SEO log files

Think of your log files as journal entries. Web servers (the writer) record and store log data about every action taken on your site in log files (the journal). The data recorded includes the time and date of the request, the content requested, and the requesting IP address. You can also identify the user agent, the uniquely identifiable software (a search bot, for example) that fulfils the request on a user's behalf.

But what does this have to do with SEO?

Well, search bots leave a trail in the form of log files when they crawl your site. You can determine when and what was crawled by reviewing the log files and by filtering by the user agent and search engine.

This information is useful because it lets you determine how your crawl budget is being spent and which barriers a bot runs into when accessing or indexing your pages. To dig into your log files, you can ask a developer or use a log file analyzer, such as Screaming Frog.
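
As a quick, hypothetical illustration, if your server writes a standard access log, you can get a first look at Google's crawler activity from the command line (the log path is a placeholder):

    # List requests whose user agent mentions Googlebot
    grep "Googlebot" /var/log/nginx/access.log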

Just because a search bot can crawl your site doesn’t necessarily mean it can index all of your pages. Let’s take a look at the next layer of your technical SEO audit: indexability.

Indexability

As search bots crawl your website, they start indexing pages based on their topics and relevance to that topic. Once your page is indexed it is eligible to rank on the results pages. Here are some factors that can help your pages get indexed.


Elements of indexability

  • Allow search bots to access your pages.
  • Remove duplicate content.
  • Audit your redirects.
  • Check the mobile responsiveness of your site.
  • Fix HTTP errors.

Allow search bots to access your pages

You probably already took care of this step when we covered crawlability, but it's worth repeating here. You must ensure that bots are sent to your preferred pages and can access them freely. You have a few tools at your disposal. Google's robots.txt Tester will give you a list of pages that are disallowed, and the URL Inspection tool in Google Search Console can help you determine what is blocking a page.

Remove duplicate content

Duplicate content confuses search bots and negatively impacts your indexability. Remember to use canonical URLs to set your preferred pages.
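
A canonical tag is a single line in the <head> of the duplicate page that points to the version you want indexed (URL hypothetical):

    <link rel="canonical" href="https://www.yoursite.com/blog/care-dog/">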

Audit your redirects

Check that all your redirects are set correctly. Redirect loops, broken URLs or (worse yet) inappropriate redirects can cause problems when your site is indexed. To prevent this, audit all your redirects regularly.

Check the mobile responsiveness of your site

If your website is not mobile-friendly yet, you are lagging far behind. In 2016, Google began indexing mobile sites first, prioritizing the mobile experience over desktop. Today, mobile-first indexing is the default. To keep up with this important trend, you can use Google's Mobile-Friendly Test to see where your website needs to improve.
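
Responsiveness mostly comes down to your layout and CSS, but one quick thing to verify is that every page declares a responsive viewport, for example:

    <!-- Tells mobile browsers to render at the device's width instead of a zoomed-out desktop view -->
    <meta name="viewport" content="width=device-width, initial-scale=1">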

Fix HTTP errors

HTTP stands for HyperText Transfer Protocol, but maybe that doesn't matter much to you. What does matter is when HTTP returns errors to your users or to search engines, and what you must do to fix them.

HTTP errors can impede search bots by blocking them from relevant content on your site. Therefore, it is very important to address these errors quickly and thoroughly.

Because each HTTP error is unique and requires a specific solution, the next section has a brief explanation of each one, and you can use the links to learn more about them and their solutions.

  • 301 Permanent Redirects: used to permanently send traffic from one URL to another. Your CMS will allow you to set up these redirects, but too many can slow down your site and degrade the user experience, as each additional redirect adds to a page's load time. Aim for zero redirect chains if possible, as too many will cause search engines to give up crawling that page.
  • 302 Temporary Redirects: a way to temporarily redirect traffic from a URL to a different web page. Although this status code automatically sends users to the new page, the cached title tag, URL, and description stay consistent with the origin URL. If the temporary redirect stays in place long enough, however, it will eventually be treated as a permanent redirect and those elements will pass to the destination URL.
  • 403 Forbidden Messages: mean that the content a user requested is restricted due to access permissions or a server misconfiguration.
  • 404 Error Pages: tell users that the page they requested does not exist, either because it has been removed or they entered the wrong URL. It is always a good idea to create branded 404 pages that encourage engagement so that visitors stay on your site (see the sketch after this list).
  • 405 Method Not Allowed: means that your website's server recognized the access method but still blocked it, resulting in an error message.
  • 500 Internal Server Error: a general error message meaning your web server is having problems delivering your site to the requesting party.
  • 502 Bad Gateway: relates to a communication error, or invalid response, between website servers.
  • 503 Service Unavailable: tells you that, although your server is working correctly, it is currently unable to fulfil the request.
  • 504 Gateway Timeout: means that a server did not receive a timely response from your web server to access the requested information.
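
For the 404 case above, most servers let you point errors at a branded page. A minimal sketch for nginx (the file name is hypothetical), placed inside your server block:

    error_page 404 /custom-404.html;   # serve the branded 404 page
    location = /custom-404.html {
        internal;                      # only reachable through the error, not as a normal URL
    }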

Whatever the reason for these errors, it’s important to address them to keep your users and search engines happy. This will ensure that they return to your site.

Even if your site has been crawled and indexed, access issues that block users and bots will have an impact on your SEO. With that said, we must move on to the next stage of your technical SEO audit: rendering.

Rendering

Before delving into this topic, it is important to emphasize the difference between SEO accessibility and web accessibility. The latter refers to making your web pages easy to navigate for users with disabilities or impairments, such as blindness or dyslexia. Many elements of online accessibility tie in with SEO best practices, but an SEO accessibility audit does not cover everything you would need to make your site friendlier to visitors with disabilities.

In this section we are going to focus on SEO accessibility, or rendering, but always keep web accessibility in mind as you develop and maintain your site.

Rendering elements

An accessible site depends on ease of rendering. Below are the website elements to review for this part of your audit.

Server performance

As you have already learned, timeouts and server errors will cause HTTP errors that prevent users and bots from accessing your site. If you notice that your server is experiencing problems, use the resources already mentioned to fix them. Failing to do so in a timely manner can cause search engines to remove your web page from their indexes, as showing a broken page to users is a bad experience.

HTTP status

As with server performance, HTTP errors will prevent access to your web pages. You can use a web crawler, such as Screaming Frog, Botify, or DeepCrawl, to perform a thorough error audit of your site.

Loading time and page size

If your page takes a long time to load, the bounce rate won't be the only problem you have to worry about. A delay in load time can lead to a server error that blocks bots from your web pages or causes them to crawl partially loaded versions that are missing important sections of content. Bots will only spend so many resources loading, rendering, and indexing a page, based on the crawl demand for that resource, so do everything in your power to reduce your page load time.

JavaScript rendering

Google has admitted that JavaScript (JS) is difficult for it to process and therefore recommends implementing pre-rendered content to improve accessibility. Google also has resources to help you understand how bots access your site's JS and offers tips on fixing search-related problems.

Orphan pages

Every page on your site should be linked from at least one other page (preferably more, depending on how important the page is). A page with no internal links pointing to it is known as an orphan page. Like an article without an introduction, these pages lack the context bots need to understand how they should be indexed.

Page depth

Page depth refers to how many layers deep a page sits in your site structure, that is, how many clicks from your home page it takes to reach. Keep your site architecture as shallow as possible while maintaining an intuitive hierarchy. Sometimes a multi-layered site is unavoidable; in that case, prioritize a well-organized site over shallowness.

Regardless of how many layers your site structure has, keep important pages (such as product and contact pages) no more than three clicks away. A structure that buries your product page so deep in the site that users and bots must play detective to find it is less accessible and provides a poor experience.

For example, a URL like this, meant to guide your target audience to your product page, is a sign of poorly planned site structure: www.yourwebsite.com/product-features/features-by-industry/airline-case-studies/airline-products.

Redirect chains

You pay a price when you decide to redirect traffic from one page to another. That price is crawl efficiency. Redirects can slow down crawling, increase page load time, and render your site inaccessible if they aren't set up correctly. For all these reasons, keep redirects to a minimum.

Once you’ve addressed the accessibility issues, you can move on to how your pages rank in the SERPs.

Ranking elements

Now we come to the elements you probably already have on your radar: how to improve ranking from a technical SEO point of view. Getting your pages to rank involves some of the on-page and off-page elements we've already mentioned, but viewed through a technical lens.


Remember that all of these elements work together to create an SEO-friendly site, so we would be remiss to leave out any contributing factors. Let's dig into them.

Internal and external links

Links help search bots understand where a page fits in the grand scheme of a query and provide context for ranking that page. Links guide search bots (and users) to related content and transfer page importance. In short, linking improves crawling, indexing, and your chances of ranking.

Backlink quality

Backlinks (links from other sites pointing to yours) give your site a vote of confidence. They tell search bots that External Site A believes your page is high quality and worth crawling. As these votes add up, bots notice and treat your site as more credible. However, as with most good things, there is a catch: the quality of those backlinks matters, a lot.

Links from low-quality sites can hurt your rankings. There are many ways to earn quality backlinks, such as reaching out to relevant publications, claiming unlinked mentions, and providing useful content that other sites want to link to.

Content groups

HubSpot has never been shy about its love of content groups or how they contribute to organic growth. Content groups link related content so that search bots can easily find, crawl, and index every page you have on a particular topic. They act as a self-promotion tool, showing search engines how much you know about a topic, which makes them more likely to rank your site as an authority for any related search.

Your ability to rank is the main determinant of organic traffic growth. Studies show that searchers are more likely to click on one of the top three results in the SERPs. But how do you make sure yours is the result that gets the click?

Let’s talk about this with the final piece of the organic traffic pyramid: clickability.

Clickability

Although click-through rate (CTR) has everything to do with searcher behaviour, there are things you can do to improve your clickability in the SERPs. While meta descriptions and keyword-rich page titles do impact CTR, we are going to focus on the technical elements, because that is why you are here.

Click elements

  • Use structured data.
  • Earn SERP features.
  • Optimize for featured snippets.
  • Consider Google Discover.

Ranking and CTR go hand in hand because, let's be honest, searchers want immediate answers. The more your result stands out in the SERPs, the more likely it is to get clicked. Let's review a few ways to improve your clickability.

Use structured data

Structured data uses a specific vocabulary, called schema, to categorize and label the elements on your web page for search bots. Schema makes it clear what each element is, how it relates to your site, and how to interpret it. Basically, structured data tells bots "this is a video," "this is a product," or "this is a recipe," leaving no room for interpretation.

To be clear, using structured data is not a "clickability factor" (if such a thing exists), but it helps organize your content in a way that makes it easier for search bots to understand, index, and potentially rank your pages.
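
To make that concrete, here is a minimal sketch of an Article marked up with JSON-LD, one of the formats search engines accept (all values are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "How to Take Care of Your Dog",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2021-01-15"
    }
    </script>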

Earn SERP features

SERP features, also known as rich results, are a double-edged sword. If you win them and get the click, everything is great. If not, your organic results get pushed down below sponsored ads, answer boxes, video carousels, and the like.

Rich results are elements that don't follow the standard page title, URL, and meta description format of other search results. For example, a SERP might show a video carousel and a "Related Questions" box above the first organic result.

While you can still get clicks by appearing at the top of the organic results, your odds improve considerably with rich results.

How do you increase your chances of winning rich results? Write useful content and use structured data. The easier it is for search bots to understand your site elements, the better your chances of getting a rich result.

Structured data helps the following site elements (and others from the search gallery) appear at the top of the SERPs, increasing the chances of a click:

  • Articles
  • Videos
  • Reviews
  • Events
  • Tutorials
  • FAQ (“Frequently Asked Questions”)
  • Images
  • Local business listings
  • Products
  • Site Links

Optimize for featured snippets

One SERP unicorn that has nothing to do with schema markup is the featured snippet: the box above the search results that gives a concise answer to a search query. Featured snippets are intended to give searchers answers as quickly as possible. According to Google, providing the best answer to a question is the only way to win a snippet.

Consider using Google Discover

Google Discover is a relatively new algorithm that surfaces content by category, specifically for mobile users. It's no secret that Google has been doubling down on the mobile experience; with more than 50% of searches now coming from mobile devices, that's no surprise. The tool allows users to build a library of content by selecting categories of interest (say, gardening, music, or politics).

We believe that topic grouping can increase the likelihood of being listed in Google Discover, and we actively monitor our Google Discover traffic in Google Search Console to test that hypothesis. We recommend that you also invest some time in this new feature. The payoff is a highly engaged user base that has essentially hand-picked the content you've worked so hard to create.

Technical SEO, on-page SEO, and off-page SEO work together to open the door to organic traffic. Although on-page and off-page techniques are commonly implemented first, technical SEO plays a crucial role in positioning your site at the top of search results and your content in front of your ideal audience. Use these technical tactics to complete your SEO strategy and watch the results unfold.

Author: Jaydip

Jaydip Parikh is a digital marketer and the founder of Tej SolPro, an inbound marketing company from Ahmedabad, GJ, India. He writes about blogging, SEO, social media, startup tips, digital marketing insights, and more. His digital avatar is known as Jaydip Baba. You can follow him @jaydipparikh or connect with him on Facebook.
