Technical SEO is the work that makes your website easy for search engines to crawl, render, understand and index. If content is what people read, technical SEO is what Googlebot has to deal with first. When the technical foundations are weak, rankings become harder to earn and easier to lose.
A page has to be discoverable (crawled), eligible (not blocked or noindexed), and understood (rendered and interpreted correctly) before it can perform.
This handbook goes deeper than a typical “checklist”. Every term and technique is explained, with UK-relevant implementation advice and real examples.
Technical SEO includes any optimisation that changes how search engines access and process your site. The main areas are:
- Crawlability: robots.txt, XML sitemaps and crawl budget
- Indexability: meta robots directives and canonical tags
- Site architecture: URLs, internal linking and breadcrumbs
- Performance: Core Web Vitals and the mobile experience
- Duplication control: parameters, pagination and faceted navigation
- Understanding: structured data and JavaScript rendering
- Infrastructure: status codes, redirects, HTTPS and hreflang
A useful mental model is that technical SEO reduces friction. It removes barriers between your content and the search engine systems evaluating it.
Before you fix technical issues, you need to know what you are fixing for.
Crawling is when a search engine bot requests a URL and downloads the resources it is allowed to access. The bot is typically “Googlebot” for Google.
Crawling is influenced by:
- robots.txt rules that allow or block paths
- Internal links that expose URLs to crawlers
- XML sitemaps that aid discovery
- Server speed and reliability
- How much crawl budget the site warrants
Rendering is when Google processes HTML, runs JavaScript (if needed), and builds a view of the page similar to what a user’s browser would see. Google can render JavaScript, but it may happen later than the initial crawl, which is why JavaScript-heavy sites can see delays in indexing or missing content if key elements only appear after scripts run.
Indexing is when Google stores information about a URL in its index so it can be retrieved for ranking. If a URL is crawled but not indexed, there is always a reason. Common reasons include “noindex”, duplication, poor quality signals, soft 404s, or blocked resources that stop proper rendering.
Ranking is how Google orders indexed results for a query. Technical SEO does not guarantee rankings. What it does is ensure your best pages are eligible, accessible, and interpretable, so your content and authority signals can actually compete.
If Googlebot cannot reach your important pages, those pages are invisible in organic search. The video below explains this in simple terms.
A robots.txt file is a plain-text file that sits at the root of your domain and provides instructions to crawlers about which paths they are allowed to crawl.
Key terms you will see in robots.txt:
- User-agent: which crawler the rules that follow apply to (for example, Googlebot, or * for all bots)
- Disallow: a path the crawler should not request
- Allow: an exception that re-permits a path inside a disallowed directory
- Sitemap: the location of your XML sitemap
Important meaning: robots.txt controls crawling, not indexing. A disallowed URL can still be indexed (usually without a description) if other pages link to it, so robots.txt is not a reliable way to keep a page out of search results.
Real example using StudioHawk UK
Your key commercial pages (for example, https://studiohawk.co.uk/seo-services/technical-seo/ and https://studiohawk.co.uk/contact/) are pages you would typically want fully crawlable. A common technical mistake is accidentally disallowing a directory that contains service pages after a development change. That turns SEO off at the switch without anyone noticing until traffic drops.
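To make the mechanics concrete, here is a minimal illustrative robots.txt. The disallowed path is hypothetical; only block directories you have verified contain no valuable pages:

```text
# Applies to all crawlers
User-agent: *
# Hypothetical: keep bots out of internal site-search results
Disallow: /search/

# Help crawlers find the sitemap
Sitemap: https://studiohawk.co.uk/sitemap.xml
```

Everything not disallowed is crawlable by default, which is why one stray Disallow line over a service directory can be so damaging.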
An XML sitemap is a machine-readable list of URLs you want search engines to discover and prioritise. It does not force indexing. It improves discovery and helps Google understand your preferred canonical URLs, especially on larger sites.
Key terms:
- XML sitemap: the machine-readable URL list itself
- Sitemap index: a "sitemap of sitemaps" used to split large sites into manageable files
- The per-URL tags (<loc>, <lastmod> and so on) explained below
Best practice:
- Include only canonical, indexable URLs that return a 200 status
- Keep each sitemap under 50,000 URLs and 50MB uncompressed, splitting with a sitemap index if needed
- Reference the sitemap in robots.txt and submit it in Google Search Console
- Keep <lastmod> accurate rather than auto-stamping every URL on every deploy
Example of what search engines like Google expect from your XML sitemap:
<?xml version="1.0" encoding="UTF-8"?> declares the XML format and encoding. It sits at the top of the file and tells search engines how to read the sitemap correctly.
<urlset> is the main container that holds all URLs inside the sitemap. It also includes the sitemap protocol namespace so crawlers understand the structure.
<url> represents one individual page entry. Every URL you want indexed should sit inside its own <url> block.
<loc> contains the full canonical URL of the page. It must be an absolute URL (including https://) and should match the version you want search engines to index.
<lastmod> shows the date the page was last updated. This helps search engines decide when a page might need to be crawled again, especially on large or frequently updated sites.
<changefreq> gives a general hint about how often a page changes, such as daily, weekly or monthly. It is only a suggestion and does not directly affect rankings.
<priority> indicates the relative importance of a page compared to other URLs on the same website, using a value from 0.0 to 1.0. It helps with crawl prioritisation within the sitemap but does not influence SEO rankings.
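Putting those elements together, a minimal single-URL sitemap might look like this (the date, frequency and priority values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://studiohawk.co.uk/seo-services/technical-seo/</loc>
    <lastmod>2025-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```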
Real example using StudioHawk UK
StudioHawk UK has a mixture of service pages (for example, https://studiohawk.co.uk/seo-services/ecommerce-seo/ and https://studiohawk.co.uk/seo-services/on-page-seo/) and blog content (for example, https://studiohawk.co.uk/blog/introduction-to-learning-technical-seo/). In a best-practice setup, both content types appear in sitemaps, but you may split them into separate sitemaps (services sitemap, blog sitemap) for cleaner diagnostics and easier monitoring in Google Search Console.
Crawl budget is the combination of:
- Crawl capacity: how much crawling your server can handle without degrading
- Crawl demand: how much Google wants to crawl your URLs, driven by popularity and freshness
For most small to mid-sized UK sites, crawl budget is not the first problem. It becomes relevant when:
- The site has a very large number of URLs (typically hundreds of thousands or more)
- Parameters or faceted navigation generate large volumes of near-duplicate URLs
- Important pages change frequently but are recrawled slowly
- Server performance limits how quickly bots can fetch pages
If you are seeing Googlebot spending time on low-value URLs while important pages update slowly, then crawl budget becomes very real.
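Server logs show this directly. A rough sketch using standard shell tools (the log file and its contents are hypothetical; real logs vary by server configuration, but in the common combined log format field 7 is the requested path):

```shell
# Hypothetical access-log excerpt: where is Googlebot spending its requests?
cat > access.log <<'EOF'
66.249.66.1 - - [10/May/2025:10:00:01 +0000] "GET /seo-services/technical-seo/ HTTP/1.1" 200 5120 "-" "Googlebot/2.1"
66.249.66.1 - - [10/May/2025:10:00:02 +0000] "GET /?sort=price HTTP/1.1" 200 4800 "-" "Googlebot/2.1"
66.249.66.1 - - [10/May/2025:10:00:03 +0000] "GET /?sort=price HTTP/1.1" 200 4800 "-" "Googlebot/2.1"
203.0.113.5 - - [10/May/2025:10:00:04 +0000] "GET /contact/ HTTP/1.1" 200 3000 "-" "Mozilla/5.0"
EOF

# Count Googlebot requests per URL, most-crawled first
grep "Googlebot" access.log | awk '{print $7}' | sort | uniq -c | sort -rn
```

In this toy sample, the parameterised URL is fetched more often than the service page — exactly the pattern that signals crawl waste.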
Indexability is where many “invisible” problems live. A page can be crawlable but not indexable.
A robots meta tag is a tag in a page’s HTML header that tells crawlers how to treat that page. The most important directive is noindex, which tells Google not to store the page in the index.
Key terms:
- noindex: do not store this page in the index
- nofollow: do not follow the links on this page
- index, follow: the defaults; they never need to be declared
- X-Robots-Tag: the HTTP-header equivalent, useful for non-HTML files such as PDFs
Practical meaning: a noindex page must remain crawlable for Google to see the directive. If you block a noindexed URL in robots.txt, Google cannot read the tag and the URL may stay indexed.
Real example using StudioHawk UK
Your contact page (e.g. https://studiohawk.co.uk/contact/) is typically an indexable page for branded and commercial intent searches. If it were accidentally set to noindex after a template update, you would likely see:
- The page dropping out of Google's index over the following days or weeks
- Branded searches no longer returning the page
- The URL reported as "Excluded by 'noindex' tag" in Google Search Console
This is why indexability checks should be part of release QA.
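For reference, this is what an accidental noindex looks like in a template. It is a single line in the page's <head>, which is exactly why it is easy to miss (the page title here is a placeholder):

```html
<head>
  <title>Contact Us | Example</title>
  <!-- One line that removes the page from search results -->
  <meta name="robots" content="noindex, follow">
</head>
```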
A canonical tag is a link element in the HTML head that tells search engines which URL is the preferred version when multiple URLs have similar or identical content.
For example: <link rel="canonical" href="https://example.com/services/seo/" />
Example in context (inside the <head> section):
<head>
<title>SEO Services | Example</title>
<link rel="canonical" href="https://example.com/services/seo/" />
</head>
Key terms:
- Canonical URL: the version of a page you want indexed and ranked
- Self-referencing canonical: a page pointing at its own URL, confirming it is the preferred version
- Duplicate or variant URLs: parameterised, tracked or alternate versions that should consolidate to the canonical
Canonical tags are hints, not absolute commands. Google may choose a different canonical if signals strongly suggest another URL is better.
Canonicals are essential for:
Why it matters
Without canonicals, Google can index multiple versions of the same page, splitting relevance and internal authority across duplicates. This weakens ranking potential and bloats the index.
Architecture is often treated as “content planning”, but it has direct crawl and index consequences.
A URL is not just a string. It is a signal of hierarchy, intent and page purpose.
Key terms:
- Slug: the final, human-readable part of the URL path
- Subfolder (directory): a path segment that groups related pages
- Parameters: query-string additions after a "?"
A high-quality URL structure tends to be:
- Short, lowercase and hyphenated
- Descriptive of the page topic
- Consistent in hierarchy (for example, /seo-services/technical-seo/)
- Stable over time, so links and rankings are not lost to unnecessary redirects
Real example using StudioHawk UK
Your service URLs follow a clean folder structure. For example, https://studiohawk.co.uk/seo-services/technical-seo/ clearly communicates page type and topic. This is the type of structure that makes it easier to maintain internal linking, navigate analytics, and scale content without creating a mess of ungrouped pages.
Internal linking is how pages on your own site link to one another.
Key terms:
- Anchor text: the clickable words of a link, which describe the target page
- Orphan page: a page with no internal links pointing to it
- Link equity: the authority passed between pages through links
- Crawl depth: how many clicks a page sits from the homepage
Practical meaning: every important page should be reachable within a few clicks of the homepage, linked with descriptive anchor text, and no priority page should be orphaned.
Real example using StudioHawk UK
A strong pattern is linking between relevant services and supporting blog content. If a visitor is on https://studiohawk.co.uk/seo-services/technical-seo/, internal links to relevant guidance articles (and vice versa) help users and search engines understand topical depth. The technical benefit is that it reduces orphaning and improves crawl efficiency.
Breadcrumbs are navigation links that show the user’s position in the site hierarchy (Home > Services > Technical SEO).
Benefits:
- Users always know where they are in the site hierarchy
- Every page gains consistent internal links back up to its parent sections
- With BreadcrumbList structured data, Google can show the breadcrumb trail in search results
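Breadcrumbs can also be declared as BreadcrumbList structured data inside a <script type="application/ld+json"> tag, which makes the trail eligible to appear in search results. A minimal sketch of a Home > Services > Technical SEO trail (names and URLs are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://studiohawk.co.uk/" },
    { "@type": "ListItem", "position": 2, "name": "Services", "item": "https://studiohawk.co.uk/seo-services/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
```

The final item can omit "item" because it is the current page.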
Performance, Core Web Vitals, and what Google actually measures
Speed is not one metric. It is a set of user experience measurements.
Core Web Vitals are a set of metrics Google uses to measure real-world page experience for loading performance, interactivity, and visual stability of the page.
The three Core Web Vitals metrics are:
- Largest Contentful Paint (LCP): how quickly the main content loads; aim for 2.5 seconds or less
- Interaction to Next Paint (INP): how quickly the page responds to user interactions; aim for 200 milliseconds or less
- Cumulative Layout Shift (CLS): how much the layout shifts while loading; aim for 0.1 or less
When you measure performance, you will see two data types:
- Lab data: collected in a controlled test environment (for example, Lighthouse); useful for debugging
- Field data: collected from real Chrome users via the Chrome UX Report; this is what Google's Core Web Vitals assessment uses
Practical meaning: lab and field results can disagree. A page that looks fast in Lighthouse can still fail the assessment if real users on slower devices and connections have a worse experience, so prioritise field data.
If you need to fix Core Web Vitals, the fixes usually come from a handful of techniques. Here is what each term means:
- Image optimisation: compressing images and serving modern formats such as WebP or AVIF
- Lazy loading: deferring offscreen images and embeds until they are about to be seen
- Minification: stripping unnecessary characters from CSS and JavaScript files
- Deferring render-blocking resources: loading non-critical CSS and JavaScript after the main content
- Caching and CDNs: serving assets from fast, geographically close servers
- Reserving space for images, embeds and ads (explicit dimensions) so the layout does not shift
These are not “nice-to-haves”. They are directly tied to user experience and, for competitive queries, can be the difference between page one and page two.
Mobile-first indexing means Google primarily uses the mobile version of the content for indexing and ranking.
Practical meaning: if content, internal links or structured data exist on desktop but not on mobile, Google effectively does not see them.
When you audit mobile, look at:
- Content parity: the mobile page contains the same content, links and structured data as desktop
- Usability: readable font sizes, adequate tap targets, no horizontal scrolling
- Intrusive interstitials or pop-ups that block content
- Performance on real devices and mobile networks
Always check that the mobile version of your website is optimised and functions as it should.
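A quick first check is the viewport meta tag. Without it, mobile browsers render the page as a zoomed-out desktop layout, which fails basic mobile usability:

```html
<!-- Tells mobile browsers to render at device width, not desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```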
Duplication is often created by systems, not people.
A URL parameter is anything after a “?” in a URL (for example, ?sort=price or ?utm_source=newsletter).
Parameters are used for:
- Sorting and filtering (e.g. ?sort=price)
- Tracking (e.g. ?utm_source=newsletter)
- Search queries, session IDs and pagination
The risk is that parameters can create many URLs that show the same or near-identical content. That can lead to index bloat, wasted crawl activity, and diluted relevance.
Pagination is when a list of items spans multiple pages (Page 1, Page 2, Page 3).
Best practice is typically:
- Give each paginated page a self-referencing canonical; do not canonicalise every page to page 1
- Link pages together with normal, crawlable <a> links
- Keep paginated pages indexable so the items they link to can still be discovered
Faceted navigation is common on e-commerce sites and allows filtering by attributes (size, colour, price, brand). Each filter combination can create a new URL.
A crawl trap is when bots can generate near-infinite URL combinations through filters and sorts. This wastes crawl capacity and can flood the index with thin duplicates.
Controls include:
- Blocking low-value filter combinations in robots.txt
- Canonicalising filtered URLs to the main category page where content is near-identical
- Applying noindex to thin filter combinations that must remain crawlable
- Avoiding crawlable links to unlimited filter permutations in the first place
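As a sketch, robots.txt rules that stop crawlers exploring sort and filter parameters might look like this. The parameter names are hypothetical; audit your own URLs before blocking anything, because a disallowed URL cannot pass signals through a canonical:

```text
User-agent: *
# Hypothetical filter/sort parameters on category listings
Disallow: /*?*sort=
Disallow: /*?*colour=
Disallow: /*?*size=
```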
Structured data is code added to a page that describes entities and properties in a standard format. The most common format is JSON-LD, which Google recommends in its structured data documentation.
Below is a JSON-LD Blog Posting Schema example
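The exact properties vary by site; this is an illustrative sketch with placeholder dates, using a StudioHawk blog URL mentioned earlier. It would sit inside a <script type="application/ld+json"> tag:

```json
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Introduction to Learning Technical SEO",
  "url": "https://studiohawk.co.uk/blog/introduction-to-learning-technical-seo/",
  "datePublished": "2025-01-15",
  "dateModified": "2025-02-01",
  "author": { "@type": "Organization", "name": "StudioHawk" },
  "publisher": { "@type": "Organization", "name": "StudioHawk" }
}
```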
Practical meaning: structured data helps Google understand what a page is about and can make it eligible for rich results. It does not guarantee them, and it must accurately describe the visible content.
Real example using StudioHawk UK
Pages like https://studiohawk.co.uk/seo-services/technical-seo/ are good candidates for Organisation and Service-related structured data, plus breadcrumbs. Blog posts can use Article schema. The goal is not “more schema everywhere”. The goal is accurate schema that matches visible content and supports understanding.
JavaScript is not a ranking problem by default. The problems happen when key content and links are not present in the initial HTML.
Key terms:
- Server-side rendering (SSR): HTML is generated on the server, so content arrives in the initial response
- Client-side rendering (CSR): content is assembled in the browser by JavaScript after the initial HTML loads
- Dynamic rendering / pre-rendering: serving an already-rendered HTML snapshot to crawlers or at build time
- Hydration: JavaScript taking over server-rendered HTML to make it interactive
Practical meaning: critical content and internal links should be present in the initial HTML response. If they only appear after scripts run, you accept the risk of delayed or incomplete indexing.
Status codes tell crawlers what happened when they requested a URL.
Key terms you must understand:
- 200 (OK): the page loaded successfully
- 301 (permanent redirect): the page has moved for good, and signals should consolidate to the new URL
- 302 (temporary redirect): the page has moved temporarily
- 404 (not found): the page does not exist
- 410 (gone): the page has been removed deliberately
- 5xx (server errors): the server failed to respond; persistent 5xx responses can slow or pause crawling
Redirect concepts:
- Redirect chains: A → B → C instead of A → C, wasting crawl time and diluting signals
- Redirect loops: A → B → A, making pages unreachable
- Soft 404s: pages that return 200 but show "not found" content, confusing index selection
Example
If you ever restructure service URLs (for example, renaming a folder), a clean 301 redirect from the old URL to the new URL protects existing rankings and backlinks. That is migration hygiene, not optional.
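As a sketch, the redirect for a renamed folder might look like this on an Apache server (the old path is hypothetical; nginx and most CMSs have equivalents):

```text
# .htaccess: permanently redirect the old service URL to the new one
Redirect 301 /services/technical-seo/ https://studiohawk.co.uk/seo-services/technical-seo/
```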
HTTPS encrypts data between the browser and the server. It is also a trust signal and a basic standard for modern sites.
Key terms:
- TLS/SSL certificate: what enables the encrypted connection
- HSTS: a response header telling browsers to only ever use HTTPS for your domain
- Mixed content: an HTTPS page loading some resources over HTTP, which triggers browser warnings
Practical meaning: every HTTP URL should 301 to its HTTPS equivalent, exactly one version of the site (one protocol, one hostname) should be canonical, and no page should load mixed content.
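An illustrative Apache configuration covering the HTTPS redirect and HSTS, assuming mod_rewrite and mod_headers are enabled:

```text
# Force HTTPS with a permanent redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# HSTS: tell browsers to use HTTPS only, for one year
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```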
If you target multiple countries or languages, hreflang is a technical requirement.
Key terms:
- hreflang: an attribute declaring the language, and optionally region, a URL targets (for example, en-GB)
- Return tags: each version must reference all the others, and itself, or the annotations are ignored
- x-default: the fallback version for users who match no declared language or region
Incorrect hreflang can cause the wrong version to rank in the UK, or can suppress visibility due to conflicting signals.
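For example, a UK page with a hypothetical US counterpart would carry annotations like these in its <head>, and the US page must carry the identical set:

```html
<link rel="alternate" hreflang="en-gb" href="https://studiohawk.co.uk/seo-services/technical-seo/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/seo-services/technical-seo/" />
<link rel="alternate" hreflang="x-default" href="https://studiohawk.co.uk/seo-services/technical-seo/" />
```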
A technical audit is not “run a crawl and export errors”. A good audit connects issues to outcomes: rankings, crawling efficiency, conversion rate, and index quality.
A thorough audit typically includes:
- Crawl and indexation review: robots.txt, XML sitemaps, and Search Console coverage reports
- Indexability and canonical checks on priority pages
- Site architecture and internal-link analysis
- Performance and Core Web Vitals, using field data where available
- Duplication review: parameters, pagination and faceted navigation
- Structured data validation
- Status codes, redirects, HTTPS, and hreflang where relevant
Google Search Console is not optional for this. It is your direct view into Google’s indexing and UX reporting, including the Core Web Vitals report.
Many technical SEO disasters happen during routine changes. Templates get edited, plugins update, or a staging config leaks into production.
High-risk changes include:
- Site migrations, redesigns and URL structure changes
- CMS, theme or plugin updates that touch templates
- Changes to robots.txt, canonical tags or meta robots rules
- Staging environments going live with noindex or crawl blocks still in place
A basic safeguard process is:
- Before release, list the pages that must remain crawlable and indexable
- After release, spot-check those pages for noindex, canonical and robots.txt regressions
- Monitor Google Search Console coverage and performance reports for unexpected drops
Real example using StudioHawk UK
Your highest-intent pages are usually service pages and the contact page. If you were doing a redesign, those pages (for example, https://studiohawk.co.uk/seo-services/technical-seo/ and https://studiohawk.co.uk/contact/) should be in the “must test” list for indexability (no accidental noindex), canonical integrity, performance, and mobile UX.
Here are issues that repeatedly cause traffic drops:
- Accidental noindex tags or robots.txt disallows shipped in a release
- Broken or conflicting canonical tags
- Lost redirects and redirect chains after migrations
- Soft 404s and decaying internal links
- Core Web Vitals regressions after template or plugin changes
- Key content and links rendered only by JavaScript
The main takeaway is that technical SEO is not “one big fix”. It is continuous maintenance and intentional site management.
If you're unsure where to begin or want expert support to build a technical foundation that actually delivers results, speak to the team at StudioHawk. We'll help you keep your site crawlable, indexable, and optimised for long-term growth.
Contact our SEO experts today.