A 32-point technical SEO checklist covering crawlability, indexation, site speed, Core Web Vitals, schema markup, XML sitemaps, robots.txt, canonical tags, redirect chains, mobile-first indexing, HTTPS, and hreflang. This is the infrastructure layer that makes or breaks your rankings.
Last updated: March 2026 · Reading time: 14 min
32 infrastructure checks across 6 categories with tool recommendations.
A technical SEO checklist covers the infrastructure decisions that determine whether search engines can find, crawl, render, and index your pages correctly. Content quality doesn’t matter if Googlebot can’t reach your pages. On-page optimization doesn’t matter if your site takes 8 seconds to load. Technical SEO is the foundation that everything else sits on.
Technical SEO is the process of optimizing a website’s infrastructure so that search engines can efficiently crawl, render, index, and rank its pages. It covers server configuration, URL management, page speed, structured data, and site architecture.
This checklist gives you 32 infrastructure checks across six categories, each with a priority label (P1 or P2) and a specific way to verify it.
This checklist pairs with our broader 47-point SEO checklist, which also covers on-page, off-page, local, and AI visibility.
If Googlebot can’t reach a page, that page doesn’t exist in search results. Crawlability is the first thing to verify. Google’s Gary Illyes confirmed at PubCon 2023 that crawl budget is real and finite: large sites (10,000+ URLs) must be intentional about what they allow Google to crawl. Even smaller sites can block critical pages with a single misconfigured robots.txt line.
| # | Check | Priority | How to verify |
|---|---|---|---|
| 1 | robots.txt exists at /robots.txt and is accessible (200 status) | P1 | Visit yourdomain.com/robots.txt directly. If it returns 404 or 500, fix immediately. |
| 2 | robots.txt does not block important pages, CSS, or JS files | P1 | Use Search Console’s robots.txt report (the legacy robots.txt Tester has been retired). Common mistake: blocking /wp-admin/ also blocks /wp-admin/admin-ajax.php, which WordPress uses for AJAX. |
| 3 | Crawl errors in Search Console are under 0.5% of total URLs | P1 | Check the Page indexing report (formerly Coverage) in Search Console. Server errors (5xx) and 404s on important pages need immediate attention. |
| 4 | JavaScript-rendered content is visible to Googlebot | P2 | Use Search Console’s URL Inspection tool > “View Tested Page.” If your content appears blank, Google can’t see JS-rendered content. Consider server-side rendering. |
| 5 | Crawl budget not wasted on low-value pages (faceted navigation, internal search results, parameter URLs) | P2 | Check Search Console > Settings > Crawl Stats. If Google is crawling thousands of parameter URLs, use robots.txt or canonical tags to consolidate. |
We audit robots.txt on every new client site. In Q4 2025, 3 out of 10 client sites had rules that were blocking CSS or JavaScript resources. Google needs to render your CSS and JS to understand your page layout. Blocking these resources means Google sees a broken page, which hurts both ranking and Core Web Vitals scoring.
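The admin-ajax.php trap described above is easy to test locally before touching production. This sketch uses Python's standard-library `urllib.robotparser` with illustrative rules; note that Python's parser applies the first matching rule, while Google's own parser picks the most specific (longest) rule regardless of order, so the Allow line is placed first here to make both behave the same.

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules for check 2: block /wp-admin/ but keep the
# AJAX endpoint WordPress depends on crawlable.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# admin-ajax.php stays crawlable even though /wp-admin/ is blocked
print(parser.can_fetch("Googlebot", "/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("Googlebot", "/wp-admin/options.php"))     # False
```

In a real audit you would fetch the live file with `parser.set_url("https://yourdomain.com/robots.txt")` and `parser.read()`, then test the specific CSS and JS paths your templates load.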
Indexation is about control. You want Google to index your valuable pages and ignore the junk. The typical enterprise site has 30-40% index bloat: pages in Google’s index that shouldn’t be there (tag pages, paginated archives, parameter URLs). This dilutes your site’s overall quality signals and wastes crawl budget.
| # | Check | Priority | How to verify |
|---|---|---|---|
| 6 | Self-referencing canonical tags on every indexable page | P1 | Every page should have a <link rel="canonical" href="..."> pointing to its own URL. Screaming Frog > Canonicals tab to audit. |
| 7 | No conflicting signals (canonical says index, meta tag says noindex) | P1 | When canonical and noindex conflict, Google picks one. The result is unpredictable. Remove the conflict. |
| 8 | Duplicate content consolidated via canonical tags | P1 | HTTP vs HTTPS, www vs non-www, trailing slash vs non-trailing slash: pick one preferred version, 301 the rest to it where possible (see check 23), and make every variant’s canonical point to it. |
| 9 | Thin or duplicate pages set to noindex (tag pages, search results, archives) | P2 | Run a site:yourdomain.com search in Google. If you see pages that shouldn’t be ranking (or confusing users), noindex them. |
| 10 | Indexation count in Search Console matches expected page count (within 10%) | P2 | If Google has indexed 5,000 pages but you only have 500 worth indexing, you have an index bloat problem. |
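Checks 6 and 8 in markup. Both the preferred page and any duplicate variant carry the same canonical, pointing at the one preferred URL (the domain and paths here are placeholders):

```html
<!-- In the <head> of https://yourdomain.com/blog/post-name/ -->
<!-- Check 6: self-referencing canonical on the indexable page -->
<link rel="canonical" href="https://yourdomain.com/blog/post-name/">

<!-- Check 8: a duplicate variant such as
     https://yourdomain.com/blog/post-name/?utm_source=newsletter
     carries a canonical pointing to the preferred version, not itself -->
<link rel="canonical" href="https://yourdomain.com/blog/post-name/">
```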
“Canonical tag problems are the single most common technical SEO issue we find. Nearly 40% of the sites we audit have conflicting canonicals, missing canonicals, or canonicals pointing to the wrong URL. One client had 12,000 pages indexed when only 800 should have been. We cut the index down and their average position improved by 4.2 points in 6 weeks.”
Hardik Shah, Founder of ScaleGrowth.Digital
Core Web Vitals became a ranking factor in 2021, and the responsiveness bar changed in March 2024 when Google replaced FID with INP (Interaction to Next Paint). The three metrics that matter: LCP under 2.5 seconds, INP under 200 milliseconds, CLS under 0.1. According to Chrome UX Report data (2025), only 43% of websites pass all three thresholds on mobile.
Core Web Vitals are three specific page performance metrics that Google uses as ranking signals: Largest Contentful Paint (loading speed), Interaction to Next Paint (responsiveness), and Cumulative Layout Shift (visual stability).
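The good / needs-improvement / poor thresholds in the table below are simple enough to encode as a small pass/fail helper, which is handy when scoring exported CrUX data in bulk. A minimal sketch (the sample metric values are illustrative):

```python
# Google's documented Core Web Vitals thresholds:
# LCP in seconds, INP in milliseconds, CLS unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # good <= 2.5s, poor > 4s
    "INP": (200, 500),    # good <= 200ms, poor > 500ms
    "CLS": (0.1, 0.25),   # good <= 0.1, poor > 0.25
}

def rate(metric: str, value: float) -> str:
    """Classify one field-data value against its thresholds."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= poor else "poor"

def passes_cwv(lcp_s: float, inp_ms: float, cls: float) -> bool:
    """A page passes Core Web Vitals when all three metrics are good."""
    return all(rate(m, v) == "good"
               for m, v in [("LCP", lcp_s), ("INP", inp_ms), ("CLS", cls)])

print(passes_cwv(1.9, 150, 0.05))  # True
print(rate("LCP", 4.8))            # poor
```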
| # | Check | Priority | Target threshold |
|---|---|---|---|
| 11 | LCP (Largest Contentful Paint) under 2.5 seconds | P1 | Good: <2.5s. Needs improvement: 2.5-4s. Poor: >4s. Common fixes: optimize hero image, preload LCP resource, reduce server response time. |
| 12 | INP (Interaction to Next Paint) under 200ms | P1 | Good: <200ms. Needs improvement: 200-500ms. Poor: >500ms. Common fixes: reduce JavaScript execution, break up long tasks, defer non-critical JS. |
| 13 | CLS (Cumulative Layout Shift) under 0.1 | P1 | Good: <0.1. Needs improvement: 0.1-0.25. Poor: >0.25. Common fixes: set width/height on images and videos, avoid injecting content above existing content. |
| 14 | Server response time (TTFB) under 800ms | P2 | Time To First Byte should be under 800ms. If it’s over 1.5s, investigate server configuration, database queries, and caching. |
| 15 | Critical rendering path optimized: CSS inlined or preloaded, render-blocking JS deferred | P2 | Use defer or async on non-critical JavaScript. Inline above-the-fold CSS and load the rest asynchronously. |
| 16 | Resource hints implemented: preconnect, preload, prefetch for critical resources | P2 | Preconnect to third-party origins (Google Fonts, analytics). Preload the LCP image and critical fonts. |
The most impactful fix we’ve made across client sites is LCP optimization. In 8 out of 10 cases, the LCP element is a hero image. Switching from PNG/JPEG to WebP, adding width/height attributes, and using fetchpriority="high" typically drops LCP by 1-2 seconds. One client went from 4.8s LCP to 1.9s with these three changes alone.
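Checks 15-16 and the hero-image fixes above look like this in markup (the origins and filenames are placeholders):

```html
<head>
  <!-- Check 16: preconnect to third-party origins used early in the load -->
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

  <!-- Check 16: preload the LCP hero image so the browser fetches it first -->
  <link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">

  <!-- Check 15: defer non-critical JavaScript so it doesn't block rendering -->
  <script src="/js/analytics.js" defer></script>
</head>

<!-- Explicit width/height reserve layout space (prevents CLS);
     fetchpriority marks this image as the LCP element -->
<img src="/images/hero.webp" width="1200" height="600"
     fetchpriority="high" alt="Product dashboard screenshot">
```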
Site architecture determines how efficiently Googlebot discovers and prioritizes your pages. The goal: every important page should be reachable within 3 clicks from the homepage. Pages buried 5+ clicks deep get crawled less frequently and rank worse. Your XML sitemap serves as a backup discovery mechanism, but it doesn’t replace good internal linking.
| # | Check | Priority | How to verify |
|---|---|---|---|
| 17 | XML sitemap submitted to Google Search Console and Bing Webmaster Tools | P1 | Check Search Console > Sitemaps. Status should show “Success” with recent processing date. Sitemap should include only indexable, canonical URLs. |
| 18 | XML sitemap contains only 200-status, indexable pages (no 404s, no noindexed pages) | P1 | Crawl your sitemap URLs with Screaming Frog. Filter for non-200 status codes. Remove any that aren’t valid, indexable pages. |
| 19 | All important pages reachable within 3 clicks from homepage | P2 | Screaming Frog > Crawl Depth report. Pages at depth 4+ should be reviewed. Consider adding hub pages or navigation links to reduce depth. |
| 20 | URL structure is flat and descriptive (yourdomain.com/category/page-name/) | P2 | Avoid deep nesting like /blog/2026/03/15/post-name/. Flat URLs (/blog/post-name/) perform better and are easier to manage. |
| 21 | Breadcrumb navigation implemented with BreadcrumbList schema | P2 | Breadcrumbs help users and search engines understand site hierarchy. Add BreadcrumbList schema for rich results in SERPs. |
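Check 21's BreadcrumbList schema as JSON-LD, placed in the page's `<head>` (names and URLs are placeholders; per Google's guidelines, the final item may omit `item` because it is the current page):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://yourdomain.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://yourdomain.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Post Name" }
  ]
}
</script>
```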
A common architecture mistake: relying on your sitemap to do all the work. Google’s documentation says sitemaps are a “hint,” not a directive. If a page is in your sitemap but has zero internal links pointing to it, Google may still ignore it. Internal links are the primary discovery mechanism. The sitemap is the safety net.
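Check 18 can be partly automated: extract every URL from the sitemap, then request each one and flag anything that isn't a plain, indexable 200. A sketch of the extraction step using Python's standard library, with the sitemap content inlined for illustration (in practice you'd fetch /sitemap.xml):

```python
import xml.etree.ElementTree as ET

# Illustrative sitemap; replace with the body of https://yourdomain.com/sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
  <url><loc>https://yourdomain.com/blog/post-name/</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

print(sitemap_urls(SITEMAP_XML))
# Next step (not shown): request each URL and flag non-200 statuses,
# noindexed pages, and non-canonical variants for removal.
```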
Redirect chains lose link equity at every hop. Moz’s research (2022) estimated that each 301 redirect passes roughly 85-95% of PageRank, but chains of 3+ redirects can lose 30-40% of the original value. HTTPS, meanwhile, has been a confirmed ranking signal since 2014. Any site still serving HTTP pages is leaving both rankings and user trust on the table.
| # | Check | Priority | How to verify |
|---|---|---|---|
| 22 | No redirect chains (max 1 hop from origin to final URL) | P1 | Screaming Frog > Redirect Chains report. Every chain should be flattened to a single redirect from origin to final destination. |
| 23 | All HTTP URLs 301 redirect to HTTPS equivalents | P1 | Test by visiting http://yourdomain.com. It should immediately redirect to https://. Check all variants: http://www, http://non-www. |
| 24 | No mixed content warnings (all resources loaded over HTTPS) | P1 | Chrome DevTools > Security tab. Any resource loaded over HTTP on an HTTPS page triggers a mixed content warning and may be blocked by browsers. |
| 25 | Old URLs from redesigns/migrations redirect to correct new URLs (not all to homepage) | P2 | Lazy redirects to the homepage waste the link equity of the old URL. Map each old URL to its closest equivalent on the new site. |
| 26 | SSL certificate is valid, not expired, and covers all subdomains | P2 | Use SSL Labs’ free test (ssllabs.com/ssltest). Grade should be A or A+. Check certificate expiry date and set a renewal reminder. |
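Check 22's flattening logic can be sketched as a pure function over a URL-to-redirect-target map, which is exactly the data a crawler export gives you (the URLs here are hypothetical):

```python
def resolve_chain(start: str, redirects: dict[str, str],
                  max_hops: int = 10) -> tuple[str, int]:
    """Follow a redirect map from start to its final URL, counting hops."""
    url, hops = start, 0
    seen = {start}
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
        if url in seen:  # the map loops back on itself
            raise ValueError(f"Redirect loop at {url}")
        seen.add(url)
    return url, hops

# Hypothetical 2-hop chain: http -> https -> final slug
redirects = {
    "http://yourdomain.com/old-page": "https://yourdomain.com/old-page",
    "https://yourdomain.com/old-page": "https://yourdomain.com/new-page",
}
final, hops = resolve_chain("http://yourdomain.com/old-page", redirects)
print(final, hops)  # https://yourdomain.com/new-page 2

# The fix: point every chain entry directly at its final destination,
# so each source URL is exactly 1 hop from where it ends up.
flattened = {src: resolve_chain(src, redirects)[0] for src in redirects}
```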
During site migrations, redirect mapping is the most time-consuming and most important task. We’ve seen sites lose 40-60% of organic traffic after a migration because redirects were misconfigured. As part of migration planning, we create a 1:1 URL mapping spreadsheet before any changes go live.
Google switched to mobile-first indexing for all sites in 2023. This means Google uses the mobile version of your page for indexing and ranking. If your mobile page has less content, fewer internal links, or missing structured data compared to desktop, your rankings will suffer. For international sites, hreflang tags prevent the wrong language version from appearing in search results.
| # | Check | Priority | How to verify |
|---|---|---|---|
| 27 | Mobile and desktop versions have identical content (same text, images, structured data) | P1 | Compare mobile and desktop rendered pages using Search Console’s URL Inspection. Any content hidden on mobile (tabs, accordions) may not be indexed. |
| 28 | Responsive design with no horizontal scrolling on any screen size | P1 | Test on Chrome DevTools with multiple device presets. Common problem: fixed-width tables that overflow on mobile. |
| 29 | Tap targets are at least 48x48px with 8px spacing | P1 | Google’s mobile usability report flags small tap targets. Links too close together cause accidental clicks and frustrate users. |
| 30 | Font size minimum 16px for body text on mobile | P2 | Text smaller than 16px triggers mobile usability warnings and hurts readability. |
| 31 | Hreflang tags implemented correctly for multi-language sites | P2 | Each language version must reference all other versions AND itself. Hreflang is reciprocal: if page A points to page B, page B must point back to page A. |
| 32 | Hreflang x-default set for the primary/fallback language | P2 | The x-default tells Google which version to show when no other language matches the user’s locale. |
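Checks 31 and 32 together in markup. The same complete tag set (every language version, including the page itself, plus x-default) appears on every version; the URLs are placeholders:

```html
<!-- Identical block in the <head> of BOTH /en/page/ and /es/page/ -->
<link rel="alternate" hreflang="en" href="https://yourdomain.com/en/page/">
<link rel="alternate" hreflang="es" href="https://yourdomain.com/es/page/">
<!-- x-default: served when no language version matches the user's locale -->
<link rel="alternate" hreflang="x-default" href="https://yourdomain.com/en/page/">
```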
Hreflang is one of the most error-prone implementations in SEO. Ahrefs analyzed 9,815 websites using hreflang tags (2024) and found that 75% had at least one implementation error. The most common mistake: non-reciprocal tags. If your English page references the Spanish version, the Spanish page must also reference the English page. Missing this reciprocal link breaks the entire hreflang signal for both pages.
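The reciprocity rule is mechanical enough to check in code once you've crawled each page's declared alternates. A sketch over a page-to-alternates map (the URLs are hypothetical; here the Spanish page forgot to reference the English one):

```python
def non_reciprocal(hreflang: dict[str, set[str]]) -> list[tuple[str, str]]:
    """Return (page, target) pairs where page declares target as an
    hreflang alternate but target doesn't declare page back --
    the broken-signal case described above."""
    errors = []
    for page, alternates in hreflang.items():
        for target in alternates:
            if target != page and page not in hreflang.get(target, set()):
                errors.append((page, target))
    return errors

# Hypothetical crawl output: each page mapped to the alternates it declares.
pages = {
    "https://yourdomain.com/en/": {"https://yourdomain.com/en/",
                                   "https://yourdomain.com/es/"},
    "https://yourdomain.com/es/": {"https://yourdomain.com/es/"},
}
print(non_reciprocal(pages))
# [('https://yourdomain.com/en/', 'https://yourdomain.com/es/')]
```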
A technical SEO audit needs data before opinions. Start with a full site crawl, then work through this checklist with the crawl data open. Don’t guess. Every check in this list has a specific verification method.
Step 1: Crawl your site with Screaming Frog or Sitebulb. Set the crawler to respect robots.txt and follow redirects. For sites under 500 URLs, the free version of Screaming Frog works. Above that, you’ll need the paid license ($259/year). Export the full crawl data.
Step 2: Pull Core Web Vitals data from CrUX. Use PageSpeed Insights (pagespeed.web.dev) for page-level data or the CrUX dashboard in BigQuery for site-wide field data. Lab data (Lighthouse) shows potential issues. Field data (CrUX) shows what real users experience.
Step 3: Review Search Console. Check the Page indexing report (formerly Coverage) for indexation issues and the Sitemaps report for submission status; spot-check mobile rendering with the URL Inspection tool, since the standalone Mobile Usability report has been retired. Export the list of excluded URLs and categorize them.
Step 4: Work through this 32-point checklist. Start with P1 items across all six categories. Fix critical issues (broken redirects, canonical conflicts, robots.txt blocks) before optimizing speed or architecture.
Step 5: Document everything in the Google Sheets version. Record the current state, the fix needed, the responsible person, and the deadline. Technical SEO fixes usually require developer involvement, so clear documentation prevents miscommunication.
At ScaleGrowth.Digital, we complete technical audits within the first 48 hours of any new engagement. The crawl data and this checklist together take about 4-6 hours for a site with 500-5,000 pages.
Get the Google Sheets version with pass/fail scoring, priority labels, and tool recommendations.
Includes a summary dashboard with your technical SEO health score.
No email required. Instant access.
Technical SEO problems are silent. They don’t produce visible errors on the page. Your content looks fine. Your design looks fine. But behind the scenes, redirect chains are leaking authority, canonical conflicts are splitting your ranking signals across duplicate URLs, and Googlebot is spending half its crawl budget on pages you don’t even want indexed.
We’ve audited sites with excellent content that couldn’t break page 2 for their target keywords. The reason was always technical. One SaaS client had 47 redirect chains, some 5 hops deep. Flattening those chains to single redirects and fixing 200+ broken canonical tags moved their primary keyword from position 23 to position 8 in six weeks, without writing a single new piece of content.
The other pattern we see: teams running Lighthouse audits and celebrating a 90+ performance score while their field data (real user metrics) shows failing Core Web Vitals. Lighthouse tests in ideal conditions on a fast machine. CrUX data shows what your users actually experience on their phones over variable connections. Always check field data. Lab scores are useful for debugging, but field data is what Google uses for ranking.
Pair this technical checklist with these resources for complete SEO coverage.
The comprehensive SEO checklist covering technical, on-page, off-page, local SEO, content quality, and AI visibility.
A 27-point checklist for title tags, headers, content optimization, images, internal links, and schema markup.
A monthly reporting template that includes technical health tracking alongside traffic, rankings, and backlink metrics.
Canonical tag problems are the most common technical SEO issue. Nearly 40% of the sites we audit at ScaleGrowth.Digital have conflicting, missing, or incorrect canonical tags. This leads to duplicate content in Google’s index, diluted ranking signals, and wasted crawl budget. Run a Screaming Frog crawl and filter the Canonicals tab for mismatches.
Run a full technical audit quarterly and after any major site change: CMS migration, redesign, new subdomain, or URL structure change. Between full audits, monitor Search Console weekly for new crawl errors or indexation drops. Set up automated alerts in Search Console for coverage issues.
Essential tools: Google Search Console (free), Screaming Frog SEO Spider (free for up to 500 URLs, $259/year for full version), and PageSpeed Insights (free). For advanced audits: Sitebulb ($150+/year) for visual crawl analysis, Ahrefs or SEMrush for backlink and crawl data, and Chrome DevTools for debugging render and performance issues.
Yes. Core Web Vitals are a confirmed ranking factor. HTTPS is a confirmed ranking factor. Mobile-friendliness affects mobile search rankings. Crawlability and indexation directly determine whether pages can rank at all. While content quality and backlinks have the largest impact on rankings, technical issues can block those signals from reaching Google.
Lab data comes from tools like Lighthouse that test your page in a controlled environment (fixed network speed, fixed device specs). Field data comes from real users via the Chrome UX Report (CrUX) and reflects actual performance across different devices and connections. Google uses field data for ranking decisions. Lab data is useful for diagnosing issues, but field data is what counts.
Our SEO practice runs a full technical audit within 48 hours: crawl analysis, Core Web Vitals assessment, indexation review, and a prioritized fix plan.