Technical SEO is the least glamorous part of SEO. There’s no creative work, no interesting writing, no relationship building. It’s diagnosing problems that are invisible to the average site visitor but critical to how search engines crawl, understand, and rank your pages. I run technical audits on new client sites regularly, and the same issues come up again and again — even on sites that have had “SEO done” by previous agencies.
This checklist is what I actually run through. If your site passes everything here, the technical foundation is solid and you can focus your energy on content and links.
Crawlability and Indexation
- XML Sitemap: Exists, submitted to Google Search Console and Bing Webmaster Tools, only includes canonical URLs, doesn’t include noindexed pages
- Robots.txt: Not accidentally blocking important pages or resources (CSS, JS). Check by fetching and reading it directly at /robots.txt
- Crawl errors: Zero 404 errors on crawled pages. Zero 5xx server errors. Check in Google Search Console under Indexing > Pages (the report formerly called Coverage)
- Redirect chains: Every redirect should resolve in a single hop; no chains (A to B to C). Use 301s for permanent moves, not 302s. Verify with Screaming Frog or similar
- Canonical tags: Every page has a self-referencing canonical (or points to the preferred canonical for duplicates)
- Indexation audit: Run a site: query in Google (treat the count as a rough estimate) and compare it to your total page count. A significant discrepancy means something is blocking indexation; confirm the specifics in Search Console's Pages report
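The robots.txt check in particular is easy to script. Here's a rough sketch using only Python's standard library; the rules string and URLs are made-up examples, so substitute your own /robots.txt contents and the pages and assets you know must stay crawlable:

```python
from urllib.robotparser import RobotFileParser

# Made-up example rules; paste in the contents of your own /robots.txt.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# URLs Googlebot must be able to fetch (pages plus CSS/JS assets):
must_be_crawlable = [
    "https://example.com/",
    "https://example.com/blog/some-post/",
    "https://example.com/wp-content/themes/site/style.css",
]

blocked = [url for url in must_be_crawlable
           if not parser.can_fetch("Googlebot", url)]
print("Blocked important URLs:", blocked)  # should be an empty list
```

One caveat: Python's parser applies the first matching rule, while Google uses longest-match precedence, so list Allow exceptions before the broader Disallow (as above) and double-check tricky files in Search Console's own robots.txt tester.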
URL Structure
- Short, descriptive URLs — no query parameters in public-facing URLs where avoidable
- Lowercase letters only in URLs
- Hyphens not underscores as word separators
- No unnecessary subfolders (/blog/category/subcategory/post-name is too deep)
- HTTPS on all pages — no mixed content warnings
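These rules are mechanical enough to script. Here's a hypothetical validator with the checklist hard-coded; the three-level depth limit is my own assumption, so adjust it to your site's architecture:

```python
from urllib.parse import urlparse

def url_issues(url: str, max_depth: int = 3) -> list[str]:
    """Flag URL-structure problems from the checklist above."""
    issues = []
    parsed = urlparse(url)
    if parsed.scheme != "https":
        issues.append("not HTTPS")
    if parsed.query:
        issues.append("query parameters in public URL")
    path = parsed.path
    if path != path.lower():
        issues.append("uppercase characters in path")
    if "_" in path:
        issues.append("underscores instead of hyphens")
    segments = [s for s in path.split("/") if s]
    if len(segments) > max_depth:
        issues.append(f"too deep ({len(segments)} levels)")
    return issues

# A URL that breaks every rule at once:
print(url_issues("http://example.com/Blog/category/sub/My_Post?id=7"))
```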
On-Page Technical Elements
- Title tags: Unique on every page, under 60 characters, includes primary keyword near the front
- Meta descriptions: Unique on every page, 140-160 characters, compelling CTR hook
- H1 tags: Exactly one H1 per page, includes primary keyword
- Image alt text: Descriptive alt text on every image that conveys information. Decorative images get an empty alt=""
- Internal links: No orphaned pages (every page linked to from at least one other page). No broken internal links
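A crawler like Screaming Frog reports all of this, but the core checks are simple enough to script. Here's a rough sketch using the standard library's HTMLParser, run against a hard-coded example page; a real audit would feed it each fetched page in turn:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects title text, H1 count, and meta description from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self.meta_description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Example page; in practice this comes from your crawl.
page = """<html><head><title>Technical SEO Checklist | Example Agency</title>
<meta name="description" content="A 30-point technical SEO checklist."></head>
<body><h1>Technical SEO Checklist</h1></body></html>"""

audit = OnPageAudit()
audit.feed(page)
print(len(audit.title), audit.h1_count, audit.meta_description)
```

From there it's one comparison each: title length under 60, exactly one H1, meta description present and in the 140 to 160 character range.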
Core Web Vitals
Measure these with PageSpeed Insights (real-world data) and Google Search Console’s Core Web Vitals report, not just Lighthouse lab scores:
- LCP (Largest Contentful Paint): Under 2.5 seconds. Fix by: optimizing the largest image on the page, preloading critical resources, improving server response time
- INP (Interaction to Next Paint): Under 200ms. Fix by: reducing JavaScript execution time, deferring non-critical scripts, eliminating long tasks
- CLS (Cumulative Layout Shift): Under 0.1. Fix by: specifying image dimensions, avoiding dynamically injected content above existing content, reserving space for ads/embeds
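The three thresholds above are worth encoding once if you're bulk-checking pages, for example against field data exported from the CrUX API. The cutoffs below are Google's published "good" and "poor" boundaries; the helper itself is just a sketch:

```python
# Google's published field-data cutoffs: at or below the first number
# is "good", above the second is "poor", in between "needs improvement".
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "INP": (200, 500),     # milliseconds
    "CLS": (0.1, 0.25),    # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2300), rate("INP", 350), rate("CLS", 0.3))
```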
Mobile Usability
- Renders and functions cleanly on mobile (check with Lighthouse or Chrome DevTools device emulation; Google retired its standalone Mobile-Friendly Test tool in late 2023)
- No horizontal scrolling on mobile viewports
- Tap targets (buttons, links) at least 48×48 CSS pixels
- Text legible without zooming (minimum 16px body font)
- No interstitials that cover main content on mobile (unless compliant with Google’s exceptions)
Schema Markup
Schema is one area where I see more overclaiming of impact than almost anywhere in SEO. Let me be direct: basic schema markup does not directly boost rankings. What it can do is help Google understand your content, make your pages eligible for rich results in the search listings, and deliver indirect ranking benefits through improved click-through rates. Prioritize:
- LocalBusiness schema on homepage and contact page (for local businesses)
- Article schema on blog posts
- FAQPage schema on pages with Q&A content
- Product schema on product pages with price, availability, and review data
- BreadcrumbList on all non-homepage pages
Validate everything with Google’s Rich Results Test before considering it done.
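As a concrete example, here's a minimal Article JSON-LD block built with Python's json module. The property names (headline, datePublished, author) are real schema.org properties; the values are placeholders you'd fill from your CMS:

```python
import json

# Placeholder values; in practice these come from the post itself.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Checklist",
    "datePublished": "2024-01-15",
    "dateModified": "2024-03-01",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# JSON-LD is embedded in a script tag in the page head or body.
snippet = ('<script type="application/ld+json">'
           + json.dumps(article_schema, indent=2)
           + "</script>")
print(snippet)
```

Generating the markup programmatically keeps it in sync with the content; the usual failure mode is hand-written JSON-LD that drifts from what the page actually says.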
Security and Technical Infrastructure
- SSL certificate valid and not expiring within 30 days
- HTTP/2 or HTTP/3 enabled (your host should handle this)
- Security headers configured (Strict-Transport-Security, X-Content-Type-Options, X-Frame-Options, Content-Security-Policy)
- No sensitive information exposed in page source (API keys, internal paths)
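A hypothetical header audit is a few lines: compare a response's headers against the set you require. The headers dict is hard-coded here for illustration; in practice it would come from your HTTP client's response object, and I've included Strict-Transport-Security alongside the usual three since it belongs in any baseline:

```python
# Header names to require; case-insensitive per the HTTP spec.
REQUIRED = [
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Content-Security-Policy",
]

# Example response headers from an imaginary site:
headers = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
}

present = {name.lower() for name in headers}
missing = [name for name in REQUIRED if name.lower() not in present]
print("Missing security headers:", missing)
```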
International and Multi-Location Sites
If applicable: hreflang tags implemented correctly for multilingual content. Separate URLs for each language/region (not just translated text on the same URL). Geographic targeting set in Google Search Console.
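The most common hreflang mistake is an incomplete tag set: every language version must carry the full list of alternates, including a tag for itself and an x-default fallback. A sketch of generating that set from a language-to-URL map (the helper name, language codes, and URLs are all illustrative):

```python
def hreflang_tags(alternates: dict[str, str], x_default: str) -> str:
    """Build the complete hreflang link-tag set for one page."""
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in alternates.items()]
    # x-default tells Google which URL to show when no language matches.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{x_default}" />'
    )
    return "\n".join(tags)

tag_block = hreflang_tags(
    {"en-us": "https://example.com/", "de-de": "https://example.com/de/"},
    x_default="https://example.com/",
)
print(tag_block)
```

The same identical block goes into the head of both the English and the German page; asymmetric tag sets are ignored by Google.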
For help running a full technical audit, contact me here. My SEO services include comprehensive technical audits as part of every engagement. More technical SEO detail on the blog.
Frequently Asked Questions
What is technical SEO and why does it matter?
Technical SEO refers to the optimization of your website’s infrastructure so search engines can efficiently crawl, understand, and index your content. It encompasses crawlability, site speed, mobile usability, structured data, URL structure, and security. Without solid technical foundations, even excellent content and strong backlinks can fail to rank — if Google can’t properly crawl your site or if pages are flagging technical errors, content quality becomes irrelevant.
How do I run a technical SEO audit?
Start with Google Search Console (check the Page indexing and Core Web Vitals reports). Then run a site crawl using Screaming Frog or Sitebulb to identify crawl errors, duplicate titles, missing meta descriptions, broken links, and redirect chains. Run PageSpeed Insights on your top pages. Check your robots.txt and sitemap manually. Validate your schema markup with Google's Rich Results Test. Compile all issues into a prioritized list and address the highest-impact items first.
What are Core Web Vitals and do they affect rankings?
Core Web Vitals are Google’s metrics for measuring user experience: LCP (loading speed of the largest content element), INP (responsiveness to user interactions), and CLS (visual stability). Google confirmed they’re a ranking signal as part of their Page Experience update. They function as a tiebreaker — when two pages are otherwise equally relevant, the one with better Core Web Vitals scores will typically rank higher. In competitive niches, failing Core Web Vitals is a meaningful disadvantage.
How do I fix a slow website for SEO?
Start by identifying the specific issue: run PageSpeed Insights and look at the opportunities section. Common fixes include: compressing images to WebP format (often the single biggest win), eliminating render-blocking JavaScript and CSS, implementing browser caching, upgrading to faster hosting or a CDN, reducing WordPress plugins, and preloading critical fonts and images. Each of these can produce measurable LCP improvements. Address one at a time and measure impact with PageSpeed Insights after each change.
What is a robots.txt file and what should it include?
Robots.txt is a text file at your domain root (/robots.txt) that tells search engine crawlers which pages or sections of your site they can or cannot crawl. It should not block important CSS, JavaScript, or image files that Google needs to render and understand your pages. It should block admin areas, staging content, and duplicate parameter URLs if present. Common mistake: WordPress sites that accidentally block all crawlers if the “Discourage search engines” setting is enabled during development and never disabled.
What is duplicate content and how do I fix it?
Duplicate content refers to identical or substantially similar content appearing at multiple URLs. Common causes on WordPress: www vs. non-www versions, HTTP vs. HTTPS versions, trailing slash vs. no trailing slash, pagination, category/tag archive pages, and printer-friendly versions. Fix by: implementing consistent canonicalization, using 301 redirects for URL variants, and setting self-referencing canonical tags on all pages. Serious duplicate content issues can confuse Google about which page to rank and dilute link equity.
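The variant-collapsing logic can be sketched in a few lines. The policy choices here (force HTTPS, strip www, keep a trailing slash) are illustrative, not a recommendation; what matters is picking one convention and applying it consistently in redirects and canonical tags alike:

```python
from urllib.parse import urlparse, urlunparse

def canonicalize(url: str) -> str:
    """Map common URL variants onto one canonical form (example policy)."""
    parts = urlparse(url)
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path if parts.path.endswith("/") else parts.path + "/"
    # Drop query string and fragment for the canonical form.
    return urlunparse(("https", host, path, "", "", ""))

# Three variants that should all resolve to the same canonical URL:
variants = [
    "http://www.example.com/blog/post",
    "https://example.com/blog/post/",
    "HTTP://WWW.EXAMPLE.COM/blog/post",
]
print({canonicalize(u) for u in variants})  # a single-element set
```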