What Should a Tech SEO Audit Look At?

October 02, 2020

When you are conducting your technical SEO audit, you want to make sure you are doing more than checking items off a list. Too often, I see audits that only report on metrics—how many 404s are there, how many pages with missing titles, how many pages with missing or short descriptions, what the key speed metrics are, and so on. Don’t get me wrong; these metrics are important. Every tech SEO audit should, of course, look at 404s, titles and descriptions, speed, etc.

These formulaic audits aren’t all that beneficial because they stay on the surface instead of digging into the problems facing the website. The checklist itself becomes the focus—did I look at every item listed? Instead, you want a tech SEO audit that keeps the focus on the website rather than on a checklist. Your audit process should allow you to stray from that checklist to look at the specific items and issues affecting the website. It should also allow you to skip items on that checklist altogether if they don’t apply or aren’t a high priority.

In short, an audit shouldn’t be seen as a “checklist” at all. Rather, an audit is an opportunity to explore the technical aspects of the website and determine which items are the highest priorities holding the website back from performing better in search results. Of all the technical SEO audits we’ve conducted at Elementive, no two have looked the same or covered the same ground—even when we’ve done multiple audits for the same client.

Instead of a formulaic checklist, you need guiding questions to help you think through and explore the website more deeply. Let’s talk through what those guiding questions are and how they can help you structure your audit. This is by no means comprehensive but can give you an idea of where to get started when conducting your audit.

Question #1: Is the website crawlable?

Most fundamentally, can a bot access and crawl the website? When a bot moves through the website, can it discover all the pages the website contains? Are bots seeing all the images? Are they using the navigation and internal links? Are you doing your best to make those images and text understandable and findable? Answering these questions requires looking at things like:

  1. The total number of pages Google has found on the website relative to the total number of pages that exist to be found, including understanding which pages or sections of the website bots may be missing.
  2. Determining if the XML sitemap is well structured, error-free, and includes the key pages. As part of this, you also want to understand if the XML sitemap is being used by bots.
  3. Understanding the crawl patterns you see from Googlebot within your log files and identifying where and why problems may exist.
  4. Reviewing robot control statements to ensure the website is guiding bots correctly via noindex, nofollow, and disallow (and how Google is responding to those directives).
  5. Reviewing the HTML within each page to ensure text is readable by bots, links are followable, and images are understandable.
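For item 3 above, a simple script can summarize Googlebot activity from your access logs. Here is a minimal sketch in Python, assuming logs in the common combined format; the sample lines and paths are hypothetical stand-ins for your real log files.

```python
import re
from collections import Counter

# Hypothetical sample of combined-format access log lines; in a real audit
# you would read these from your server's log files.
LOG_LINES = [
    '66.249.66.1 - - [02/Oct/2020:10:00:01 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [02/Oct/2020:10:00:05 +0000] "GET /old-page HTTP/1.1" 404 312 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [02/Oct/2020:10:00:09 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

# Pull the request path and HTTP status out of each log line.
LOG_PATTERN = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]+" (?P<status>\d{3})')

def googlebot_hits(lines):
    """Count Googlebot requests per (path, status) pair."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # keep only lines claiming to be Googlebot
        match = LOG_PATTERN.search(line)
        if match:
            hits[(match.group("path"), match.group("status"))] += 1
    return hits

for (path, status), count in sorted(googlebot_hits(LOG_LINES).items()):
    print(f"{status} {path}: {count}")
```

Note that the user-agent string can be spoofed, so for a real audit you would also verify Googlebot’s IP addresses before trusting these counts.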

Question #2: Is the website indexable?

Assuming a bot can crawl the website, you next want to make sure that your pages all look as high quality as possible from a technical point of view. You may have great content somewhere on the website, but are you bogging that content down with so many technical mishaps that bots are discounting that content? Answering this requires looking at things like:

  1. The number of errors that exist on the website, including not-found 404 errors and server or 5xx errors. Beyond knowing that errors exist, you want to know why these errors are occurring and how big of a problem they present (or don’t present, as the case may be).
  2. Are there lots of links (internal or external) pointing to redirects? Are there redirect loops or chains present? How, if at all, is this affecting search performance or bot crawl activity?
  3. The website’s speed—including the speed on key pages. More than understanding if the website meets the core web vitals, what is the overall experience of loading this website on different connection speeds?
  4. How many pages on the website are junk content—duplicate content, thin content, doorway pages, etc.—and how many of those pages is Google finding?
  5. Does the website comply with mobile-friendliness guidelines? Beyond that, what is the overall user experience on mobile devices, and how can it be improved?
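The redirect chains and loops mentioned in item 2 can be spotted programmatically once you have a map of each redirecting URL and its target (for example, from a crawl export). This is a minimal sketch with a made-up redirect map; the URLs are hypothetical.

```python
# Hypothetical redirect map (source URL -> redirect target); a real audit
# would build this from crawler output or server configuration.
REDIRECTS = {
    "/old-home": "/home",
    "/home": "/",            # chain: /old-home -> /home -> /
    "/promo-a": "/promo-b",
    "/promo-b": "/promo-a",  # loop: /promo-a <-> /promo-b
}

def trace_redirect(url, redirects, max_hops=10):
    """Follow a URL through the redirect map, flagging chains and loops."""
    seen = [url]
    while url in redirects and len(seen) <= max_hops:
        url = redirects[url]
        if url in seen:
            return seen + [url], "loop"
        seen.append(url)
    # More than one hop from start to final destination is a chain.
    status = "chain" if len(seen) > 2 else "ok"
    return seen, status

for start in REDIRECTS:
    path, status = trace_redirect(start, REDIRECTS)
    print(f"{status}: {' -> '.join(path)}")
```

Any URL flagged as a chain is a candidate for collapsing into a single redirect, and loops should be removed entirely.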

Question #3: Is the website ranking?

Now that Google has crawled the website and indexed it, you need to review how the website is performing within the search results. Is there anything that can be done technically to help it perform better? Answering these questions requires looking at things like:

  1. Are title tags and descriptions being used by Google in search results? If not, why not? If Google isn’t using the meta description tags written by the website (as is often the case these days), what content is being pulled into the search result?
  2. Is schema being used to mark up the content? If so, is it being pulled into Google’s search results?
  3. What else is being surfaced in the search results and is all that information accurate and appropriate to include? That includes fraggles, images, videos, links in the knowledge panel, and more.
  4. Are nosnippet or max-snippet tags used on the website? If so, are they being respected by Google? Are these tags used appropriately or to the website’s detriment?
  5. How do competitors compare technically? Is there something that makes a competitor’s site better technically than the website you are auditing?
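For item 4, the first step is simply finding which pages declare snippet-limiting directives. Here is a minimal sketch using Python’s standard-library HTML parser to pull nosnippet and max-snippet values out of robots meta tags; the sample page source is hypothetical.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in attrs.get("content", "").split(","))

def snippet_directives(html):
    """Return only the directives that limit how Google builds snippets."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return [d for d in parser.directives
            if d == "nosnippet" or d.startswith("max-snippet")]

# Hypothetical page source for illustration.
PAGE = '<html><head><meta name="robots" content="index, max-snippet:50"></head></html>'
print(snippet_directives(PAGE))  # -> ['max-snippet:50']
```

Run across a full crawl, this kind of check shows whether snippet restrictions were applied deliberately or are accidentally suppressing descriptions across the site.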

Final Thoughts

There are a lot of questions to explore, and the questions listed here are only the beginning. My main point is that a tech SEO audit shouldn’t be a simple checklist that you work through. None of the questions listed in this post can be quickly reviewed; instead, they require deeper research and investigation. That is what a tech SEO audit should be: an exploration of the website and of how it is performing in search results.

If you need help auditing a website’s technical SEO performance, check out Tech SEO Guide—a practical guide walking through the fundamental aspects of technical SEO. You can also check out my Technical SEO Fundamentals course. Or, if you’d like to hire Elementive to audit your website’s technical SEO performance, please contact me directly.
