Recommended Quarterly Technical SEO Tasks
Last Updated: May 12, 2021
The technical side of search engine optimization matters to all websites. Big or small; local, national, or international; B2B or B2C. No matter the type of business or vertical you are in, if you want organic traffic coming to your website, you need to make sure that Google’s bots can understand your website.
Our recommendation at Elementive is to check technical SEO factors on a regular basis through an SEO website audit. We typically recommend performing an SEO website audit at least once every quarter, though busier websites can benefit from monthly or weekly checks too. By regularly auditing your website’s performance, you’ll know that your website is in its best possible shape and that no new changes, like a server upgrade or WordPress update, derailed your search engine marketing efforts.
In this article, I will cover the top five technical factors that you need to check at least every quarter and the guiding questions you should use to help you think through and explore the website more deeply. Before we dig into the technical SEO tasks and insights you should complete each quarter, let’s start by discussing what a technical SEO audit should and should not be.
What a Technical SEO Audit Is and Is Not
When you are conducting your quarterly technical SEO audit, you want to make sure you are doing more than checking items off a list. Too often, I see audits that only report on metrics—how many 404s are there, how many pages with missing titles, how many pages with missing or short descriptions, what the key speed metrics are, and so on. Don’t get me wrong; these metrics are important. Every technical SEO audit should, of course, look at 404s, titles and descriptions, speed, etc.
The problem is that these formulaic audits aren’t all that beneficial because they stay on the surface instead of digging into the problems facing the website. The checklist itself becomes the focus—did I look at every item listed? Instead, you want a technical SEO audit that keeps the focus on the problems facing this particular website. Your audit process should allow you to stray from the checklist to examine the specific items and issues affecting the website, and it should allow you to skip items on that checklist altogether if they don’t apply or aren’t a high priority.
In short, an audit shouldn’t be seen as a “checklist” at all. Rather, an audit is an opportunity to explore the technical aspects of the website and determine the highest priority items holding the website back from performing better in search results. Of all the technical SEO audits we’ve done at Elementive, no two have looked the same or covered the same ground—even when we’ve done multiple audits for the same client.
As an alternative to the formulaic checklist, let’s review the five quarterly SEO tasks and three guiding questions that will help you complete an actionable audit that uncovers technical optimization opportunities.
Quarterly Technical SEO Tasks
Task #1: Find & Fix 404 Not-Found Errors
As you update content, change images, or remove old pages, you’ll inevitably create 404 or not-found errors. As well, Google will occasionally find bad links referencing pages on your website. While a few 404s won’t hurt your SEO performance, a lot can. So, the best practice is to routinely identify and address the errors.
There are a few ways to find 404 not-found errors, though one of the easiest is using Google Search Console. Under the Coverage report, select Excluded, and then scroll down to see if there are any not-found pages listed. If so, click “Not Found (404)” to see the full list of errors Googlebot has found related to your site.
PRO TIP: Review Soft 404s and fix those with every audit. Learn more about what soft 404s are.
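Beyond Search Console, you can also spot broken internal links from your own crawl data. Here is a minimal sketch in Python; the URL statuses and link pairs are illustrative placeholders for data you would export from a site crawler:

```python
# Minimal sketch: flag internal links that point at 404 pages.
# The crawl data below is illustrative; in practice you would export
# URL statuses and (source, target) link pairs from your crawler.

def find_broken_links(statuses, links):
    """Return (source, target) pairs where the target returned a 404."""
    return [(src, dst) for src, dst in links if statuses.get(dst) == 404]

statuses = {
    "/about/": 200,
    "/old-page/": 404,
    "/blog/post-1/": 200,
}
links = [
    ("/about/", "/blog/post-1/"),
    ("/blog/post-1/", "/old-page/"),  # broken link to fix or redirect
]

print(find_broken_links(statuses, links))
# [('/blog/post-1/', '/old-page/')]
```

Fixing the link at its source is usually better than relying on a redirect, since it removes the extra hop entirely.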
Task #2: Remove & Revise Low-Quality Content
You don’t want bad content to ruin good SEO. Content management systems produce a lot of automated pages that serve little to no value to users or search engine robots, including tag and category pages. Of course, it isn’t just automated systems that do this—perhaps you created a lot of placeholder pages that ultimately never performed how you expected. While auditing your website’s technical SEO factors, you want to seek out this type of low-quality content and prune it.
It isn’t just about removing the low-quality content. You’ll likely have older and outdated content on your website that has outlived its value and purpose. Removal might be the right option, but with a revision to the old material, you can make those pages valuable again. So, during the audit, don’t only focus on what low-quality content to remove, but what you can revise as well.
PRO TIP: As you review your content, also seek out any duplicate content—especially duplicate content that could appear spammy or manipulative.
Task #3: Address Errors in XML Sitemap
XML sitemaps are often neglected—once set up to run automatically, many webmasters and SEOs never think about them again. Yet, improving the quality of the XML sitemap can have noticeable impact on Google’s ability to crawl the website and, by extension, a noticeable impact on your website’s performance in Google search results.
To make XML sitemaps as useful as possible, the pages listed in the XML sitemap need to be valid pages that a search engine could potentially index. During each audit, check to make sure your XML sitemap does not list URLs to pages that:
- Redirect somewhere else
- Return an error message
- Are duplicates (or near duplicates) of other pages
- Contain low-quality or thin content
- Contain noindex or disallow commands
- Should otherwise not be crawled or indexed
PRO TIP: To help identify issues in your XML sitemap, go to the Sitemaps area in Google Search Console and check to see what, if any, errors Google has identified on the sitemap.
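The checks above can also be scripted against your own crawl of the sitemap URLs. This is a minimal sketch, assuming you have already gathered each URL's status code, canonical tag, and robots directives (the records below are made up for illustration):

```python
# Minimal sketch: flag sitemap URLs that should not be listed.
# Each record is illustrative; in practice, pull status codes, canonical
# tags, and robots directives from a crawl of your sitemap's URLs.

def audit_sitemap(entries):
    """Return (url, reason) pairs for sitemap entries worth removing."""
    problems = []
    for e in entries:
        if 300 <= e["status"] < 400:
            problems.append((e["url"], "redirects somewhere else"))
        elif e["status"] >= 400:
            problems.append((e["url"], "returns an error"))
        elif e.get("noindex"):
            problems.append((e["url"], "contains a noindex directive"))
        elif e.get("canonical") and e["canonical"] != e["url"]:
            problems.append((e["url"], "canonicalized to a different URL"))
    return problems

entries = [
    {"url": "/products/", "status": 200, "canonical": "/products/"},
    {"url": "/old/", "status": 301},
    {"url": "/draft/", "status": 200, "noindex": True},
]
print(audit_sitemap(entries))
# [('/old/', 'redirects somewhere else'), ('/draft/', 'contains a noindex directive')]
```

Anything this kind of check flags should either be removed from the sitemap or fixed on the page itself, depending on which signal is wrong.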
Task #4: Test & Optimize Website Speed
One of the best parts about conducting regular audits is that you can compare how your website’s performance is changing over time. A good example of this is speed. With regular audits, you can tell if your website’s load time has gotten faster, slower, or remained the same.
During each audit, test the speed of your website’s top pages, including the speed metrics embedded in Google’s Core Web Vitals. It can be helpful to record the load times each quarter (or however often you conduct the audit) so that you can compare to prior speed tests. When testing speed, remember that it isn’t all about you. You also want to test the speeds of key competitors’ websites to make sure you are faster than everybody else in your industry.
After you’ve run your speed tests, ask yourself “where do I have room to improve?” Using a tool like WebPageTest, you can see all your key speed metrics, including the Core Web Vitals metrics.
For total load time, it is helpful to review the waterfall view under details. The waterfall view lists all the elements that loaded on your website. The longer the line in the waterfall, the bigger that item’s impact on speed. Those long line items are where you need to focus to get your website loading faster.
For Core Web Vitals, you should review the scores in the summary table. Green means your page is good, yellow means your page needs improvement, and red means your page is poor.
If you work to improve the speed on a few pages each month or quarter, your website will steadily speed up.
PRO TIP: Test the speed of all pages on desktop and mobile devices. There is a big difference in load time between 3G, 4G, Cable, and DSL connections.
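If you log your quarterly measurements, the green/yellow/red ratings are easy to reproduce in a script. This sketch uses the thresholds Google published for the three Core Web Vitals metrics (LCP in seconds, FID in milliseconds, CLS unitless):

```python
# Minimal sketch: classify Core Web Vitals measurements using Google's
# published thresholds at the time of writing.

THRESHOLDS = {
    "LCP": (2.5, 4.0),   # good <= 2.5s,  poor > 4.0s
    "FID": (100, 300),   # good <= 100ms, poor > 300ms
    "CLS": (0.1, 0.25),  # good <= 0.1,   poor > 0.25
}

def rate(metric, value):
    """Return the rating for one metric: green, yellow, or red."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"               # green
    if value <= poor:
        return "needs improvement"  # yellow
    return "poor"                   # red

print(rate("LCP", 2.1))  # good
print(rate("CLS", 0.3))  # poor
```

Comparing these ratings quarter over quarter tells you quickly whether a page is trending toward green or slipping toward red.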
Task #5: Clean up Canonicals, Internal Links, and Redirects
It is important that your website sends consistent signals to Googlebot (and to your human visitors too). Part of that consistency comes from making sure that all the links Google sees when crawling through your website tell the same story. There are three main areas you want to make sure are consistent:
- Internal links – if the navigation link to your about page is /about-us/, but the actual about page URL is /about/, this sends mixed signals. It doesn’t matter if /about-us/ redirects to /about/ because you are still telling Google that there are two different URLs present, causing potential confusion. So, each quarter, check your navigation and other internal links to make sure all the links are accurately reflecting your website.
- Canonicals – a canonical tag tells Google what the official URL for a page is. If your canonical says the official URL is /my-product-gallery/ but all the internal links point to /our-products/, that mismatch can cause confusion that can hold back your SEO performance. Each quarter, double check that the canonical tags contain the correct URLs.
- Redirects – if you are routinely fixing 404s, chances are you’ll have a lot of redirects on your website. There is nothing wrong with having lots of redirects, but it is a problem if redirects are incorrect. Let’s say you have a redirect from /contact-us/ to /contact/, but recently you changed from /contact/ to /contact-our-support-team/. In that case, you’d want to update the original redirect to go from /contact-us/ to /contact-our-support-team/.
PRO TIP: Check external links, social profiles, and local profiles for inconsistencies in URLs. These external sources are just as (if not more) important in helping Google understand your website.
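The redirect cleanup described above—updating /contact-us/ to point straight at /contact-our-support-team/ instead of hopping through /contact/—can be sketched as a chain-flattening routine. This is a minimal, illustrative version; the redirect map stands in for whatever rules your server or CMS stores:

```python
# Minimal sketch: resolve redirect chains so every rule points directly
# at its final destination. The redirect map is illustrative.

def flatten_redirects(redirects, max_hops=10):
    """Return a map where each source points to its final target."""
    flattened = {}
    for src in redirects:
        seen, target = {src}, redirects[src]
        while target in redirects:
            if redirects[target] in seen or len(seen) > max_hops:
                break  # redirect loop or excessive chain; flag for review
            seen.add(target)
            target = redirects[target]
        flattened[src] = target
    return flattened

redirects = {
    "/contact-us/": "/contact/",
    "/contact/": "/contact-our-support-team/",
}
print(flatten_redirects(redirects))
# {'/contact-us/': '/contact-our-support-team/',
#  '/contact/': '/contact-our-support-team/'}
```

Running something like this against your redirect rules each quarter keeps every redirect down to a single hop, which is easier on bots and visitors alike.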
Taking Your Audit Further: 3 Questions to Keep In Mind When Completing a Quarterly Technical SEO Audit
Now that we’ve covered the main tasks, let’s talk through a few questions you should keep in mind while completing those tasks. Again, an audit shouldn’t be about blindly completing a checklist. Instead, an audit offers an opportunity to ask questions that help you explore your website. While these questions cover concepts similar to the tasks above, they are trying to get at the bigger themes behind those tasks.
Question #1: Is the website crawlable?
Most fundamentally, can bots access and crawl the website? When a bot moves through the website, can it discover all the pages contained on the website? Are bots seeing all the images? Are they using the navigation and internal links? Are you doing your best to make those images and text understandable and findable? Answering these questions requires looking at things like:
- The total number of pages Google has found on the website relative to the total volume of pages you actually have on your website, including understanding what pages or sections of the website bots may be missing.
- Determining if the XML sitemap is well structured, error-free, and includes the key pages. See task #3 above.
- Understanding the crawl patterns you see from Googlebot within your log files and identifying where and why problems may exist.
- Reviewing robot control statements to ensure this website is guiding bots correctly via noindex, nofollow, and disallow (and how Google is responding to those directives).
- Reviewing the HTML within each page to ensure text is readable by bots, links are followable, and images are understandable.
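When reviewing robot control statements, it can help to normalize the meta robots values you find across the site. A minimal sketch, assuming you have already extracted each page's robots meta content attribute:

```python
# Minimal sketch: parse a robots meta tag's content attribute into the
# set of directives a crawler would honor. Input values are illustrative.

def parse_robots(content):
    """Return the lowercase set of directives in a robots content string."""
    return {token.strip().lower() for token in content.split(",") if token.strip()}

directives = parse_robots("Noindex, nofollow")
print("indexable:", "noindex" not in directives)    # indexable: False
print("followable:", "nofollow" not in directives)  # followable: False
```

Tabulating these directives across every crawled page makes it easy to spot sections that are accidentally blocked from indexing or link following.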
Question #2: Is the website indexable?
Assuming a bot can crawl the website, you next want to make sure that your pages all look as high quality as possible from a technical point of view. You may have great content somewhere on the website, but are you bogging that content down with so many technical mishaps that bots are discounting that content? Answering this requires looking at things like:
- The number of errors that exist on the website, including not-found 404 errors (see task #1) and server or 5xx errors. Beyond knowing that errors exist, you want to know why these errors are occurring and how big of a problem they present (or don’t present, as the case may be).
- Are there lots of links (internal or external) pointing to redirects? Are there redirect loops or chains present? How is this affecting search performance or bot crawl activity—is it? See task #5 above.
- The website’s speed—including the speed on key pages. More than understanding if the website meets the Core Web Vitals, what is the overall experience of loading this website on different connection speeds? See task #4 above.
- How many pages on the website are junk content—duplicate content, thin content, doorway pages, etc.—and how many of those pages is Google finding?
- Does the website comply with mobile-friendliness guidelines? Beyond that, what is the overall user experience on mobile devices—and how can it be improved?
Question #3: Is the website ranking & how is it ranking?
Now that Google has crawled the website and indexed it, you need to review how the website is performing within the search results. Is there anything that can be done technically to help it perform better? Answering these questions requires looking at things like:
- Are title tags and descriptions being used by Google in search results? If not, why not? If Google isn’t using the meta description tags written by the website (as is often the case these days), what content is being pulled into the search result?
- Is schema being used to mark up the content? If so, is it being displayed in Google’s search results?
- What else is being surfaced in the search results and is all that information accurate and appropriate to include? That includes fraggles, images, videos, links in the knowledge panel, and more.
- Are nosnippet or max-snippet tags used on the website? If so, are they being respected by Google? Are these tags used appropriately or to the website’s detriment?
- How do competitors compare technically? Is there something that makes a competitor’s site better technically than the website you are auditing? It can be worth auditing what you can externally on a competitor’s website.
Technical SEO is complex, but it doesn’t have to be challenging. By routinely performing technical SEO website audits and improving the technical structure of your website, you’ll help avoid major complications and reduce your chances of being penalized. If you need help with improving, optimizing, or auditing your technical SEO performance, please contact me.
Learn More About Technical SEO
Remember that technical SEO is the ongoing process of optimizing a website’s code and server configuration to better communicate with search engine robots. The goal is to help the website earn higher rankings in organic search results and drive more visitors to the website from those results. Consider reading my book, Tech SEO Guide 2.0, to learn more about what technical SEO is and to address technical issues on your website.