
Top 5 Questions to Ask to Avoid Destroying SEO in a Website Redesign

By Matthew Edgar · Last Updated: May 28, 2024

At some point, every website will need to be redeveloped. A redevelopment presents new opportunities to refresh the design, update features on the website, update the backend systems, revise the website structure, and more.

While there are plenty of benefits to redesigning a website, a redesign introduces major changes, and those changes carry the risk that something will not work. When that happens, the redesigned website will not perform as well as the current website, and the website’s SEO performance can suffer badly. For example, in the case shown below, the company lost about half of its organic traffic following the rollout of the redesign.

Graph showing traffic drop after a redesign

There are a lot of questions you need to carefully consider before starting a website redesign or redevelopment project. In this post, I’ll review the five most important questions I’d recommend asking your development team to ensure a successful redesign project.

1. Will URLs change?

Search engines use URLs as the identifier for a page. As a result, every ranking signal associated with a page is connected via that identifier—the URL. If that URL changes during the redevelopment, Google has to update the identifier in its database and associate each ranking signal with the new URL.

Ideally, no URLs will change during the redevelopment project. While we typically think of URL changes as bigger ones (/shop-products becoming /buy-items), subtle changes can cause just as many problems. One common cause of traffic drops following a redesign is new URLs that drop the trailing slash; for example, /buy/ changes to /buy. It doesn’t seem like that type of change should cause problems, but to Google, a URL with a trailing slash is different from a URL without one. The same applies to removing (or adding) www in the URL’s domain (read more about selecting a website’s canonical domain).

To be clear: any URL format can work and rank well. URLs can have www or not. URLs can end in a trailing slash or not. The problem is that when URLs change, you force search engine robots to process that change, and while that processing happens, your website can lose traffic.
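To see why these subtle variations matter, it helps to compare URLs the way a search engine does: as exact identifiers. The short sketch below (using hypothetical example.com URLs) shows that a trailing slash or a www prefix is enough to produce a different identifier.

```python
from urllib.parse import urlparse

def same_url_identity(url_a: str, url_b: str) -> bool:
    """Compare two URLs as exact identifiers: scheme, host, path
    (including any trailing slash), and query string must all match."""
    a, b = urlparse(url_a), urlparse(url_b)
    return (a.scheme, a.netloc, a.path, a.query) == (b.scheme, b.netloc, b.path, b.query)

# Subtle differences produce distinct identifiers:
print(same_url_identity("https://example.com/buy/", "https://example.com/buy"))     # False
print(same_url_identity("https://www.example.com/buy", "https://example.com/buy"))  # False
print(same_url_identity("https://example.com/buy", "https://example.com/buy"))      # True
```

Each `False` above represents a URL pair that Google treats as two separate pages, with ranking signals split between them until the change is processed.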

Your goal, then, should be to avoid URL changes as much as possible. Keeping URLs the same can sometimes require extra development work but that extra work is often worth the time and money to avoid losing traffic.

If URLs must change, the URLs should be redirected. The redirects should return a 301 response status code. Also, the redirects need to be implemented when the new website launches. A delay in adding the redirects would result in Google thinking the old URLs were broken, which could result in a traffic drop. By adding the redirects at launch, you will communicate the URL change immediately to Google and Google can begin processing the redirect.
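One practical way to make sure the redirects go live at launch is to prepare an old-to-new URL mapping ahead of time and have the server consult it on every request. The sketch below is a minimal illustration of that idea; the paths and the mapping are hypothetical, and a real site would implement this in the web server or CMS rather than application code.

```python
# Hypothetical old-to-new URL mapping, prepared before launch so the
# 301 redirects go live at the same moment as the new site.
REDIRECT_MAP = {
    "/shop-products": "/buy-items",
    "/old-about": "/about",
}

def resolve(path: str):
    """Return the (status code, location) the server should send for a request path."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]  # permanent redirect to the new URL
    return 200, path                    # URL unchanged; serve the page normally

print(resolve("/shop-products"))  # (301, '/buy-items')
print(resolve("/contact"))        # (200, '/contact')
```

The key point is the 301 status code: it tells Google the move is permanent, so ranking signals should transfer to the new URL.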

Keep in mind that even when you add redirects, you will typically see a drop in traffic. In my experience, I’ve seen this drop in traffic range from 5-20%. Most of the time, the dip is short-lived, and traffic comes back to normal levels relatively quickly. That isn’t always the case. I have seen plenty of sites that added redirects but never recovered the lost traffic due to URL changes.

Bottom line: Ask your developers if any URLs will need to change during the project and what steps can be taken to limit those changes. Any URL change, even with a redirect, can cause traffic drops.

2. Will page rendering change?

Rendering is the process of constructing the display of the webpage. To correctly understand the page’s content, Google’s robots must successfully render the page when crawling the website.

There are two types of rendering on websites: client-side rendering (CSR) and server-side rendering (SSR). With server-side rendering, the website’s code is fully processed by the server, and the browser simply outputs what it receives. With client-side rendering, some of the code is processed by the server, but the rest relies on the browser executing JavaScript.
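A quick way to feel the difference is to look at the raw HTML the server sends before any JavaScript runs. The sketch below uses two hypothetical pages: in the server-rendered page the content is already in the HTML, while in the client-rendered page the HTML is an empty shell that JavaScript fills in later.

```python
def content_in_server_html(raw_html: str, key_phrase: str) -> bool:
    """True if the phrase is present in the HTML the server sends,
    i.e. visible without executing any JavaScript."""
    return key_phrase in raw_html

# Hypothetical server-rendered page: content arrives in the initial HTML
ssr_page = "<html><body><h1>Blue Widgets</h1><p>Our best widget.</p></body></html>"
# Hypothetical client-rendered page: an empty shell filled in by app.js
csr_page = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'

print(content_in_server_html(ssr_page, "Blue Widgets"))  # True
print(content_in_server_html(csr_page, "Blue Widgets"))  # False (content arrives via JS)
```

If the main content only appears after JavaScript runs, Google has to execute that JavaScript successfully before it can see the content at all.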

Google has very few problems with server-side rendering. It is relatively simple for Google’s bots to receive the code from the server. However, Google can have problems executing the JavaScript code involved in client-side rendering. Despite a lot of progress in executing JavaScript, Google sometimes encounters errors that prevent execution. For example, there could be a bug in the code that prevents successful execution. The JavaScript may load too slowly, or it may rely on events Google does not support. In other cases, the JavaScript files are accidentally blocked by the robots.txt file, which prevents Google from loading them during the crawl. Whatever the reason, any problem with JavaScript execution can prevent Google from seeing the website’s content.
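The robots.txt problem in particular is easy to check. Python’s standard library ships a robots.txt parser, so you can test whether a script URL is blocked before launch. The robots.txt contents and URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks the JavaScript bundle
robots_txt = """\
User-agent: *
Disallow: /assets/js/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot cannot fetch the script it needs to render the page:
print(parser.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/buy"))               # True
```

A `False` on a JavaScript file your pages depend on means Google may crawl the page but never render its content.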

The best practice is to make it as easy as possible for Google (and browsers) to render the webpage’s content. In particular, you want to make sure Google (and browsers) can easily render the main part of the webpage’s content. Learn more about rendering and how Google works with JavaScript.

Unfortunately, I’ve seen many websites lose traffic following a redesign because the new website complicated the rendering process. The redesign or redevelopment may have introduced new features and functionality that rely more heavily on JavaScript than the previous version of the website. In some cases, there is a period of adjustment as Google’s bots figure out how to work with the new website. In other cases, there are permanent problems with the JavaScript code that prevent Google from successfully crawling the new website.

On the other hand, I’ve seen positive gains in traffic when a redesign or redevelopment project simplifies how the website is rendered. When clients move away from client-side rendering toward server-side rendering, there are often increases in traffic because Google can more reliably crawl the website and fetch the content.

Bottom line: You need to ask your developer how the newly redeveloped website will render and if the rendering on the new website will be more complicated than it is on the current version of your website.

3. What will change about the site structure?

The site’s overall structure is communicated to Google’s bots in several ways, including the site’s navigation and internal links. Inevitably, the structure of the site will change during a redesign project. For example, the site’s navigation may be simplified and include fewer pages. Also, internal links may change as various features are added or removed. For example, a related posts widget might be taken off pages in the updated design.

At the simplest level, Google uses internal links to discover the pages contained on the website. Pages with few internal links will be harder to find, and pages with none (orphan pages) may not be found at all. If Google’s bots cannot find a page, then that page may not get crawled as often. Pages that are not crawled as often can fall out of the index or end up ranking lower.
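You can detect orphan pages before launch by crawling the new site’s internal-link graph the way a bot would: start at the homepage, follow every link, and see which known pages are never reached. The sketch below uses a hypothetical four-page site where the redesign dropped the only link to /old-guide.

```python
from collections import deque

def find_orphans(pages, links, start="/"):
    """Breadth-first crawl of the internal-link graph from the homepage,
    returning known pages that are never reached by any link."""
    reachable, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in reachable:
                reachable.add(target)
                queue.append(target)
    return sorted(set(pages) - reachable)

# Hypothetical site: the redesign dropped the link to /old-guide
pages = ["/", "/buy", "/blog", "/old-guide"]
links = {"/": ["/buy", "/blog"], "/blog": ["/buy"]}
print(find_orphans(pages, links))  # ['/old-guide']
```

Running a comparison like this against both the current and the redeveloped site makes it easy to spot pages that lost all their internal links in the redesign.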

More importantly, any changes to the site’s structure change Google’s understanding of the website. Google uses internal links, including the website’s main navigation, to assess the relative importance of the website’s pages. Pages with more internal links are likely to be more important. I’ve seen many pages lose traffic when internal links to those pages are removed or reduced.

Internal links also communicate the relationship between pages. Throughout this blog post, I’ve included several internal links to other posts on my website that discuss related concepts. Hopefully, this helps you find those pages and learn more about related subjects. Those internal links also help Google understand that this post is connected to other posts on my website. If internal links are removed or reduced during the redesign project, that breaks the relationship between pages.

You also want to make sure internal links are not added to irrelevant pages or less important pages on your website. Following a website redesign or redevelopment, I’ve seen websites that lost traffic on important pages but gained traffic on unimportant pages. Those unimportant pages ended up with more internal links either due to a bug in the code that surfaced those pages more or because the developers misunderstood what pages were important.

Bottom line: Understand your current site structure before the redevelopment and make sure the redeveloped website communicates similar information about that structure. Ask your developer what changes will be made and make sure you understand the consequences of those changes.

4. How will schema change?

Schema, or structured data markup, provides more information to search engines about the nature of the content on a page. For example, schema code can be used on an ecommerce site’s product reviews to help search engines easily detect the overall rating of that product.

Search engines can use schema to enhance search results. For example, product review schema adds a star rating to the search result listing. Other schema may add author names or recipe details to the listing. Because schema enhances search result listings, it can improve the click-through rate from search results, resulting in more traffic.
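Schema is most commonly delivered as JSON-LD embedded in the page. As a minimal illustration, the snippet below builds the kind of Product markup that powers star ratings in search results; the product name and rating values are made up for the example.

```python
import json

# A minimal Product review snippet in JSON-LD (values are illustrative)
schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "87",
    },
}

# Emit it as it would appear inside a <script type="application/ld+json"> tag
print(json.dumps(schema, indent=2))
```

If a block like this is dropped during the redesign, the page can keep ranking while silently losing its star rating, which is exactly the subtle traffic decline described below.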

During a redevelopment, the code for every page will be significantly reworked, including the schema code associated with each page. I’ve seen a lot of websites accidentally remove the schema code contained on some pages. When the new website goes live, Google no longer sees the schema code, so it removes the search result enhancements. That results in a traffic decline. Usually, this decline is more subtle because the page still ranks—just without the enhancement. This can be easily recovered when the schema is re-added to the page.

Occasionally, though, I’ve seen more sizable declines in traffic when schema code breaks on the new version of the website. Those breaks in the schema code cause errors. These are typically associated with other problems on the page (see my previous comments about rendering issues). Sometimes, though, the errors prevent Google from including the page in special types of search results, causing the page to fall out of rankings completely. In rare cases, errors with schema can cause manual actions, taking larger sections of the site out of search results.

Bottom line: Before the redevelopment, be mindful of what schema code is in use and currently helping your organic search performance. Ask your developer what schema code will be used on the new website and how they will migrate schema to the new website.

5. How does speed compare and how will speed be measured?

A website redevelopment involves changing almost every aspect of how the website is built. The content management system powering the backend of the website might change. The HTML, CSS and JavaScript code creating the frontend of the website will also change. With all these changes, the speed of the website will also change.

For SEO, you want to know how these changes affect Google’s Core Web Vitals metrics. These metrics can have an impact on rankings, especially in competitive search results. These metrics also reflect key aspects of how users experience the page. For example, one of the Core Web Vitals metrics is Cumulative Layout Shift (CLS) and it measures visual stability. If the redeveloped website is less visually stable, with elements moving around the page, that could impact SEO but it will also make it harder for people to use your new website.
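Google publishes “good” and “needs improvement” thresholds for each Core Web Vitals metric (at the time of writing: LCP ≤ 2.5 s, CLS ≤ 0.1, INP ≤ 200 ms for “good”). A simple way to compare the current and redeveloped builds is to rate each measurement against those thresholds, as this sketch does with made-up measurements:

```python
# Google's published "good" / "needs improvement" thresholds for Core Web Vitals
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # Largest Contentful Paint, seconds
    "CLS": (0.1, 0.25),  # Cumulative Layout Shift, unitless
    "INP": (200, 500),   # Interaction to Next Paint, milliseconds
}

def rate(metric: str, value: float) -> str:
    """Classify a measurement as good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= poor else "poor"

# Hypothetical measurements: current site vs. redesigned build
print(rate("CLS", 0.05))  # good
print(rate("CLS", 0.31))  # poor (new design shifts elements around)
print(rate("LCP", 3.2))   # needs improvement
```

A comparison table built this way, page by page, makes regressions introduced by the redesign visible before launch rather than after.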

Along with Core Web Vitals, you need to also understand how the updated website will impact other speed metrics. Core Web Vitals describes only one aspect of website speed. You also want to consider how the redeveloped website will affect all other aspects of the website’s speed—from the initial connection, to how quickly elements are painted, to how quickly the website is ready for interaction, and more. My book Speed Metrics Guide goes through all these metrics in detail, discussing what each metric represents and how to measure each metric.

Each of these metrics can impact SEO in indirect ways and will also impact user engagement (and user conversions). I’ve seen many redeveloped websites lose organic rankings and traffic because it takes longer for Google to crawl the new website; unable to crawl pages, Google pulls them out of the index. I’ve also seen many redeveloped websites see a decrease in their conversion rates because metrics like Time to First Byte or First Contentful Paint worsened during the redevelopment.

Because there are so many metrics, it is important you and your development team agree on what speed metrics will be used to evaluate the redeveloped website and gauge success. I’ve seen many instances where the UX or SEO teams measured website speed differently than the development team. Because of the differences in measurement, the UX and SEO teams thought there was a problem with website speed while the development team did not (or vice versa).

Bottom line: Ask your developer how they will measure website speed and what they will do to make sure the redeveloped website will load as quickly as possible.

Final Thoughts

A website redesign presents a lot of opportunities for improved performance. However, it also can present problems if the redesign or redevelopment is not successful. There are more questions beyond these five to make the redevelopment successful, including image transfer, HTML structure, server configurations, and more. Contact me if you would like help making sure your website redevelopment project will not worsen your website’s SEO performance.
