
SEO Discoverability

By Matthew Edgar · Last Updated: April 21, 2023

Before you can do anything else with SEO, Google and Bing need to find your pages. This process of finding pages is called "discovery," and how easily bots can find your pages is your site's discoverability. To fully discover a website, a bot needs to find all the various pages it contains. There are a few methods you can use to encourage bots to find all of your pages, as well as a few methods that will prevent bots from finding them.

Internal Links: The Importance of Hyperlinks

The hyperlink is the most basic means of helping robots find the pages that exist on your website. A link's destination URL is contained in the href attribute of the standard HTML anchor tag (<a>). In the example below, the href attribute references Elementive's home page (https://www.elementive.com). The anchor text, "Elementive", describes the page being linked to.

<a href="https://www.elementive.com">Elementive</a>

This tag and attribute combination was part of the original HTML specification, and all these years later it is still the main way that bots (and humans) discover the pages on a website. When robots find a link within a web page's HTML, they extract the URL from the href attribute and add it to their database. New URLs found within an href attribute are added to a crawl queue.

To increase the discoverability of your website’s pages, the first and most basic step is adding more links to the pages on your website. Links from one page of your website to another page of your website are referred to as internal links. The more internal links there are referencing a page on your website, the better the chances robots will discover the pages being linked to.

Be careful not to put internal links just anywhere. Make sure internal links are relevant for users; search robots are sophisticated enough to detect links placed purely for their benefit.

Also, be mindful of where links are placed on the website. Putting links in the navigation or the footer won’t help discoverability nearly as much as putting links within the main content of your pages.

Client-Side Links

Instead of being added directly to the HTML, links can also be added via JavaScript. This is a common approach for adding links after the page has loaded, often in response to a visitor's actions. For example, after somebody clicks to learn more about a topic, more content might load, and that content might contain links. Read my article about how JavaScript can affect SEO performance for more points to consider about client-side and server-side code.

Links added this way aren't as reliable a source of discovery for search robots because robots may not always execute the JavaScript successfully. In addition, robots do not interact with the page, so they won't see any links that are added in response to an interaction. As much as possible, you want links in the HTML delivered by your server-side code rather than added via JavaScript, especially links leading to important pages on your website.
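As an illustration, here is a hypothetical snippet (the URL and element IDs are placeholders, not from a real site) where a link only exists after a user clicks. A robot that never runs the script, or that never clicks, would not see the href:

```html
<!-- The "Learn more" link below only exists after a user clicks the button. -->
<!-- Crawlers that only read the initial HTML will never find it. -->
<button id="show-more">Show more</button>
<div id="more-content"></div>
<script>
  document.getElementById('show-more').addEventListener('click', function () {
    // This anchor is created client-side, in response to an interaction,
    // so it is invisible to robots that do not interact with the page.
    document.getElementById('more-content').innerHTML =
      '<a href="https://www.elementive.com/services">Learn more</a>';
  });
</script>
```

A server-rendered version of this same link, present in the initial HTML, would be discoverable without any JavaScript execution.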

Qualifiers

You can also add a rel attribute to hyperlinks. This allows for greater control in explaining what the link does. These qualifiers allow Google's bots to better understand your links and the relationship between the pages being linked from and to, so Google can handle the links appropriately after discovering them. With the attribute rel="sponsored", you are communicating that the link has some type of monetary relationship attached. With rel="ugc", you are communicating that the link was added by users rather than the site owners.
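In the HTML, these qualifiers look like the following (the URLs and anchor text are placeholders). Google also documents rel="nofollow" as a general-purpose qualifier when neither of the other two fits:

```html
<!-- A paid or affiliate link: disclose the monetary relationship. -->
<a href="https://advertiser.example.com" rel="sponsored">Partner offer</a>

<!-- A link added by users, such as in comments or forum posts. -->
<a href="https://example.com/user-site" rel="ugc">A commenter's website</a>

<!-- A general hint that you don't vouch for the linked page. -->
<a href="https://example.com/unvetted" rel="nofollow">Unvetted link</a>
```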

OnClick Links in Buttons or Other Tags

With JavaScript, you can also make other elements on your website clickable, such as a <p> (paragraph) tag or a <button> tag, by adding an onclick attribute to that element. The onclick attribute is best reserved for non-linking functionality, such as highlighting a paragraph after it is clicked or using a button to load a modal/pop-up window.
However, you can use the onclick attribute to turn nearly any element into a link. For example, this button would take somebody to Elementive's home page:

<button onclick="window.location.href = 'https://www.elementive.com';">Elementive</button>

Unfortunately, this does not work as effectively as a link for discoverability because robots do not trigger onclick events. If a button like this is your only link pointing to a page on your website, that page will have trouble being discovered.

If you do use, or have to use, a button onclick (or a p onclick, etc.) for some links on your website, make sure you also have regular hyperlinks with an href attribute pointing to those pages.

Alternatively, consider restyling a link so it looks like a button. The safest route is to style a traditional link like a button using CSS. You may also see a <button> wrapped in an <a>, as below, but be aware that nesting one interactive element inside another is invalid HTML per the specification, so the CSS approach is preferred:

<a href="https://www.elementive.com"><button>Elementive</button></a>
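As a sketch of the CSS route (the class name and styles here are illustrative, not from any particular site), a fully crawlable anchor can be made to look like a button:

```html
<style>
  /* Style a normal anchor so it looks and behaves like a button. */
  .button-link {
    display: inline-block;
    padding: 0.5em 1.25em;
    background: #2a6df4;
    color: #fff;
    border-radius: 4px;
    text-decoration: none;
  }
</style>

<!-- The href is still present, so robots can discover the linked page. -->
<a class="button-link" href="https://www.elementive.com">Elementive</a>
```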

XML Sitemap

The final method we'll discuss here is supporting page discoverability with an XML sitemap. An XML sitemap is a list of all the pages on your website that you want Google to find and crawl. While beneficial for nearly every website, XML sitemaps are particularly helpful for websites that can't easily add internal links to every page, perhaps because there are too many pages to link to or because technical constraints prohibit including links within the website's main content. The XML sitemap ensures Google's bots are able to see all the pages.

When it comes to the XML sitemap, you want to make sure the pages listed in the sitemap are an accurate portrayal of your website’s pages. You don’t want to list error pages, duplicate pages, or old URLs that redirect elsewhere since those pages aren’t ones you want Google to discover. In the same way, you don’t want to list pages that you’ve disallowed or noindexed because those aren’t pages that Google needs to discover.
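For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only live, indexable URLs you want Google to crawl:
       no error pages, redirecting URLs, duplicates, or noindexed pages. -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-04-21</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services</loc>
    <lastmod>2023-03-10</lastmod>
  </url>
</urlset>
```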

Check If Google Has Discovered A New Page

The best way to tell if Google has discovered a URL is to inspect the URL in Google Search Console. The inspection result will tell you whether Google has discovered the URL. From the inspection page, you can request indexing for a URL to encourage Google to crawl it. For more details, check out my video about using the URL inspection tool.

How Many Pages Is Google Discovering?

You can also monitor discoverability in Google Search Console using Google’s Crawl Stats report. In the “By purpose” table included in the crawl stats data, Google states how many crawls were made for the purpose of discovering new content (you can also see how many crawls were for the purpose of refreshing content). If you click on “Discovery” in this table, you will see a list of pages that were discovered in Google’s recent crawls of the website.

Final Thoughts

You want as many paths to discovery as possible for the pages contained on your website, and the more important the page, the more discovery paths it needs. Traditional links and an XML sitemap listing are the most reliable methods you can use to get bots to see what pages your website has to offer. Avoid making discoverability a challenge by relying on buttons or JavaScript-based methods to link to pages.

If you have any questions, please contact me. Or, for more information about technical SEO, check out my new book, Tech SEO Guide.
