SEO Discoverability

November 17, 2020

Before you can do anything else with SEO, you need Google to find your pages. This process of finding pages is called “discoverability”. Discoverability doesn’t seem like it should be too difficult: a bot just needs to find all the various pages contained on your website. However, there are a few different methods you can use to encourage bots to find all the pages—and a few methods that will prevent bots from finding pages.

The Traditional Link

Let’s start with the best method: the hyperlink.

It seems almost too basic, but the anchor (<a>) tag plus the href attribute remains a key to successful SEO. This tag and attribute combination was part of the original HTML specification, and all these years later it is still the main way that bots (and humans) discover the pages on a website. All a bot needs to do is fetch your page’s HTML, scan for <a> tags, and extract the value of each href attribute to locate another page.

<a href="https://www.elementive.com">Elementive</a>

To increase the discoverability of your website’s pages, the first and most basic step is adding more internal links. The more links pointing to a page, the better the chances Google’s bots will find one of those links and discover the page being linked to. Of course, you don’t want to add links where they aren’t relevant; Google has long since learned to judge whether a link is relevant. Links placed within the main content of your pages help discoverability far more than links tucked into the navigation or the footer.

Client-Side Links

Instead of being added directly to the HTML, links can also be added via JavaScript. This is referred to as adding links client-side, within the browser, instead of via the server-side code. It is a common approach for adding links after the page has loaded, often in response to a visitor’s actions. For example, after somebody clicks to learn more about a topic, more content might load, and that content might contain links.
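As a sketch of what this looks like, here is a hypothetical “learn more” interaction (the element IDs and the /services URL are illustrative, not from any real site):

```html
<!-- The link only exists after the visitor clicks the button,
     so a bot that doesn't run this script (or can't replicate
     the click) never sees the link at all. -->
<button id="learn-more">Learn more</button>
<div id="extra-content"></div>
<script>
  document.getElementById('learn-more').addEventListener('click', function () {
    document.getElementById('extra-content').innerHTML =
      '<a href="https://www.elementive.com/services">Our services</a>';
  });
</script>
```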

Links added this way aren’t as reliable a source of discovery because Google’s bots may not always execute the JavaScript successfully, especially if doing so requires some type of visitor interaction that bots can’t replicate. As much as possible, you want links in the HTML delivered by the server-side code rather than added via JavaScript, especially links leading to important pages on your website.

Qualifiers

Since the initial introduction of the anchor tag and the href attribute, the rel attribute has been added to the specification, allowing for greater control in explaining what a link does. These qualifiers help Google’s bots understand your links and the relationship between the linking page and the page being linked to, so Google can handle the links appropriately after discovering them. For example, rel="sponsored" communicates that the link has some type of monetary relationship attached, while rel="ugc" communicates that the link was added by users rather than the site owners.
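Here is what these qualifiers look like in practice (the example.com URLs are placeholders):

```html
<!-- A paid placement: tells Google money changed hands for this link -->
<a href="https://example.com/partner" rel="sponsored">Partner offer</a>

<!-- A link added by a visitor, e.g. in a comment or forum post -->
<a href="https://example.com/user-site" rel="ugc">Commenter’s site</a>
```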

OnClick Links in Buttons or Other Tags

With JavaScript, you can also make other elements on your website clickable, such as a <p> (paragraph) tag or a <button> tag, by adding an onclick attribute to that element. The onclick attribute is best reserved for non-linking functionality, such as highlighting a paragraph when it is clicked or using a button to open a modal/pop-up window. However, you can also use onclick to create a link, turning almost any element into one. For example, this button would take somebody to Elementive’s home page:

<button onclick="window.location.href = 'https://www.elementive.com';">Elementive</button>

It seems like this should work just as effectively as a regular link; after all, the onclick attribute has been around for quite some time. However, Google’s bots rarely (if ever) follow these types of links. If a button is the only link pointing to a page on your website, that page will have trouble being discovered. If you do have to use an onclick link on a button, paragraph, or other element, make sure you also have regular anchor tags with an href attribute linking to those pages.

Of course, you don’t have to resort to this workaround. One route is to style a traditional link to look like a button using CSS. Another is to wrap the <button> in an <a>, though note that the HTML specification technically disallows nesting interactive elements like <button> inside an anchor, even if browsers tolerate it:

<a href="https://www.elementive.com"><button>Elementive</button></a>

XML Sitemap

The final method we’ll discuss here for supporting page discoverability is the XML sitemap. An XML sitemap is a list of all the pages on your website that you want Google to find and crawl. While beneficial for nearly every website, XML sitemaps are particularly helpful for websites that can’t easily add internal links to every page. Maybe there are too many pages to link to, or maybe technical constraints prevent including links within the website’s main content. The XML sitemap ensures Google’s bots are able to see all the pages.
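A minimal XML sitemap following the sitemaps.org protocol looks like this (the /about URL is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.elementive.com/</loc>
  </url>
  <url>
    <!-- One <url> entry per page you want crawled -->
    <loc>https://www.elementive.com/about</loc>
  </url>
</urlset>
```

Once the file is in place, you can point Google to it by submitting it in Search Console or referencing it in your robots.txt file.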

When it comes to the XML sitemap, you want to make sure the pages listed in the sitemap are an accurate portrayal of your website’s pages. You don’t want to list error pages, duplicate pages, or old URLs that redirect elsewhere since those pages aren’t ones you want Google discovering. In the same way, you don’t want to list pages that you’ve disallowed or noindexed because those aren’t pages that Google needs to discover.

Final Thoughts

You want as many paths to discoverability as possible for the pages on your website; the more important the page, the more discovery paths it needs. Simple, traditional links and an XML sitemap listing are the most reliable methods for getting bots to see what pages your website has to offer. Avoid making discoverability a challenge by relying on buttons or JavaScript-based methods for linking to pages.

If you have any questions, please contact me. For more information about technical SEO, check out our new Tech SEO Fundamentals course hosted at TabletWise.
