November 17, 2020
Before you can do anything else with SEO, you need Google to find your pages. How easily bots can find those pages is called “discoverability”. Discoverability doesn’t seem like it should be difficult: a bot just needs to find all the pages contained on your website. However, there are a few methods you can use to encourage bots to find all your pages, and a few that will prevent bots from finding them.
The Traditional Link
Let’s start with the best method: the hyperlink.
It seems almost too basic, but the anchor (<a>) tag plus the href attribute remains a key to successful SEO. This tag-and-attribute combination was part of the original HTML specification, and all these years later it is still the main way that bots (and humans) discover the pages on a website. All a bot needs to do is fetch your page’s HTML, scan for <a> tags, and extract the value of each href attribute to locate another page.
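For example, an ordinary in-content link gives bots everything they need: a crawlable URL in an href attribute inside an <a> tag. (The /blog/ path here is just illustrative.)

```html
<p>
  Read more about how crawlers work on our
  <a href="https://www.elementive.com/blog/">blog</a>.
</p>
```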
To increase the discoverability of your website’s pages, the first and most basic step is adding more internal links. The more links you have pointing to the pages on your website, the better the chances Google’s bots will find those links and discover the pages they point to. Of course, you don’t want to add links where they aren’t relevant; Google long ago figured out how to tell relevant links from irrelevant ones. That is, putting links in the navigation or the footer won’t help discoverability nearly as much as putting links within the main content of your pages.
Some time after the anchor tag and the href attribute were introduced, the rel attribute was added, which allows for greater control in explaining what a link does. These qualifiers help Google’s bots better understand your links and the relationship between the linking page and the linked page, so Google can handle each link appropriately after discovering it. For example, with rel="sponsored" you are communicating that the link has some type of monetary relationship attached, while with rel="ugc" you are communicating that the link was added by users rather than the site owners.
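In markup, these two qualifiers look like this (the destination domains are placeholders):

```html
<!-- A paid placement: the rel value flags the monetary relationship -->
<a href="https://advertiser.example.com" rel="sponsored">Our partner</a>

<!-- A link added by a visitor, e.g. in a comment section -->
<a href="https://commenter.example.com" rel="ugc">Commenter’s site</a>
```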
OnClick Links in Buttons or Other Tags
Anchor tags aren’t the only way to create a link: you can also use the onclick attribute. By adding an onclick attribute to HTML elements like <p> or <button> (or other tags), you can make any element behave like a link. For example, this button would take somebody to Elementive’s home page:
<button onclick="window.location.href = 'https://www.elementive.com';">Elementive</button>
It seems like this should work just as effectively as a regular link; after all, the onclick attribute has been around for quite some time. However, these types of links are rarely (if ever) followed by Google’s bots. If a button like this is the only link pointing to a page on your website, that page will have trouble being discovered. If you do have to use onclick links (on a <button>, a <p>, or any other element), make sure you also have regular anchor tags with an href attribute linking to those pages.
Of course, you don’t have to resort to this workaround at all. One route is to style a traditional link like a button using CSS. Another is to wrap the <button> in an <a> tag.
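For example, either of these approaches keeps a crawlable href in place (the button class is a stand-in for whatever styles your site already uses):

```html
<!-- Option 1: a real link styled to look like a button -->
<a href="https://www.elementive.com" class="button">Elementive</a>

<!-- Option 2: wrapping the button in an anchor. Browsers render this,
     though the HTML spec technically disallows nesting interactive
     elements, so option 1 is the safer choice. -->
<a href="https://www.elementive.com"><button>Elementive</button></a>
```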
The XML Sitemap
The final method we’ll discuss here for supporting page discoverability is the XML sitemap. An XML sitemap is a list of all the pages on your website that you want Google to find and crawl. While beneficial for nearly every website, XML sitemaps are particularly helpful for websites that can’t easily add internal links to every page. Maybe there are too many pages to link to, or maybe technical constraints prohibit including links within the website’s main content. The XML sitemap ensures Google’s bots are able to see all the pages.
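A minimal sitemap is just an XML file listing one <url> entry per page, using the standard sitemaps.org namespace. The URLs and date below are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.elementive.com/</loc>
    <lastmod>2020-11-17</lastmod>
  </url>
  <url>
    <loc>https://www.elementive.com/blog/</loc>
  </url>
</urlset>
```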
When it comes to the XML sitemap, you want to make sure the pages listed in the sitemap are an accurate portrayal of your website’s pages. You don’t want to list error pages, duplicate pages, or old URLs that redirect elsewhere since those pages aren’t ones you want Google discovering. In the same way, you don’t want to list pages that you’ve disallowed or noindexed because those aren’t pages that Google needs to discover.