Indexing

After crawling a website, search engine robots process and evaluate every page and file found during that crawl. This is called indexing. You need to make sure robots can correctly understand and process every page and file on your website. If a robot does not understand a page or cannot process it correctly, that page will not be able to rank in search results. The resources below will help you find and fix issues that prevent robots from indexing your website.
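For example, one common indexing blocker is a stray noindex directive. The sketch below is a minimal Python illustration, not a definitive tool: the URL is hypothetical, and it assumes the third-party requests library is available. It checks both the X-Robots-Tag response header and any robots meta tag on the page.

# Minimal sketch: check a page for noindex directives.
# Assumptions: the `requests` library is installed; the URL below is hypothetical.
import requests
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags found in the HTML."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def is_indexable(url):
    """Return False if the page carries a noindex directive in either the
    X-Robots-Tag response header or a robots meta tag."""
    response = requests.get(url, timeout=10)
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        return False
    parser = RobotsMetaParser()
    parser.feed(response.text)
    return not any("noindex" in directive for directive in parser.directives)

if __name__ == "__main__":
    # Hypothetical URL for illustration only.
    print(is_indexable("https://example.com/"))

A check like this only covers on-page and header directives; pages can also be kept out of the index by robots.txt rules, canonical tags, or server errors, which the resources below cover in more depth.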

Crafting Delightful Broken Experiences

Like it or not, technology isn’t perfect. We need to create friendly and helpful error messages that let people recover when something goes wrong.