After crawling a website, search engine robots process and evaluate every page and file found during the crawl; this is called indexing. You need to make sure robots can correctly interpret and process every page and file on your website: if a robot cannot process a page, that page cannot rank in search results. The resources below will help you find and fix issues that prevent robots from indexing your website.
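One common indexing blocker is a `noindex` directive in a page's robots meta tag, which tells robots not to add the page to their index. As a rough illustration of the kind of check an audit tool performs, the sketch below scans a page's HTML for such a directive using only the Python standard library (the function names `RobotsMetaParser` and `is_indexable` are made up for this example; real crawlers also honor the `X-Robots-Tag` HTTP header and robots.txt):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directives found in <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                # A content value like "noindex, follow" holds several
                # comma-separated directives; normalize each one.
                self.directives.extend(
                    d.strip().lower()
                    for d in (attrs.get("content") or "").split(",")
                )


def is_indexable(html: str) -> bool:
    """Return False if the page carries a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" not in parser.directives


page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_indexable(page))  # False: robots are told not to index this page
```

A check like this only covers the on-page signal; a full indexability audit would also fetch the live URL and inspect its HTTP status code and response headers.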