Search engines such as Google and Bing use bots to crawl pages on the web, going from site to site, collecting information about those pages and putting them in an index.
Think of the index like a giant library where a librarian can pull up a book (or a web page) to help you find exactly what you’re looking for at the time.
Next, algorithms analyze pages in the index, taking into account hundreds of ranking factors or signals, to determine the order pages should appear in the search results for a given query.
In our library analogy, the librarian has read every single book in the library and can tell you exactly which one will have the answers to your questions.
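The crawl-index-rank flow described above can be sketched with a toy inverted index. Everything here is illustrative: the hardcoded pages stand in for crawled content, and a simple term-count score stands in for the hundreds of signals real search engines weigh.

```python
from collections import defaultdict

# Hypothetical mini-corpus standing in for pages a bot has crawled.
pages = {
    "example.com/apples": "apples are red and apples are sweet",
    "example.com/oranges": "oranges are orange and sweet",
    "example.com/fruit": "apples and oranges are fruit",
}

# Build the inverted index: each word maps to the pages containing it,
# along with how often it appears there (one toy ranking signal).
index = defaultdict(dict)
for url, text in pages.items():
    for word in text.split():
        index[word][url] = index[word].get(url, 0) + 1

def search(query):
    # Score each page by summing term counts for the query words,
    # then return URLs ordered best match first.
    scores = defaultdict(int)
    for word in query.split():
        for url, count in index.get(word, {}).items():
            scores[url] += count
    return sorted(scores, key=scores.get, reverse=True)

print(search("sweet apples"))
```

A real engine replaces the term-count score with far richer signals (links, freshness, page experience), but the shape is the same: look up query words in the index, score the matching pages, and return them in ranked order.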
Our SEO success factors can be considered proxies for aspects of the user experience.
It’s how search engines estimate how well a website or web page can give searchers what they’re searching for.