Search engines perform several distinct steps in order to deliver relevant search results:
Crawling :
It is the process where a crawler visits our website and fetches all the web pages linked to it. This task is performed by a piece of software called a crawler or a spider (or Googlebot, in the case of Google).
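The crawling step described above can be sketched as a breadth-first traversal: fetch a page, extract its links, and queue every link not yet seen. This is a minimal illustration, not how any real crawler is built; the `fetch` callable and the in-memory `site` dictionary are hypothetical stand-ins for a real HTTP client and the live web.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag seen while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch):
    """Breadth-first crawl: fetch a page, then queue each linked URL
    that has not been seen yet. `fetch` maps a URL to its HTML."""
    seen = {start_url}
    queue = [start_url]
    pages = {}
    while queue:
        url = queue.pop(0)
        html = fetch(url)
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages

# Tiny in-memory "website" so the sketch runs without network access.
site = {
    "http://example.com/": '<a href="/about">About</a>',
    "http://example.com/about": '<a href="/">Home</a>',
}
pages = crawl("http://example.com/", site.__getitem__)
```

A production crawler would add politeness rules (robots.txt, rate limits) and error handling, but the visit-extract-queue loop is the core idea.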
Indexing :
It is the process of creating an index for all the fetched web pages and adding them to the search engine's huge database, from which they can later be retrieved. Essentially, indexing means identifying the words and expressions that best describe the page and assigning the page to particular keywords.
Processing :
When a search request comes in, the search engine processes it by comparing the search string in the request with the indexed pages in the database.
Calculating Relevancy :
Since it is likely that more than one page contains the search string, the search engine calculates the relevancy of each matching page, for example by weighing how often the keywords appear in the title, headings, and body of the indexed page.
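One simple way to make the relevancy idea concrete is field-weighted keyword counting: a query term found in the title counts more than the same term in the body. The weights below are assumptions for illustration only; real ranking algorithms use hundreds of signals.

```python
# Assumed weights: a title match counts more than a body match.
FIELD_WEIGHTS = {"title": 3.0, "heading": 2.0, "body": 1.0}

def relevancy(page, query_terms):
    """Score a page by summing a per-field weight for every occurrence
    of every query term in that field."""
    score = 0.0
    for field, weight in FIELD_WEIGHTS.items():
        words = page.get(field, "").lower().split()
        for term in query_terms:
            score += weight * words.count(term.lower())
    return score

page = {
    "title": "search engines",
    "heading": "how engines work",
    "body": "engines crawl and index pages",
}
score = relevancy(page, ["engines"])
```

Here "engines" appears once in each field, so the page scores 3.0 + 2.0 + 1.0 = 6.0 under these assumed weights.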
Retrieving Results :
The last step in a search engine's activity is retrieving the best-matched results for the user. Basically, whoever searches on Google simply gets the most appropriate results shown in the browser.
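The retrieval step above can be sketched by combining the earlier ideas: look up each query term in an inverted index to get candidate pages, score the candidates, and return them best-first. This is a deliberately simplified ranking, not any real engine's algorithm.

```python
def retrieve(index, pages, query, top_k=10):
    """Find candidate pages via the inverted index, score them by raw
    query-term frequency, and return the top matches first."""
    terms = query.lower().split()
    candidates = set()
    for term in terms:
        candidates |= index.get(term, set())
    scored = []
    for url in candidates:
        words = pages[url].lower().split()
        score = sum(words.count(t) for t in terms)
        scored.append((score, url))
    scored.sort(reverse=True)  # highest score first
    return [url for score, url in scored[:top_k]]

# Hypothetical indexed pages, reduced to plain text.
pages = {
    "http://a.example": "web search engines",
    "http://b.example": "search search results",
}
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

results = retrieve(index, pages, "search")
```

Since "search" appears twice on the second page and once on the first, the second page is returned ahead of the first.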
Search engines such as Google, Yahoo, Bing, and Yandex often update their relevancy algorithms dozens of times per month. When you see changes in your rankings, it may be due to an algorithm update or something else outside of your control.
Although the basic principle of operation of all search engines is the same, minor differences between their relevancy algorithms lead to major differences in result relevancy.