What does a search engine do?

By Dr.-Ing. Erik Neitzel

If you want to improve your website’s rankings for certain keywords, it helps to understand how a search engine thinks and operates. Only then can you gain a deeper understanding of why a Google ranking check delivers the results you see in a ranking checker tool.

A modern search engine like Google essentially does three things:

  1. Crawling: reading the source code of a website from top to bottom, within the limits set by the site’s robots.txt
  2. Analysis: interpreting the crawled website with the search engine’s current technical capabilities
  3. Indexing: including the analyzed website in a directory that returns certain crawled and analyzed pages as search results for certain query combinations, within the limits of the nofollow and noindex attributes of the respective pages
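The three phases above can be sketched as a toy model. Everything here is invented, in-memory data for illustration: a real crawler fetches pages over HTTP, respects robots.txt and its crawl budget, and honors noindex/nofollow.

```python
import re

# Invented example pages; a real crawler would fetch these over HTTP.
PAGES = {
    "https://example.com/": "Welcome to our shop for garden tools",
    "https://example.com/spades": "Spades and shovels for every garden",
    "https://example.com/blocked": "Internal test page",
}
DISALLOWED = {"https://example.com/blocked"}   # simulated robots.txt rule

def crawl(pages, disallowed):
    """Phase 1: read every page the crawler is allowed to visit."""
    return {url: text for url, text in pages.items() if url not in disallowed}

def analyze(text):
    """Phase 2: a crude stand-in for content analysis -- extract lowercase terms."""
    return set(re.findall(r"[a-z]+", text.lower()))

def build_index(crawled):
    """Phase 3: inverted index mapping each term to the URLs containing it."""
    index = {}
    for url, text in crawled.items():
        for term in analyze(text):
            index.setdefault(term, []).append(url)
    return index

index = build_index(crawl(PAGES, DISALLOWED))
print(sorted(index["garden"]))
# → ['https://example.com/', 'https://example.com/spades']
```

Note that the disallowed page never reaches the index at all, which is exactly the failure mode discussed below: a page that cannot be crawled cannot rank.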

We will now look at how a search engine works and what that means in concrete terms for your ranking checks. Even crawling is not as trivial as one might assume, so let’s examine it in more detail.

Crawling as a minimum requirement for analyzing a website

A very special type of visitor arrives at your website at regular intervals: the Googlebot. It is a machine that navigates your website in a polite but targeted manner, collecting data to be analyzed later. Although it will never strain your web server with excessive simultaneous requests, it also spends little time on your site, because other websites are already waiting for it.

It may seem trivial, but blocked crawling is one of the main reasons why supposedly SEO-optimized websites rank poorly on Google: Google was simply never able to visit the website in question. Why? Because you have no sitemap, an incorrect or incomplete sitemap, or you have denied the Googlebot access to certain areas of the website via robots.txt.

It helps to keep in mind what a search engine is trying to achieve. It wants to gain the fullest possible understanding of your website. That includes not only the text of the page, but also its structure (sitemap) and its appearance. The latter is of great importance, not least for the mobile optimization test. Appearance also covers style sheets and any scripts that influence how the page is rendered.

In a modern content management system like WordPress, some of the files required for this live in the folders of the installed plugins. So if you want to make the crawler’s visit more efficient, given the little time it can spend on your site, you quickly run into a problem. The first, and completely wrong, step that many SEOs take is to deny the crawler access to plugin and other important directories.
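A robots.txt with this well-intentioned but harmful kind of rule might look as follows (the paths are WordPress defaults; the file itself is a made-up example):

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /wp-includes/
```

The second and third rules also block the CSS and JavaScript files that live in those folders, which is exactly what Google needs in order to render the page the way a visitor sees it.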

But keep in mind: Google also wants to understand how the page is visually structured, and by excluding the Googlebot from such directories you radically restrict its ability to analyze your website.

So please always make sure that there are no barriers to crawling your website. Free your robots.txt from superfluous Disallow entries.
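You can check programmatically which URLs a given robots.txt would block for the Googlebot using Python’s standard library. The file contents and URLs below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks WordPress plugin folders.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A stylesheet inside a plugin folder is blocked -- Google cannot render with it.
print(parser.can_fetch("Googlebot",
                       "https://example.com/wp-content/plugins/slider/style.css"))
# → False

# Normal article URLs remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/blog/my-article/"))
# → True
```

Running such a check against every CSS and JavaScript URL your pages reference is a quick way to find self-inflicted crawling barriers.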

Analysis of a website in preparation for its indexing

Once a search engine has plowed through your website, its (completely hidden) core task is to analyze that page. Several levels come into play here, and they are exactly the factors to address if you want to improve your Google website ranking:

  1. Technical basis level: Does your website have a clean markup (HTML, CSS) and flawless executability (especially scripts) or is it partially incorrect? Any errors are compensated for to different degrees by different browsers and affect a visitor’s surfing experience.
  2. Semantic level: Can the search engine understand what your page is about? Is there a sitemap? Does your page contain a menu structure and / or categories that make the structure of your page understandable? Are pictures described in a meaningful way (alt attribute)? Are articles linked to one another in such a way that they offer further information and at the same time describe the respective sub-pages in more detail and put them into context? Are the articles exhaustive or just superficial? Is your content relevant to the topic of your page in general?
  3. Compliance level: Is the communication between the client and your web server encrypted? Are all mandatory pages such as imprint and data protection declaration on your page?
  4. Performance level: Does the page only load the resources that are required to render the page? Are these resources loading at an acceptable rate for the visitor?
  5. Mobile optimization: Does your site have a responsive-optimized version that accelerates loading for mobile devices and offers mobile users an ergonomically prepared view?
  …and much more.
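One signal from the semantic level above, descriptive alt attributes on images, is easy to audit yourself. The sketch below uses only Python’s standard library HTML parser; the markup and file names are invented:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> tag with a missing or empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):            # absent or empty alt text
                self.missing.append(attrs.get("src", "?"))

html = """
<img src="spade.jpg" alt="Steel garden spade with ash handle">
<img src="banner.png">
"""

checker = AltChecker()
checker.feed(html)
print(checker.missing)
# → ['banner.png']
```

The same pattern extends to other semantic checks, such as verifying that every page has exactly one h1 heading.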

These and other points are analyzed for each website. The evaluation created here is the basis for the assignment of a certain placement in relation to other pages that contain similar terms.

Rankings are never completely static, as Google (a) tests every keyword against visitor acceptance as part of the analysis of your website, and (b) continuously develops its analysis mechanism.

Indexing as a result of the analysis for a specific search term

For every criterion analyzed, each website examined by a search engine such as Google receives a specific evaluation from the analysis algorithm. Depending on how a user phrases a search query, certain web pages are then returned from a large database for certain search terms.

This database stores the search-term/web-page combinations and the order in which certain web pages are listed for a given search term and its syntactic and semantic variants. This is what we call the index, for example the Google index.

Every index is constantly in motion. Your website’s rankings for certain search terms fluctuate regularly because Google keeps testing how visitors respond to your website. If visitors prefer your site and do not immediately switch to a different search result, the chances are good that your ranking will improve in the future.

However, the total number of visitors is usually kept constant at first. Google in particular has no interest in sending a particularly large (or particularly small) number of users to your website overnight. It does, however, have a very strong interest in permanently ensuring that the visitors whose search queries match your site can actually find you.

A ranking check determines the current position of a keyword in the search engine’s index at the time of the check.
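Conceptually, such a check boils down to looking up where your domain sits in the ordered result list that the index returns for a keyword. The index data below is invented for illustration:

```python
# A tiny invented index: keyword -> ordered list of result URLs.
index = {
    "garden spade": [
        "https://bigshop.example/spades",
        "https://example.com/spades",
        "https://forum.example/thread/123",
    ],
}

def ranking_check(index, keyword, domain):
    """Return the 1-based position of the first result on `domain`, or None."""
    for position, url in enumerate(index.get(keyword, []), start=1):
        if domain in url:
            return position
    return None

print(ranking_check(index, "garden spade", "example.com"))
# → 2
```

A real ranking checker cannot query Google’s index directly, of course; it observes the live result pages and records the positions it finds at that moment.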

The bottom line: Sustainable SEO is optimization for the visitor

What does that tell us, and what does it mean for your SEO efforts? Sustainable search engine optimization is always optimization for the visitor, not for the machine. All algorithm adjustments that Google makes when analyzing and indexing your website will always revolve around the following questions:

“Does the referred visitor match your website? Will they find what they’re looking for?”

If you can build and write a website that people can actually use well and genuinely want to read, you can count on solid and steadily growing placements for your website. Our SERPBOT is at your side with advice and support for your Google ranking checks.

Dr.-Ing. Erik Neitzel