
Unlocking Google’s Secrets: Exploring New Evidence on How SEO Works

Understanding how Google works is key to SEO success. SEO experts often rely on published info and a trial-and-error approach to refine their strategies. However, the search giant has historically revealed only limited information about its algorithms. Yet, new insights about Google’s search algorithm and operations emerge regularly, redefining the rules of the SEO game. 

Recently, more in-depth information about Google’s inner workings has surfaced. It indicates that Google’s algorithm might prioritize different factors and use more varied metrics than previously known. 

This article delves into the latest revelations about SEO and how it works, suggesting user engagement and on-site behavior are increasingly crucial. It explores how these elements are becoming pivotal in optimizing websites for Google’s organic search.

Why Google?

When discussing SEO, the conversation centers around Google’s algorithms. Why Google? Because it’s the giant in the room commanding everyone’s attention, with most people barely noticing its competitors. In December 2023, Google held a dominant 91.61% share of the worldwide search market.

In the United States, one of the world’s most diverse and competitive markets, Google’s dominance is somewhat less pronounced.

Still, the number of users who rely on its search engine is more than 10 times larger than that of its closest competitor, Bing.


The remarkable popularity of Google search comes down to two factors: Google’s marketing strategies and a search architecture that works well for virtually all users. We won’t go into detail on the first factor ‒ how exactly Google came to hold its remarkable 90%+ market share. Just remember that Chrome is the most popular Internet browser with Google as its default search tool, and YouTube, the universally used video platform, is Google-owned.

However, taking apart the algorithms Google uses is essential for SEO.

Understanding How Google Works

How Do Google Algorithms Work?

Google’s algorithms are sets of rules and processes that connect users with websites. As a whole, this complex system is designed to deliver the most relevant and useful results. 

Delving into the mechanics of Google’s algorithms: Google stores information about every website it knows of in its index. The index is updated with the help of Google crawlers ‒ software programs that visit websites to collect the data needed for indexing them and retrieving them in response to relevant user queries.

Source: Google

Now, when a user enters a query, the Google engine matches the keywords with the information it has acquired from websites, and once it finds relevant pages (a fraction of a second, provided you have a good Internet connection), it presents them on the screen. Moreover, the results are ordered so that the more helpful sources appear closer to the top of the list.
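To picture this at a very small scale, here’s a minimal Python sketch of an inverted index and keyword matching. It illustrates only the general idea: the page texts, scoring, and function names are invented, and Google’s real index and ranking are vastly more sophisticated.

```python
from collections import defaultdict

# Toy "index": a handful of invented pages standing in for the web.
pages = {
    "page_1": "adopting a rescue dog checklist",
    "page_2": "best pizza places near downtown",
    "page_3": "dog breeds that are good with kids",
}

# Inverted index: each word maps to the set of page IDs that contain it.
index = defaultdict(set)
for page_id, text in pages.items():
    for word in text.split():
        index[word].add(page_id)

def search(query: str) -> list[str]:
    """Return matching pages, ordered by how many query words they contain."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for page_id in index.get(word, set()):
            scores[page_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("dog breeds"))  # ['page_3', 'page_1']
```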

What Is Google’s Ranking Algorithm?

Knowing exactly how the matching and ranking processes work is the holy grail of SEO. Simply having relevant information on a given topic doesn’t guarantee your site will pop up on a SERP, as there are many other variables involved, like the freshness of the content, technical site characteristics, the availability of certain content elements, and others.

Based on the documentation Google has revealed to the public, we can’t be sure we’re aware of all the factors that impact ranking, and we know even less about how these factors interact or complement each other to push sites to the top.

At the same time, why would Google want to reveal all its secrets? So that everyone could qualify for the top ranks, and only the sites with longer histories, or the ones that use Google Ads, would stay visible? Or perhaps there’s something sneaky going on?

In reality, SEO specialists collect bits of official information and combine them with analytical data to piece together the Google puzzle. Meanwhile, Google changes rules regularly, making the puzzle-solving ever so interesting. 

However, in the last month, certain information about Google’s algorithm for ranking web pages has surfaced that was not intended to be seen by those most interested in seeing it (professional SEO consultants). The following chapters take apart Google’s processes, analyze the new evidence, and discuss how it will influence SEO strategies.

Google Algorithm Technologies

Google’s core algorithm includes various components that help sort search results. Some of these systems appear in the official documentation, and Google openly discusses their significance. Others are rarely or never discussed, and the public learned about them from sources other than the search giant’s official communications.

Google Machine Learning Algorithms


Google introduced RankBrain in 2015 as its AI technology for enhanced understanding of user queries. Through machine learning, it continuously improves to better process inputs and resolve ambiguity. In the year it launched, it was already involved in handling about 15% of searches.

Google may use RankBrain for all queries, making it an essential factor for all results. After the results are selected from the index, this AI tool may also influence site rankings as it examines the top 20 or 30 websites.

RankEmbed BERT, or simply BERT, is a natural language processing technology developed and introduced by Google in 2019. It allows for a better understanding of queries, particularly their context. While previous models analyzed each word in a query separately, BERT takes the keyword phrase as a whole and offers users more precise results. For example, if you typed “2019 Brazil traveler to the USA need a visa,” a search engine without BERT wouldn’t know whether you were traveling from the USA to Brazil or vice versa. Now, the search understands the context provided by the syntax.

Source: Google
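A tiny Python sketch makes the difference tangible: a bag-of-words view of the two opposite queries is identical, while even a crude order-aware view (word bigrams) tells them apart. This is only an illustration of why context matters, not how BERT itself works.

```python
# Two queries with opposite meanings look identical as unordered word sets,
# which is why pre-BERT keyword matching could confuse the travel direction.
q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"

print(set(q1.split()) == set(q2.split()))  # True: the word sets are indistinguishable

# Even a crude order-aware view (word bigrams) already separates them;
# contextual models like BERT go much further by reading the whole sequence.
bigrams = lambda q: set(zip(q.split(), q.split()[1:]))
print(bigrams(q1) == bigrams(q2))  # False: the sequence carries the meaning
```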

DeepRank is one of the latest additions to the family of Google’s AI-run algorithms. It applies BERT’s capabilities to ranking. DeepRank is also trained on a large dataset to give it an understanding of common-sense language.

MUM is a further advancement in applying NLP technology to matching queries with search results. Introduced in 2021, it’s considered to be “1,000 times more powerful than BERT” thanks to its multimodality: it can multitask, understands 75 languages, and handles different content formats.

Here’s an example of how MUM can understand a search intent and context of the query:

Source: Google

Naturally, it requires more resources than BERT and is thus used only in selected cases, such as queries related to COVID-19.

Google Search Algorithms Based on User Behavior


NavBoost is a core Google search algorithm that has been gaining public attention recently. This model’s goal appears to be delivering higher-quality results by learning from user behavior. NavBoost accomplishes this by collecting click data, including click-through rates and hover data, and by incorporating data acquired from human evaluators. Interestingly, Google stores information about every query made within the last 13 months and matches it with the results of human-run quality tests to refine its search algorithms.

By learning about user reactions through clicks, Google can compare which websites are more relevant for user queries and refine the results. Additionally, since NavBoost saves click data only for the last 13 months, it effectively prevents websites from being grandfathered in on the strength of old click history.

NavBoost divides its data into sets, or “slices,” by different factors. The ones we know of include desktop/mobile and localization. That essentially means Google may show different results for a query based on the user’s device and location. For example, when searching for pizza places, Google might provide results based on users’ click history for the same query in that area. Similarly, when typing a bank’s name, Google may show results for local branches or online banking, depending on whether the user is searching from a desktop or a mobile device.
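Here’s a minimal sketch of what slicing click data might look like, assuming a hypothetical click log. The fields, URLs, and re-ranking logic are invented for illustration; they are not Google’s actual implementation.

```python
from collections import defaultdict

# Hypothetical click log entries: (query, device, location, result_url, clicked)
click_log = [
    ("pizza near me", "mobile", "chicago", "tonys-pizza.example", True),
    ("pizza near me", "mobile", "chicago", "pizza-blog.example", False),
    ("pizza near me", "desktop", "london", "pizza-blog.example", True),
]

# Aggregate clicks and impressions per (query, device, location) slice.
slices = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # slice -> url -> [clicks, impressions]
for query, device, location, url, clicked in click_log:
    stats = slices[(query, device, location)][url]
    stats[1] += 1
    stats[0] += int(clicked)

def rerank(results, query, device, location):
    """Re-order candidate results by the click-through rate observed in the matching slice."""
    stats = slices.get((query, device, location), {})
    ctr = lambda url: stats.get(url, [0, 1])[0] / max(stats.get(url, [0, 1])[1], 1)
    return sorted(results, key=ctr, reverse=True)

print(rerank(["pizza-blog.example", "tonys-pizza.example"],
             "pizza near me", "mobile", "chicago"))
# ['tonys-pizza.example', 'pizza-blog.example']: local mobile clicks win
```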

Glue is an extension of NavBoost that handles non-textual elements in a similar way. It stores information about users’ behavior when they interact with image carousels, direct answers, maps, and so on.

Neither NavBoost nor Glue is involved in the initial selection of websites for a query. Usually, narrowing the pool down to tens of thousands of results is done without AI-based tools that require extra processing time. After the initial filtering, these two tools (but not only them) trim the results list down to a few hundred and help rank the results and display them on a SERP.

The framework that organizes the results on a page and is responsible for the interface is called Tangram (formerly Tetris). 

Google efficiently uses various sophisticated technologies at different stages of filtering and ranking query results. Ranking and comparing the documents that match a query requires far more time and computation per document than the initial retrieval, which scans millions of indexed websites with lightweight methods. That’s why the initial stage employs only limited AI algorithms.
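The sketch below, with invented function names and scoring, illustrates that division of labor: a cheap lexical filter scans the whole toy index, and a slower per-document scorer (standing in for ML re-ranking) only touches the survivors.

```python
def cheap_filter(query, index):
    """Stage 1: fast keyword lookup across the entire index (millions of pages in reality)."""
    words = set(query.lower().split())
    return [doc_id for doc_id, text in index.items() if words & set(text.lower().split())]

def expensive_score(query, doc_text):
    """Stage 2: stand-in for a slow, per-document model (BERT-style re-ranking)."""
    words = query.lower().split()
    return sum(doc_text.lower().count(w) for w in words) / (len(doc_text.split()) + 1)

def two_stage_search(query, index, top_k=10):
    candidates = cheap_filter(query, index)          # narrows millions down to a manageable set
    ranked = sorted(candidates,
                    key=lambda d: expensive_score(query, index[d]),
                    reverse=True)                    # only the survivors get the costly scoring
    return ranked[:top_k]

docs = {"a": "dog training tips for puppies", "b": "pizza dough recipe", "c": "dog breeds guide"}
print(two_stage_search("dog breeds", docs))  # ['c', 'a']
```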

Google openly discusses and promotes top-notch AI technologies that help understand user intentions and find the most relevant results. However, it shies away from mentioning the tools that examine users’ behavior on the search pages, whether they are older systems like NavBoost or newer ones like Glue.

The software that analyzes users’ past interactions with search results, determining which results are more likely to be clicked for specific queries, significantly enhances the search engine. There are several reasons Google stays tight-lipped about it.

First, such an approach reduces the chances for new or less popular websites to appear at the top of the list, even with relevant content. Second, these technologies give Google more room to manipulate search results, and even though there is no reason to believe it misuses these powers, the mere possibility can lead to unwelcome speculation.

Finally, storing users’ click history invites the “Big Brother is watching you” kind of talk. Meanwhile, Google seeks to distance itself from platforms like Facebook, which tailor content based on users’ personal information and browsing history.

  • RankBrain: AI technology for understanding user queries, using machine learning to improve comprehension and resolve ambiguity.
  • BERT (RankEmbed BERT): Natural language processing technology for better contextual understanding of queries, analyzing keyword phrases as a whole.
  • DeepRank: An application of BERT to ranking, trained on a large number of documents to understand common-sense language.
  • MUM: An advancement in NLP for matching queries with search results, understanding 75 languages and different content formats.
  • NavBoost: Learns user behavior to deliver higher-quality results, collecting click data and using human-run tests to refine algorithms.
  • Glue: An extension of NavBoost handling non-textual elements, such as image carousels and maps, based on user behavior.
  • Tangram (formerly Tetris): Organizes results on a page and is responsible for the interface; part of the filtering and ranking process.

Factors Used by Google Search Algorithms

Google evaluates five key factors to determine which sites should appear on your search screen. Let’s first examine the factors that Google deems crucial and then discuss how they are retrieved, based on the information about Google ranking algorithms. This data is instrumental in developing SEO best practices; a toy scoring sketch follows the list below.

  1. Meaning: The most important factor for a certain result to appear in front of a user is its relevance to the user’s query. Google attempts to interpret the prompt’s meaning, corrects spelling errors, resolves ambiguities, activates a sophisticated synonym system, and identifies a variety of websites that contain related content.
  2. Relevance: If a webpage has the same keywords as your query, that’s a strong relevance signal. But Google’s approach is more nuanced: it also analyzes whether the page’s content is relevant to the query in other ways. So, a search for “dogs” won’t just return pages that repeat the word; it will also return pages with related content like dog images, videos, or descriptions of dog breeds.
  3. Quality: Google prioritizes websites that demonstrate a high level of expertise, authoritativeness, and trustworthiness. The evaluation involves checking whether reputable websites link to the content, suggesting reliability. Another factor is the availability of user reviews on the site. Google also gathers human-generated feedback from the search quality evaluation process to refine its assessment.
  4. Usability: When other factors are equal, Google compares website usability (in the real world, this factor is probably examined in all cases). Usability encompasses many factors, such as accessibility, mobile-friendliness, navigation, load speed, and so on.
  5. Context: In tailoring search results, Google considers context and settings, including your location and search history. For instance, searching “football” will show different sports depending on whether you’re in Chicago or London; or, if you have searched for “Malaga” and in the following query you type “vacation in Spain,” it’ll show you pages and pictures of Malaga, among other results. This customization aims to make search results more relevant.
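The toy sketch below shows one way such factors could be combined into a single score. The weights and signal values are entirely invented; Google does not disclose how, or even whether, it blends factors this way, and “meaning” is treated as query interpretation rather than a per-page score.

```python
# Invented weights for illustration only: Google does not publish how factors are combined.
WEIGHTS = {"relevance": 0.4, "quality": 0.3, "usability": 0.2, "context": 0.1}

def page_score(signals: dict) -> float:
    """Combine per-factor signals (each normalized to 0..1) into one ranking score.
    'Meaning' is omitted because it concerns interpreting the query, not scoring a page."""
    return sum(WEIGHTS[factor] * signals.get(factor, 0.0) for factor in WEIGHTS)

candidate = {"relevance": 0.9, "quality": 0.7, "usability": 0.8, "context": 0.5}
print(round(page_score(candidate), 2))  # 0.78
```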


How Does Google Receive the Metrics?

We know the algorithms Google employs and the factors it evaluates to search, rank, and display results. The missing piece is the specific metrics Google uses to apply these evaluations and where they come from.

On-Page Content


I’ve already mentioned Google crawlers (or bots), the programs that visit sites to collect all the information needed for indexing. They regularly scrutinize textual content, images, videos, and metadata to assess a website’s current state. But not only that: with the help of crawlers, Google can evaluate not only your content but also gather data about website quality. Does your site include a navigation bar? Is it easily accessible for all people? Does it have broken links?
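As a rough illustration of the kind of checks a crawler can run, here’s a small Python sketch using the requests and BeautifulSoup libraries. It is a simplified audit for demonstration purposes, not how Googlebot actually evaluates pages.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def audit_page(url: str) -> dict:
    """Fetch a page and report a few crawler-style quality observations."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    links = [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]
    broken = []
    for link in links[:20]:  # keep the sketch polite: only check a handful of links
        try:
            if requests.head(link, timeout=5, allow_redirects=True).status_code >= 400:
                broken.append(link)
        except requests.RequestException:
            broken.append(link)

    return {
        "has_nav_bar": soup.find("nav") is not None,  # basic navigation check
        "outbound_links": len(links),
        "broken_links": broken,
    }

# Example usage:
# print(audit_page("https://example.com"))
```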

Additionally, crawlers identify whether your website includes links to other websites, possibly ones with a good reputation. NPDigital analyzed 25,000 search terms and found that, on average, the first result on Google search has 3.27x more backlinks than positions #2 through #10.


Thus, whenever Google receives a relevant query, it uses the data provided by the crawlers regarding the site’s content quality and relevance, as well as its usability and trustworthiness. Optimizing on-page content, website structure, and accessibility is crucial for SEO professionals. It is also the one aspect of Google’s evaluation that is entirely within the control of website owners, unlike the two we’ll explore next.

Quality Tests Conducted by Humans


A less-discussed aspect that nonetheless has a huge impact on how Google operates is the regular search engine testing (referred to as quality tests) performed manually by human evaluators. Google reported performing 894,660 search quality tests in 2022.

One goal of these tests is to evaluate proposed changes to Google Search. The other is to evaluate how existing Google search algorithms work by comparing search results to queries. Google states that these tests don’t directly impact actual website rankings but “help us benchmark the quality of our results.” In my opinion, if they can influence how the results are displayed, they can also change rankings.

Google has created a 168-page document of general guidelines for human raters, describing how to properly evaluate the search results Google proposes. Part of the document focuses on understanding user intentions and defining the relevance of the results. However, webpage quality receives exceptional attention in this paper.

Raters are instructed to evaluate each website’s experience, expertise, authoritativeness, and trust using a Page Quality scoring system from 1 to 10. They are encouraged to find reputation information about websites and their creators, check ‘About Us’ and ‘Contact Us’ pages, compare content engagement with other websites, and more.

Though website authoritativeness holds a special place in website evaluation, it is also noted that pages with a perfect reputation shouldn’t be ranked higher than less reputable ones if their content is less relevant to the user’s query.

Source: Search Quality Evaluator Guidelines

You shouldn’t worry that a rater is going to explore your site and ban it from Google’s top ten for not having a contact section. That’s not how these tests work.

Source: Search Quality Evaluator Guidelines

In reality, human evaluators weigh multiple factors when deciding which results are better for a user query. Since they are instructed to prioritize pages with the highest quality scores, the Google search algorithm will also learn to prioritize such results, which makes the website elements mentioned above beneficial.

Consider another piece of information to learn how Google works: all quality tests are conducted on mobile devices. 

In summary, quality tests are essential for fine-tuning Google’s ranking algorithms, helping them rank high-quality content higher. They can also shift overall SEO priorities toward making websites appear more reputable.

Click Signals


The next data source is very important for Google search, yet Google has never explained it to the public.

We already know that Google analyzes user behavior to provide a more personal experience, whether it’s showing results for local businesses or guessing the intentions of desktop users versus mobile ones. But how exactly does Google know which results are more relevant for a particular user group?

Now, when we combine the data from Google’s internal PDF presentations with the testimony of Pandu Nayak (VP of Search at Google), we can have an “Of course!” moment and exclaim, “It’s the clicks!”

Source: U.S. Department of Justice, Trial Exhibit – UPX0228

As mentioned earlier, Google stores a history of all user clicks made in the past 13 months. Until recently, the only user history Google was known to use to enhance its functionality was search history, which allowed it to offer relevant autocomplete suggestions and slightly influence result rankings. However, with billions of click-and-hover records, Google has much greater capabilities than previously recognized. It would be a shame to waste that power, right?

Analyzing information about user impressions helps fine-tune Google’s processes. It functions much like human-conducted quality tests but on a larger scale. The search engine assesses which results are more popular for specific queries and adjusts its algorithms accordingly. 

If you’ve ever wondered why SEO suggests adding essential pages to your site, such as About Us, here’s the answer: Google observes a general preference for fully developed websites. Therefore, if other factors are equal, it ranks the ones with an About Us page higher. 

Click tracking can help Google estimate a site’s bounce rate by calculating the time users spend on the site before returning to the search engine. This kind of data may indirectly influence Google’s assessment of the quality or relevance of web pages. Also, remember that websites using Google Analytics allow Google to gather data on user behavior; there were over 28 million such websites in October 2022, providing an additional impressive dataset for Google to use.
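To show how click timestamps alone could hint at bounce behavior, here’s a hypothetical sketch. The event format, the 30-second threshold, and the logic are invented for illustration; Google’s actual signals and thresholds are not public.

```python
from datetime import datetime

# Hypothetical search-session events: (event_type, result_url, timestamp)
session = [
    ("click_result", "site-a.example", datetime(2024, 5, 1, 10, 0, 0)),
    ("return_to_serp", None,           datetime(2024, 5, 1, 10, 0, 8)),
    ("click_result", "site-b.example", datetime(2024, 5, 1, 10, 0, 12)),
]

BOUNCE_THRESHOLD_SECONDS = 30  # arbitrary threshold, chosen only for this sketch

def dwell_times(events):
    """Estimate dwell time for each clicked result from the time until the next event."""
    results = []
    for i, (event, url, ts) in enumerate(events):
        if event == "click_result" and i + 1 < len(events):
            seconds = (events[i + 1][2] - ts).total_seconds()
            results.append((url, seconds, seconds < BOUNCE_THRESHOLD_SECONDS))
    return results  # [(url, dwell_seconds, looks_like_bounce)]

print(dwell_times(session))  # [('site-a.example', 8.0, True)]
```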

However, Google recognizes that it is essential to examine factors other than click tracking to understand user behavior. A 2016 paper argues that the three factors for evaluating SERPs are clicks, user attention, and satisfaction (hence, the CAS model). Instead of relying solely on clicks, this holistic approach incorporates mouse movements to learn about user attention and accounts for situations where users don’t click on the search results even after viewing relevant items.

Source: A. Chuklin, M. de Rijke – “Incorporating Clicks, Attention and Satisfaction into a Search Engine Result Page Evaluation Model”

The CAS model sheds light on how Google search ranking works and is particularly helpful for predicting user behavior on modern search pages with complex layouts and multiple content types. 
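A toy scoring function in the spirit of the CAS idea might weigh all three signals per SERP item instead of counting clicks alone. The weights and field names below are invented; the paper’s actual model is probabilistic and far more involved.

```python
def cas_utility(item: dict) -> float:
    """Combine click, attention, and satisfaction signals (each 0..1) for one SERP item."""
    return (0.5 * item.get("clicked", 0.0)          # did the user click the item?
            + 0.3 * item.get("attention", 0.0)      # e.g., normalized hover/view time
            + 0.2 * item.get("satisfaction", 0.0))  # e.g., no follow-up query reformulation

serp = [
    {"url": "answer-box", "clicked": 0.0, "attention": 0.9, "satisfaction": 0.8},
    {"url": "result-1",   "clicked": 1.0, "attention": 0.4, "satisfaction": 0.3},
]
for item in serp:
    print(item["url"], round(cas_utility(item), 2))
# The answer box earns a nonzero score with no click at all: the "good abandonment" case CAS captures.
```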

Google describes this process of accumulating information about user behavior to generate results lists as a ‘dialogue’. In a sense, Google search is a continuous conversation between the search engine and its users: on one side, users receive relevant results; on the other, Google reads how users respond through click tracking and other instruments. Consider that Google acquires billions of new data points daily to adjust its matching techniques.

As an example of this approach, one of the presentations describes how search results containing “DVM” (Doctor of Veterinary Medicine) were under-ranked for queries starting with “Dr” (Doctor). The volume of click data suggested to Google’s search algorithms that the “DVM” results belong with the “Dr” ones.

With all the benefits user behavior history brings, it raises one concern. Popular websites are more likely to appear at the top of the list, prompting users to click on them. That click data then causes these websites to rank higher in the future, creating a loop that new websites with similar content find nearly impossible to break into.

Understanding How SEO Works

SEO aims to learn how Google search engine algorithms work and optimize websites to rank higher on search results. This process mainly focuses on enhancing website content, performance, and structure.

By exploring Google’s documentation and reviewing the results of previous adjustments, SEO practitioners have developed several rules:

  • Keyword optimization: Ensuring content includes relevant keywords for search queries
  • Content quality: Creating valuable, informative, and original content
  • Mobile-friendliness: Designing websites to work seamlessly on mobile devices
  • Page speed: Ensuring fast loading times for better user experience
  • Backlinks: Building reputable links from other websites to establish authority
  • User experience: Creating a user-friendly website layout and navigation
  • Meta tags: Using title tags and meta descriptions effectively for better indexing
  • Regular updates: Keeping content fresh and updated

Google regularly updates its algorithms, and SEO specialists keep track of these changes. Some Google search algorithm updates are easy to detect, such as a change in the title tag or meta description length shown on a SERP. Others are not on the surface and are harder to spot. By modifying its rules, Google keeps SEO a dynamic and competitive process.

Reevaluating How SEO Works With the New Evidence From Google

The information about Google’s data structures and algorithms that has become available to the public in the last month or two allows for a better understanding of what’s going on under the search engine’s hood. It doesn’t necessarily mean the rules should be changed, but rather that the focus will shift from on-page SEO to off-page SEO.

Well-known SEO experts have expressed the opinion that the general understanding of how Google works was wrong and that, consequently, the SEO model must change.

Here is the groundbreaking news explaining how Google’s algorithms work:

  • Google puts far more emphasis on user behavior than was previously known. The history of past interactions with search results influences which results are retrieved from the index and how Google’s algorithm ranks them in a list. For Google, user impressions are the main factor for assessing the quality of websites. By learning which websites humans select for which queries, Google gains knowledge of what’s actually on those websites.

Source: U.S. Department of Justice, Trial Exhibit – UPX0203

  • Page quality tests conducted by human raters play a significant role in fine-tuning all of Google’s search algorithms. They help Google understand which website qualities are crucial for evaluating authoritativeness and propagate that knowledge to the algorithms. Human evaluation and user behavior history may now carry more weight than backlinks.
  • The “mobile first” rule is not merely a figure of speech: the raters conduct quality tests only on mobile devices. For example, if you optimize your page’s meta description to look nice on a SERP (i.e., to be fully visible or include keywords in the visible part), judge it by how it looks on a small-screen device.
  • Google assesses technical data about websites not only by crawling them but also by evaluating user behavior on the site. It considers factors such as website performance, accessibility, and the availability of essential elements significant for a positive user experience, and ranks websites accordingly.
  • NavBoost is Google’s not-so-secret weapon, storing information about user behavior for 13 months. That time frame should give webpage owners extra incentive to create new content and update existing pages.

SEO Trends for 2024

Google is evolving and changing its algorithms, impacting how SEO works for websites. However, there’s no need to worry that one day your SEO efforts and all the investment in website visibility will go to waste. The core of Google’s purpose has always been, and will continue to be, providing users with the most relevant and valuable information possible.

SEO strategies focusing on content quality, website usability, and accessibility align perfectly with this goal. Always think of your website visitors, then optimize for the search engines. 

Considering Google’s attention to user behavior, website SEO will focus on the following aspects this year:

  1. Google’s algorithm can measure website performance and accessibility, so their significance for ranking in search results shouldn’t be underestimated.
  2. Websites with improved user engagement factors perform better than their organic competitors.
  3. Website owners should consider how their page snippets appear on a mobile SERP.
  4. Website trustworthiness is the core SEO factor. Building a strong brand can lead to increased recognition and trust.
  5. Google has become better at understanding user intentions and showing personalized results.
  6. Websites should adopt a holistic approach and pay attention to all essential site elements.


Frequently Asked Questions

What are Google algorithms?

Google’s algorithms are complex software systems that interpret the meaning of users’ queries, retrieve the data about websites that contain relevant content from the Google search index, and display it on the search pages. 

What factors does Google’s search algorithm use to evaluate websites?

The factors that impact a site’s ranking among organic competitors on Google search pages include query meaning and context, website content quality and relevance, and website usability. An effective SEO strategy accounts for these factors, ensuring that a website aligns with Google’s algorithms.

How does Google search work with SEO?

Search engine optimization is a web development strategy that aims to improve website ranking in the search engines. Because Google is the most popular search engine, SEO focuses on aligning with Google’s ranking algorithms.

How many algorithms does Google use?

Google uses many algorithms for different aspects of search. The exact number isn’t known by the general public. These algorithms include RankBrain, BERT, NavBoost, Glue, Tangram, and MUM. 

How does local SEO work?

Understanding how local SEO works involves optimizing your website to appear in more relevant local searches. The process includes incorporating local keywords and ensuring your business is listed in local directories.

Phonexa

Phonexa is the leading all-in-one platform for call tracking, lead distribution, email marketing, and digital marketing. The Phonexa staff is responsible for the authorship of Phonexa blog posts.
