An exploit discovered by Mark Williams-Cook has revealed more than 2,000 properties Google uses to classify queries and websites, as well as specific mechanisms such as consensus scoring and query classification.
Why we care. The findings of this exploit give us even more insights into how Google search works. Earlier this year, we learned a lot from the huge Content API Warehouse leak. Now we’ve gained additional insights into scoring, classification, site quality scores, and more.
Consensus scoring. Google counts the number of passages in content that agree with, contradict, or remain neutral toward the “general consensus.” Google then generates a consensus score, which likely impacts whether you rank for a specific query – especially debunking queries (e.g., [Is the earth flat?]).
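To make the idea concrete, here is a minimal Python sketch of how such a score could be derived from per-passage stance labels. The label names and the agree/(agree + contradict) formula are assumptions for illustration, not Google's actual implementation:

```python
from collections import Counter

def consensus_score(passage_labels: list[str]) -> float:
    """Toy consensus score: the fraction of stance-taking passages
    that agree with the general consensus. Labels are 'agree',
    'contradict', or 'neutral'. Illustrative only; Google's actual
    formula is not public."""
    counts = Counter(passage_labels)
    agree, contradict = counts["agree"], counts["contradict"]
    if agree + contradict == 0:
        return 0.5  # all-neutral content takes no stance either way
    return agree / (agree + contradict)

# Example: a page answering [Is the earth flat?] with five passages
print(consensus_score(["agree", "agree", "neutral", "contradict", "agree"]))  # 0.75
```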
Query classifications. Google categorized nearly all of the queries into eight “refined query semantic classes”:
- Short fact
- Bool (short for Boolean – yes/no questions)
- Other
- Instruction
- Definition
- Reason
- Comparison
- Consequence (Your Money or Your Life, or YMYL)
These classifications determine how Google adjusts its algorithm for specific query types. We have known since 2019 that Google uses different ranking weights for YMYL-type queries.
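Google's real classifier is almost certainly a learned model, but as a purely illustrative sketch, here is how the eight leaked classes might map onto a toy keyword-rule classifier in Python (the rules and function names are assumptions, not anything from the exploit data):

```python
from enum import Enum

class QueryClass(Enum):
    SHORT_FACT = "short fact"
    BOOL = "bool"
    OTHER = "other"
    INSTRUCTION = "instruction"
    DEFINITION = "definition"
    REASON = "reason"
    COMPARISON = "comparison"
    CONSEQUENCE = "consequence"  # the YMYL class

def classify_query(query: str) -> QueryClass:
    """Toy keyword-rule classifier over the eight leaked classes.
    These rules are illustrative assumptions only."""
    q = query.lower().strip("?[]. ")
    if q.startswith(("is ", "are ", "can ", "does ", "do ", "should ")):
        return QueryClass.BOOL
    if q.startswith(("how to ", "how do i ")):
        return QueryClass.INSTRUCTION
    if q.startswith(("what is ", "what does ", "define ")):
        return QueryClass.DEFINITION
    if q.startswith("why "):
        return QueryClass.REASON
    if " vs " in q or q.startswith("compare "):
        return QueryClass.COMPARISON
    if q.startswith(("what happens if ", "what if ")):
        return QueryClass.CONSEQUENCE
    if q.startswith(("who ", "when ", "where ")):
        return QueryClass.SHORT_FACT
    return QueryClass.OTHER

print(classify_query("[Is the earth flat?]"))   # QueryClass.BOOL
print(classify_query("how to fix a flat tire")) # QueryClass.INSTRUCTION
```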
Site quality scores. Google’s results are influenced by site quality scores, according to Williams-Cook. Google also holds a patent on predicting site quality. Quality scores appear to be calculated at the subdomain level, based on:
- Brand visibility (e.g., branded searches, or searches that include the brand’s name).
- User interactions (e.g., clicks, including when the site doesn’t rank in Position 1).
- Anchor text relevance around the web.
Sites that don’t reach a certain threshold (e.g., 0.4 on a 0-1 scale) are ineligible for search features (e.g., featured snippets, People Also Ask).
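As a back-of-the-envelope illustration, a quality score and eligibility check might look like the sketch below. The three signals and the 0.4 threshold come from the findings above, but the equal weighting and function names are assumptions:

```python
FEATURE_THRESHOLD = 0.4  # threshold cited in the findings (0-1 scale)

def site_quality_score(brand_visibility: float,
                       user_interactions: float,
                       anchor_relevance: float) -> float:
    """Toy subdomain quality score on a 0-1 scale, combining the
    three signals listed above. The equal weighting here is an
    assumption; the real formula is not public."""
    return (brand_visibility + user_interactions + anchor_relevance) / 3

def eligible_for_search_features(score: float) -> bool:
    """Per the findings, sites below the threshold are reportedly
    ineligible for features like featured snippets and People Also Ask."""
    return score >= FEATURE_THRESHOLD

score = site_quality_score(0.6, 0.3, 0.5)
print(round(score, 3), eligible_for_search_features(score))  # 0.467 True
```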
Click probability. Google doesn’t use click-through rate directly in ranking. However, it appears Google uses a “click probability” for every organic result.
- “And so it appears that Google does factor in how likely it thinks someone is going to be to click on your result. This would change if we modify the page title. They have a tool that can give you hints about this in the Google Ads Planner, because it will tell you an estimated click-through rate there,” Williams-Cook said.
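As a rough illustration of how such a per-result signal could be blended into ranking, here is a hypothetical re-ranking sketch in Python. The linear blend, weight, and numbers are invented for illustration; nothing here is Google's formula:

```python
def adjusted_score(relevance: float, click_probability: float,
                   weight: float = 0.5) -> float:
    """Toy re-ranking: blend a relevance score with the estimated
    probability that a user clicks this result. The linear blend
    and weight are invented for illustration."""
    return (1 - weight) * relevance + weight * click_probability

# Two results with equal relevance but different titles: the one
# with the higher estimated click probability ranks ahead.
results = [
    {"title": "Widget guide", "relevance": 0.8, "p_click": 0.12},
    {"title": "How to choose a widget", "relevance": 0.8, "p_click": 0.31},
]
results.sort(key=lambda r: adjusted_score(r["relevance"], r["p_click"]),
             reverse=True)
print([r["title"] for r in results])
```

This is consistent with Williams-Cook's point above: rewriting a page title would change the estimated click probability, and Google Ads' planner tools surface an estimated click-through rate that can serve as a rough proxy.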
About the data. Williams-Cook and his team analyzed 2 terabytes of data and more than 90 million queries. Google paid his team $13,337 for discovering the exploit.
The video. This is a must-watch for SEOs. Improving your SEO with conceptual models – Mark Williams-Cook on YouTube.