How Can We Figure Out Which of the Hundreds of SEO Ranking Factors Drive Mostly Positive Effects?

🤔🤔 So if there are over 200 ranking factors, plus algorithm updates, how do you conduct a "fair" test of Search Engine Optimization (SEO) signals?
In other words, with so many variables, how do you create 'control' sites vs. 'test' sites?
Edit: just to be clear, I'm not talking about figuring out what the ranking factors are. Most half-decent SEO users know what those are at this point. I'm talking more about optimizing for those ranking factors and being able to make and prove hypotheses.


[filtered from 51 💬🗨]

Steven Kang 👑 » Kyle is big on single variable tests. Matt Diggity is good with testing average value. I personally own 2000 sites I test with.

Thanks Steven Kang
With 2,000 sites though, how do you isolate and test a single ranking factor?
Daniel » Meakin
At this stage of the game, it's not really profitable for people with a lot of experience to be trying to confirm a factor's existence. We reasonably know what those factors are. We also reasonably know which ones are the most important. SEO is decades old, and you can mostly learn what the factors are from others.
What brings the ROI in our industry is, and always has been, optimizing for those factors.
Steven tests signals, for example, which helps him identify what is weighted most heavily at any given time. E.g., at a basic level he can tell whether Authority or Relevance is currently most important. That's a lot more useful from an ROI perspective. As is live testing on a live website. Both are way harder than single-variable testing (SVT), but they are definitely more worth it for your average person because the payoff is so high.
You're 💯 right
I misspoke and used the wrong terminology. Yes, optimizing for ranking factors is much more important for ROI than finding out what the ranking factors are.
So I guess my next question is: how do you go about testing the optimization of Authority and Relevance and deciding how to improve them? Is it a case of using lots of sites and picking out specific data points?
Daniel » Meakin
I'm not sure how Steven does it at such scale, but I'm guessing he's purposely leaving certain things out on each cluster of sites he has setup. That way if it performs well you have a better idea of why that might be. I'm not sure if he's ever spoken about it in detail, but I'm sure you could find out if you dig around the search function for an hour or so! Maybe even tags!
Steven Kang 👑 » Daniel Yes, I look for pattern clusters more so than individual single variable tests which are limited to small factors.

The easiest way is via ML/AI. You just need a few thousand dollars for data assimilation and a few thousand to run the AI and create the model.
The challenge here is you need to also classify the type of site and the type of query to fully understand how the algos rank the Search Engine Result Pages (SERPs).

Thanks Jake
Do you have a link for that?
Nothing that is going to directly sate your query.
The closest example that comes to mind is the Multivariate Additive Noise Model…
Using AI to Understand Complex Causation – DZone AI
1. Assimilate (scrape) — query >> SERP >> sites
2. Classify (query type, SERP type, site type). Use onsite model to capture on-site ranking factors.
3. Augment (external SEO tools, apis)
4. Normalize
5. Analyze
6. Adjust and repeat
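The six steps above can be sketched in code. This is only a minimal illustration of the pipeline shape Jake describes, not his actual tooling: the `SerpResult` class, `classify_query` heuristic, and sample data are all hypothetical stand-ins (in practice the assimilation step would scrape live SERPs and the augmentation step would call external SEO APIs).

```python
# Hypothetical sketch of the assimilate -> classify -> normalize -> analyze
# pipeline. All names and data here are illustrative, not any real tool's API.
from dataclasses import dataclass, field

@dataclass
class SerpResult:
    query: str
    url: str
    position: int
    features: dict = field(default_factory=dict)

def classify_query(query: str) -> str:
    """Very crude stand-in for query-intent classification."""
    q = query.lower()
    if any(w in q for w in ("buy", "price", "cheap")):
        return "transactional"
    if any(w in q for w in ("how", "what", "why")):
        return "informational"
    return "navigational"

def normalize(results: list[SerpResult]) -> list[SerpResult]:
    """Scale every numeric feature to the 0-1 range within the result set."""
    keys = {k for r in results
            for k, v in r.features.items() if isinstance(v, (int, float))}
    for k in keys:
        vals = [r.features[k] for r in results if k in r.features]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0
        for r in results:
            if k in r.features:
                r.features[k] = (r.features[k] - lo) / span
    return results

# 1. Assimilate: in practice this would scrape query >> SERP >> sites.
raw = [
    SerpResult("how to fix a flat tire", "https://example.com/a", 1,
               {"word_count": 1800, "backlinks": 120}),
    SerpResult("how to fix a flat tire", "https://example.com/b", 2,
               {"word_count": 900, "backlinks": 40}),
]
# 2-4. Classify and normalize (augmentation via external APIs omitted).
for r in raw:
    r.features["query_type"] = classify_query(r.query)
normalized = normalize(raw)
# 5-6. Analyze: feed `normalized` into a model, adjust, and repeat.
```

The point of normalizing per-SERP rather than globally is that, as the thread notes, different query types rank differently; features only compare meaningfully within a single result set.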
🤔 Interesting, thanks Jake, that's very similar to the method I currently use.
Any recommendations on an on-site model? Using POP currently.
On my similar project, I believe I had roughly 120 on-site factors derived from industry docs and first-hand knowledge.
From a dev point of view (POV), my objective was to make it easy to add additional rules beyond XPath and CSS selectors.

Emoji's are the new ranking factor. 🙂


Nothing in SEO can be isolated, because rankings are proportionate to competition and you can't control your competitors' activity.

Meakin » George
So how can you ever know if you did something right / wrong?
George » Meakin
It's important to try to keep a close eye on the competition. Isolate the sites which share the most similar keywords and compete on a content level… then keep track of the changes they make (such as link acquisition), identify who is improving, use educated guesswork as to what changes are being made, and try to better them. Cutting corners doesn't usually yield long-term stable results, and with the SERPs so volatile due to regular updates, the best you can do is stick to a sensible plan and hope everything falls into place.
Thanks again George
So then the whole idea of testing your SEO theories and practices is a red herring? We're all just making educated (or less educated 😉) guesses!
George » Meakin
Educated guesswork is what everyone can do. Nobody actually knows the exact formula except Google. All we can do is run with strategies that have worked for other campaigns and make adjustments as required. We get good results and see visible impact on most long-term client campaigns, but as discussed, nothing is in isolation, so we can't claim 100% responsibility.


The whole concept of "ranking factors" has been diminishing for the past several years because of algos that basically set aside the factors and re-rank based on factors other than links, etc.
Lists of 200 ranking factors have been a joke played on noob SEO users for years, imo.

Thanks Roger
Much respect for you and your work but I'm confused. You're saying the algorithms set aside factors and re-rank on other factors?
So then; they're still ranking based on factors, right?
I dunno, maybe I've misunderstood?
Roger » Meakin
I've written articles about this on my blog many years ago, and Sugarrae wrote about it a year later when it was discussed at SMX Seattle by a Googler.
This is called the Modification Engine. There's an indexing engine, a ranking engine and a modification engine (to name a few).
Most SEO users don't know about the Modification Engine. I don't know why, I've been talking about it for years and Sugarrae talked about what the Googler said as well.
This is from an article I wrote in 2016, that shows how ranking factors are set aside:
"Google is focused on ranking web pages that directly answer a user's search queries, not on ranking the pages with the most links.
That's why Google has introduced on-page factors such as natural language processing to understand web pages as well as on-page spam algorithms such as Panda to weed out highly linked web pages that are spam.
There are also algorithms that look at personalisation features such as your location in order to further rearrange the SERPs beyond what the Regular Ranking Algorithm would otherwise rank.
In fact, there are even algorithms that study click through rates (CTR) in order to determine what kinds of sites tend to satisfy users for search queries with multiple meanings."
What I meant by queries with multiple meanings is different search intent.
Those CTR algos are still being studied, where they will take the set of ranked pages, and promote a page from page 2 of the Search Engine Result Pages (SERPs) and place it on page 1 of the SERPs because their CTR modeling shows that it will satisfy users.
More from my article:
"Google is arguably a popularity engine now. This is the direct result of Google focusing search results on satisfying users.
Ever do searches and feel that the results aren't expert enough, that the information is on a 101 level?
That's because for that particular query it's possible that more people are satisfied by 101-level answers than they are with an expert answer."
Why 200+ Ranking Signals Matters Less
Here's an article I wrote in 2015 about the modification engine. When I learned about the modification engine that really started to help me put the pieces together about how "ranking factors" were becoming anachronistic.
Mueller recently said, about a sentiment analysis algo from ten years ago, that their systems are always changing. They do thousands of changes every year. So he kind of smiled when asked if they were still using an algo from ten years ago.
Google started talking about 200 ranking factors like around 2006? That's a long freaking time ago.
Ranking factors still exist. But it's not like 2006, when you HAD to have the keyword in a heading tag to rank well. In my latest article on Search Engine Journal (SEJ), Mueller says the heading tag is a ranking factor. But he also says that it only helps Google understand a section of a page. The use of the heading tag has changed from the past, when it was necessary to have keywords in it.
Now, keywords don't have to be in the H1. Anyone who has analyzed top ranked sites for Heading structure has seen this.
Modification Engine and How it Impacts Rankings
This is the article where Mueller talks about sentiment analysis algo and affirms that it is unlikely that they're still using an algo from ten years ago in the exact same way as from ten years ago.
Google Explains Negative Reviews And Rankings – Search Engine Journal
And this is the article about heading tags I wrote last night.
Google: Heading Tags are a Strong Signal – Search Engine Journal
That's awesome! Thanks Roger.
I've heard of that modification engine, and it tends to be borne out in my own experience.
I'll take a look at that article too. I'm guessing they've only introduced more (similar algos) with Medic, the May 4th update, etc., right?
Search is really complex. Boiling things down to 200 ranking factors is absurdly simplistic.
Gary Illyes tweeted something about the indexing system:
"Don't oversimplify search for it's not simple at all: thousands of interconnected systems working together to provide users high quality and relevant results. Throw a grain of sand in the machinery and we have an outage like yesterday."
Gary Illyes on Twitter
Roger » Meakin
Don't call it Medic. 🙂
Some people called it Medic because they don't understand the basics: Google core updates don't target specific industries.
Some of the updates in the past two years were really about understanding what queries really mean, plus understanding what web pages really mean. Mixed in with other things.
Seems like some of the algos from this year MIGHT have a link related component, mixed in with other factors.
Illyes did mention within the past year that they were looking to make Penguin faster.
Meakin » Roger
Ah, gotcha, yes, core algo updates impact different industries in different ways. I tend to use the term 'Medic' as shorthand because I'm usually terrible with dates (just ask the Mrs!)
What makes you think this year's algos may have had a link related element? It makes sense that they would start relying less on links as they improve ranking based on Natural Language Processing (NLP) and search intent etc but I think I've only read one article which attributed May 4th to having a link element (heavily) involved.
Roger » Meakin
Canaries. Keep a watch on different Search Engine Result Pages (SERPs), who is ranking and try to understand the possible reasons for ranking. Those are canaries. So if things change you can review those. Some are in my SERPs, some are SERPs I just watch.
Plus take into account what Googlers say.
So when Illyes said that they were working on rolling out a faster Penguin, that was a clear signal that they were working on link related signals and that a link update was in the works.
Also, you have to consider how last year they changed how rel nofollow is used.
In September 2019, I asked a friend of mine who lives in Japan to ask Illyes why Google changed how they treated rel nofollow, and Illyes confirmed that the reason they did it was to add more data to the link signal.
So Google continues to refine the link signal. Something Google has traditionally done. I like to think of it as minimizing the noise to get to the true signal.
Google Confirms: Changing Nofollow Was About Link Signal – Search Engine Journal


"So if there over 200 ranking factors, plus algo updates, how do you conduct a 'fair' test of SEO Signals?"
You can't. And like Roger says, "lists of 200 ranking factors have been a joke played on noob SEO users for years" – in my opinion, too.

Thanks Martinez
You always talk a lot of sense on these forums. Much appreciated.
So when people say "test, test, test" that's not strictly possible, right?
Roger » Meakin
Yes, it's not possible. The reason it's not possible is that people bring their biases into the testing and see what they expect to see.
Jeff wrote a great article about this.
Jeff has been in Search Engine Optimization (SEO) for years and years, doing highly sophisticated science based enterprise level SEO. He knows his stuff.
Here it is:
Do We Have the Math to Truly Decode Google's Algorithms?
And you might want to read this article that hopefully will help kill those idiotic correlation studies that keep popping up, which are 100% useless for SEO.
Those correlation studies are pure click bait and do not help you understand why Google ranks anything. They never have and have led to some really bad SEO practices in the past.
Why Google Ranking Correlation Studies are Unreliable – Search Engine Journal
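The core objection to correlation studies can be shown with a toy simulation (my illustration, not from Roger's article): a feature can correlate strongly with rank without causing it, because both are driven by a hidden variable. All numbers and the "word count" feature here are fabricated for demonstration.

```python
# Toy demo: "word count" correlates with rank only because a hidden
# authority variable drives both. Correlation studies can't tell the difference.
import random

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient, pure-Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

pages = []
for _ in range(500):
    authority = random.random()                  # hidden driver of rank
    score = authority + random.gauss(0, 0.1)     # what the engine "actually" uses
    word_count = 500 + 3000 * authority + random.gauss(0, 200)  # mere byproduct
    pages.append((score, word_count))

pages.sort(key=lambda p: -p[0])                  # best score = position 1
positions = list(range(1, len(pages) + 1))
word_counts = [p[1] for p in pages]

r = pearson(positions, word_counts)
print(f"rank vs word count correlation: {r:.2f}")
# Strongly negative (top positions have more words), yet padding word count
# would do nothing here: rank is driven entirely by the hidden variable.
```

A correlation study run on this synthetic SERP would "discover" word count as a ranking factor, which is exactly the trap the article describes.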
Added those articles to my weekend reading too! Thanks again Roger.
I've been telling people for 10 years why it's not possible to reverse engineer Google's algorithm. It's not that people bring their biases into the equation – it's that the math doesn't exist to identify hundreds of different signals all working together.
It might become possible with a real quantum computer but I doubt even that would do the trick.
Well, you can frame anything any way you like, but the truth is, SEO users have to a certain extent tested what the ranking factors are and have optimized for them, else we wouldn't know.
You could, for example, take 1,000 pages on your site, all with similar structure and similar keywords (for a mass-page locations site), with keyword search volume between 100 and 500 to account for general competitiveness. Now remove the main keyword from the title tag for 500 of those pages, keep it on the other 500, then reverse the test. If the results correlate exactly as predicted, that would pretty much confirm causation.
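A split test like the one described could be evaluated with a permutation test. This is only a sketch: the `rank_change` function below simulates tracking data (in practice you would pull pre/post positions from a rank tracker), and the effect size is invented for illustration.

```python
# Sketch of evaluating a 1,000-page title-tag split test.
# Ranking changes are simulated; real data would come from a rank tracker.
import random
from statistics import mean

random.seed(7)

def rank_change(keyword_in_title: bool) -> float:
    """Simulated change in ranking position over the test window
    (negative = improved). Hypothetical stand-in for real tracking data."""
    effect = -2.0 if keyword_in_title else 0.0   # invented effect size
    return effect + random.gauss(0, 3)

control = [rank_change(True) for _ in range(500)]   # keyword kept in title
test = [rank_change(False) for _ in range(500)]     # keyword removed

observed = mean(control) - mean(test)

# Permutation test: how often does random relabeling of the 1,000 pages
# produce a gap at least as large as the one observed?
pooled = control + test
trials, count = 2000, 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = mean(pooled[:500]) - mean(pooled[500:])
    if abs(diff) >= abs(observed):
        count += 1
p_value = count / trials

print(f"observed gap: {observed:.2f} positions, p ~ {p_value:.3f}")
```

The permutation test sidesteps distributional assumptions, which matters here because ranking changes are noisy and far from normally distributed; a low p-value says the control/test gap is unlikely to be chance, though (per the thread's caveats) competitor activity during the window can still confound it.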


You know doctors deal with millions of variables and yet they still have science behind them. Mechanics also have to deal with hundreds of possible variables when diagnosing a car.
Yes, it's not as hard of a science as say chemistry, but they still get that 'generally when this goes wrong this can be fixed by that'. It's why experts actually exist.
The problem is that many times those who appear as experts are simply those who know how to sell themselves the best and play the social game.
But that doesn't mean there aren't people who do real experiments and have deep understandings of the inner workings of this extremely complex beast. Back in the day I loved to read Eli's Blue Hat SEO blog. Steven Kang, Kyle, Matt Diggity, Slawski, Andrew, and others all go beyond headlines and pontificating the same tired points, and actually study the algorithm. I've learned from them and it has reinforced what I experience and note.
I'd like to think I follow such a tradition of analytical, intelligent, data-driven decision making. It's such a contrast to what professional regurgitators do.

Doctors don't use correlation analysis to identify pathogens. They take blood and tissue samples and examine them in a laboratory setting. They test things against them. They use X-rays, magnetic resonance imaging scans, ultrasounds, and many other tools.
The SEO community doesn't have anything like that. They make up lists of "ranking factors" and then run correlation analysis on scraped search results.
That's not going to reveal anything about how a complex system works. Without access to the search engine's algorithms and data you don't have blood and tissue samples and you can't design equipment to examine them.
Gorden » Martinez
You are correct that doctors do have the tool of deductive reasoning. And they are superior to how an SEO diagnoses things.
But even then, that's many times only used as a basis. When I went to the doctor a few years back, he saw my swollen, pus-filled tonsils, muttered 'xyz has been going around', and wrote a prescription for an antibiotic. That's multi-point correlational inductive reasoning, and it is perfectly valid.
Many times there's no time for deductive reasoning at all, or the cost benefit to using it is outweighed by the reliability of measurements and time for tests. Plus deductive is always narrow given how many variables exist in real life and how life is constantly changing.
So yes, we aren't as scientific as doctors, but let's not fetishize the level of science in anything other than pure math, chemistry, physics, etc. Most things get hazy in application and entire fields of study have been created based on inductive reasoning. For example psychology, though it too has some level of deductive reasoning, is mostly inductive. I should know, that's my original profession in my home country, and both my sister and her boyfriend are doctors.
SEO users can measure plenty of things, and we also have Google patents to apply inductive reasoning to. Plus, Google will always have the disadvantage of having to show its results, which can be deconstructed, theorized upon, and tested.
That any individual SEO doesn't do this tedious work is up to them. But plenty of us do and we kinda don't appreciate being compared with people who basically just repeat whatever Google tells them.
The problem is the people who are best at SEO are generally not the best at selling it. So those who don't know sell and burn the reputation of the industry.
I will say I have huge respect for those unlike me who don't think analytically but are gods at selling. Sometimes knowledge only works to one's own detriment. It's complicated and nuanced, like most of life. 🙂
Martinez » Gorden
" That's multi point correlation inductive reasoning and it is perfectly valid."
Actually, that's just a doctor following Current Therapy, which is a carefully reviewed set of therapeutic practices that are distributed throughout the medical community. Doctors are not relying on correlation analysis to understand what a new disease is.
Furthermore, deductive reasoning works for Sherlock Holmes, who has read every book and secret scroll known to man. It doesn't work for search engine optimization, especially given that nearly everyone in the industry LACKS the education and work experience they would need to deduce what the algorithms are doing.
You can compensate for that a little bit by reading patent applications and research papers, but if you don't have the basic skills in math and computer science needed to interpret what they are doing, you're not going to get much from reading those documents.
SEO bloggers' attempts to explain how the search engines use vectors and machine learning algorithms are an absolute disaster. People don't know enough to explain these things without relying on massive quotations, and when they do try to boil the stuff down to something short and simple, they get it wrong.
The SEO industry simply isn't qualified to reverse engineer complex systems. It's capable of comparing random experiments in practice to each other and settling on practices that appear to produce good results. But that's a far cry from what the medical community has done. Random experimentation doesn't lead to understanding. It helps a group of people find an easy path to follow, which works until the path hits a dead end.
And every time the online forums and social media platforms light up with people in a panic over lost rankings and traffic, you know another random pathway has come to its end.
Gorden » Martinez
Call it what you will. People have been 'stumbling on random pathways' for years using data driven decision analysis. I'll acknowledge the probability of survivorship bias, but I'd estimate that as a low probability given the consistency of rankings for certain individuals. It's like people who attribute success in any field to luck, sure, that might be so, but unlikely. I guess you can say I think probabilistically.
Also, we have a different understanding of the epistemic nature of knowledge and the concepts of 'inductive vs deductive reasoning'. Which is fine, I'm happy to agree to disagree.
Martinez » Gorden
According to your Linkedin profile, your education is in psychology. That's a very respectable field but my background is in computer science and data processing. And I spent decades working with and analyzing large data systems.
Most of the "data driven analysis" in the SEO industry is unscientific. People are teaching themselves and in many cases studying the videos, blogs, and forum presentations made by other self-taught experts with equal lack of formal training in large systems design, data analysis, algorithms, and a host of other things that go into the mix.
And companies like Google have invented whole new sub-disciplines. You're just not going to understand any of that with correlation analysis on scraped search results.
Gorden » Martinez
You are right, but also not getting it, I just see it differently.
All I need to understand is just enough to manipulate the rankings in a profitable manner. If it's consistent, I'm on top. That's it. In the big days of online poker I made a little money off it. There's risk and reward, and probability.
That applies for most of life for me. Some people see it differently and that's fine.
Martinez » Gorden
No, I get it. Everyone sees it differently.
People manipulate the rankings without any understanding of how the search engines work. You don't need to know what BERT is or when it's used or how it's used to know that if you do X and Y happens that maybe doing more of X will get you more of Y.
That's good enough until your brand of X runs out of Y. And then you hunt around for something else. That's not science. It's just randomness.
People should stop trying to paint it over with fancy terminology and accept it for what it is.
In a couple more months Google will release another big algorithm (they always do sometime in the late summer or early fall) and people will begin weeping and gnashing their teeth again. If they could really reverse engineer all this stuff the way they wish, Google would have shut down years ago.
If this industry would pull together and adopt some standards a lot of these myths about reverse engineering the algorithms would start fading into the woodwork. Reality will sink in when everyone begins to agree on a common lexicon.
Gorden » Martinez
You assume much about me. The fact that I studied psychology has no bearing on this conversation. Again, we have a different understanding of the epistemic nature of knowledge and the differences between inductive and deductive reasoning. You are right about agreeing on a common lexicon, but I'm using words according to their definitions, I'm not a native English speaker, so my understanding is very dictionary based. Maybe yours is a bit more colloquial.
Also, much of your thought process is tinted by what I perceive as all-or-nothing thinking and vast generalizations, which I just don't share in how I process information. We are very, very different people. But I do wish you all the best. Truly, the world is beautiful and I'm about to disconnect now and enjoy it.

