Michael Martinez 👑
Why do people think that links from "irrelevant, off-topic sites" are bad?
I know several of the admins have been around the industry as long as I have and they too have seen all sorts of link assessment algorithms. I will leave a comment below where I share what I think is the origin of this myth.
But let me say that regardless of how one defines a good or natural link, someone else will have tried to emulate that goodness or naturality to death. In Web marketing, every good thing can be poisoned by bad behavior.
My advice to people is to not assume that toxic linking practices make all links of a certain kind "bad". If they are naturally bestowed, without incentive or intention to manipulate search engine rankings, then the links are most likely worth having.
In my opinion, if there is a single root source of the myth that "you should only get links from sites like your own", it would be Jon Kleinberg's HITS algorithm. See the link below for a copy of his paper.
Kleinberg worked for IBM, and he proposed this algorithm in the 1990s. At that time, his hypothesis (supported by some research) was that "hubs" and "experts" (his paper calls them "authorities") existed in many fields of knowledge on the Web.
Experts linked to (reliable, trustworthy) hubs and hubs linked to (reliable, trustworthy) experts in various topics. This is natural behavior because people tend to be aware of others in their fields of expertise, professions, hobbies, etc.
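That mutual-reinforcement idea can be sketched in a few lines. This is a toy illustration of HITS-style scoring, not Kleinberg's exact formulation; the link graph, page names, and iteration count are invented:

```python
# Toy HITS sketch: authority scores come from the hub scores of pages
# linking in; hub scores come from the authority scores of pages linked to.
links = {                      # page -> pages it links to (invented graph)
    "hub1": ["auth1", "auth2"],
    "hub2": ["auth1", "auth2", "auth3"],
    "auth1": [],
    "auth2": ["hub1"],
    "auth3": [],
}

hub = {p: 1.0 for p in links}
auth = {p: 1.0 for p in links}

for _ in range(20):
    # A page's authority score sums the hub scores of pages linking to it.
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    # A page's hub score sums the authority scores of the pages it links to.
    hub = {p: sum(auth[q] for q in links[p]) for p in links}
    # Normalize so the scores stay bounded across iterations.
    na = sum(v * v for v in auth.values()) ** 0.5 or 1.0
    nh = sum(v * v for v in hub.values()) ** 0.5 or 1.0
    auth = {p: v / na for p, v in auth.items()}
    hub = {p: v / nh for p, v in hub.items()}

best_auth = max(auth, key=auth.get)
```

The two scores reinforce each other until they settle, which is why pages heavily cited by good hubs float to the top as authorities.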
Whether that was a good way to judge the quality of Websites is debatable (and not a debate I want to participate in any more). But the Web proved to be more random than the HITS algorithm's model. I'm not saying the model was bad or useless. It just didn't describe the randomness of the Web.
Web marketers naturally seek ways to improve their search referral traffic, and when it became clear that search engines were looking at links, marketers began devising many schemes to increase their backlink profiles.
Some of those schemes involved radically random distributions of links that made it pretty obvious that link manipulation was in play. So I think as their sites were penalized by search engines, they felt stung by the practice of getting links from "irrelevant sites" – even though their sin was the practice of getting UNNATURAL links.
I'll end my comment there.
Why do YOU think people are afraid of links from irrelevant sites?
More evolved and recent than HITS:
It basically describes seed sites that are the "experts" in a specific niche, then measures the distance from the seed to determine some value. (It's basically playing the Six Degrees of Kevin Bacon game – but for web ranking.)
I'm not sure that ever really worked as described, but in there – and in several other patents in recent years – we can tell that Google is still actively trying to work out how to detect relevancy in links and reward accordingly. We may not know how they do it, but I'd have to assume there are some methods of various levels of effectiveness in play.
Now, that doesn't explain the "why people are afraid of links from irrelevant sites" bit. It just explains why links from relevant sites (even at the site level) MIGHT have some improved value over ones from sites that aren't. There's no inferred negative effect from links from sites that are 32 degrees away from Kevin.
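The seed-and-distance idea above can be sketched as a breadth-first search over the link graph. Everything here is invented for illustration (the site names, the decay factor, the graph itself); the actual patent is far more involved:

```python
from collections import deque

# Toy sketch of seed-based link distance: value decays with the number of
# link hops separating a page from a trusted "seed" site.
links = {                      # invented link graph: page -> outbound links
    "seed.example": ["expert.example", "blog.example"],
    "expert.example": ["blog.example", "shop.example"],
    "blog.example": ["shop.example"],
    "shop.example": ["spam.example"],
    "spam.example": [],
}

def link_distance(seeds, graph):
    """Shortest number of link hops from any seed to each reachable page."""
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in dist:
                dist[target] = dist[page] + 1
                queue.append(target)
    return dist

dist = link_distance(["seed.example"], links)
# A simple decaying score: each hop away from a seed halves the value.
score = {p: 0.5 ** d for p, d in dist.items()}
```

Under this kind of model, a page 32 degrees from Kevin just earns a tiny score – nothing in the distance measure itself implies a *penalty* for being far away.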
I think the FEAR bit comes in two forms:
1) Many link strategies that go against Google recommendations can and do often bring on penalties. If you have experienced this before, you might be a bit afraid of a penalty from this. (Well, that and you just don't know any better).
2) The introduction of the Disavow Tool: having Google provide a tool to get rid of crappy links to avoid penalties gets you looking at what might be a crappy link. If you think in black-and-white terms, and a relevant link is a good link, then a non-relevant link must be a crappy one. Therefore, disavow or be doomed.
Ammon Johns 👑 » Truslow
Plus, even before that broke, we knew about TSPR (Topic Sensitive PageRank) as something that had been researched, tested, and published.
Then of course was Teoma, which specifically looked at links between sites that were relevant to the query, and factored those in. Teoma may not have been a lasting success commercially, but that doesn't diminish in any way the brilliance of the technology, nor Google's awareness and understanding of it.
Sidebar: The TSPR thread at Cre8 was one of my most memorable threads of all time there. Bill brought up the patent and a slew of us dug into it for a week figuring out implications. Some very big brains in search weighed in and contributed to that thread. It was the start of my desire to look more deeply into that kind of stuff. (And I'm fairly sure it was very early in Bill's rise to fame for finding and breaking down patents, too – probably not the first, but definitely before he had made a big name for himself in it).
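For anyone who never dug into TSPR: the core twist on classic PageRank is that the random surfer's "teleport" jumps land only on pages labeled as on-topic, which biases the whole ranking toward that topic. A toy sketch, with an invented graph, topic set, and damping factor:

```python
# Toy Topic-Sensitive PageRank sketch: same as PageRank except the teleport
# probability is spread over topic-labeled pages only, not all pages.
links = {                      # invented link graph: page -> outbound links
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
topic_pages = {"a", "b"}       # pages labeled as belonging to the topic
damping = 0.85
pages = list(links)

teleport = {p: (1 / len(topic_pages) if p in topic_pages else 0.0)
            for p in pages}

rank = {p: 1 / len(pages) for p in pages}
for _ in range(50):
    new = {}
    for p in pages:
        # Inflow: each linking page splits its rank across its out-links.
        inflow = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - damping) * teleport[p] + damping * inflow
    rank = new
```

In the original scheme a handful of these topic-biased vectors were precomputed and blended per query, rather than run per page at query time.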
Edward » Truslow
I mentioned TSPR the other day to a 'guru' and the reply was a deafening silence 😃 I just knew they had run off to Google it. I mentioned Hilltop and the same again.
Michael, I also think there was confusion when Google's guidelines stated not to link to bad neighbourhoods (which I have tested in the past, and managed to neg-SEO a test site).
I ran a bucket load of SENuke and GSA links to a test site and the impact was almost unmeasurable.
Once again it's the gurus selling shiny objects.
Michael Martinez ✍️👑 » Edward
I agree about the "bad neighborhoods", but that's linking out (not linking in). And people also exaggerate the "bad neighborhoods" concept in their minds, too. Back in the day it was kind of clear where the bad neighborhoods were.
Now, people routinely buy links and ask if they should disavow the links they didn't buy.
Yes. TSPR was a short-lived term, really. Other things came in shortly after that were already doing that plus 20 other things to achieve better results. But that was the first time (at least in my awareness) where we really sat down and not only considered that Google might be doing something like that – but also looked at how they might go about it.
As for bad neighborhoods, I can't recall having ever heard or seen anything that suggested a way to catch that beyond some sort of manual action at Google. It was always one of those "it'll work great until you get caught" kinda things. And then you'd weigh the value of the short-term gain against the compounding risk that someone might report you over time (especially if it was hurting their ability to rank). Networks of those links tended to be the ones you'd hear about.
It's not really off topic sites – it's a much more focused level than that. It's not even really about an off topic page. It's about the content in and surrounding the link being somewhat relevant.
If you make and sell bumper stickers, and some blog on a fishing expedition site talks about the jeep they went there in and then links to where they got the bumper sticker on the jeep – that link has relevance to the subject there, even though the bumper sticker, the jeep, and fishing don't connect to each other in any grand relevance way. They are relevant in the context in which they were used.
Ammon Johns 👑
Like most Search Engine Optimization (SEO) myths, this one, I am absolutely certain, is an oversimplification and a misunderstanding.
It was never, ever true that links from off-topic pages didn't have value. But it *is* just as true that patents and papers from almost all of the IR community have stated that links between related, on-topic pages have HIGHER value.
Totally agree with this. It makes sense with the whole concept of links being like university paper citations – a citation from another expert in your field is worth more than one from an expert with no knowledge of the subject. How well it's executed in practice is another matter though 🤔
To carry on the research paper concept…
While it seems that a vast majority of the SEO field disagrees with me, I am absolutely convinced that outbound links from "my paper" to trusted and understood sources of data and information helps with ranking. A lot, in some cases.
Even if you aren't an expert in something (like you weren't when you started your research paper in school), citing your sources and having them be recognized foundations of solid information on that subject lends credibility to what you're saying about the topic. So now if you say something new based upon information from several sources – that new info has some credibility and can rank (so long as it's not being drowned out by a lot of people coming to the opposite conclusion from the same sources).
Ammon Johns 👑 » Truslow
Aside from the Hubs and Authorities papers, the value of outbound links was never really discussed, other than in the immensely old outright statement from Google way back when: "What links to you can help but never harm you. What you link to can harm but never help you" – Matt Cutts, around the time of the first Bad Neighbourhood penalties.
This statement, that outbound links cannot help you, really threw me for a long time. Because I'd run experiments, many of them, and had consistently found that good, useful, outbound links would indeed help a page.
I don't know if Matt Cutts said this only in connection to penalties, or if he was knowingly being rather less than fully honest, perhaps meaning that if people added outbound links purely to attempt to rank better it wouldn't work, but I genuinely have to conclude his statement wasn't entirely true, either way.
Sometimes the links ARE the content. I'll bet every single person in the group has an experience where a well curated and described set of links to further reading, or potential tools, or a specific service type in your region, was immensely valuable.
We need those hubs.
Of course, it is possible that Google may not have always rewarded the links themselves (though I find it a tiny bit unlikely), and merely assumed that if a page of links was good content, then people would link to it, so needed no extra recognition.
But in the years since, where so many on-page quality factors apply to content, I find it somewhat unthinkable that there wouldn't be a page quality metric that looked at the value of curated citations as part of the content value of the page.
Truslow » Ammon Johns
Agreed. I've always sort of thought about it this way too…
An inbound link gives a clue as to what OTHERS say your page (or at least something on your page) is about.
An outbound link gives a clue as to what YOU say your page (or something in or near the link) is about.
Much schema markup (especially "sameAs" and other URL-valued properties) works this way too.
Google has never confirmed (and, as you say, has even denied), and I've never seen a patent that hints at it, but I can't imagine a link only has a unidirectional value. It just doesn't make sense.
Michael Martinez ✍️👑 » Ammon Johns
There were a few times after Matt made that statement about bad neighborhoods where he said they had algorithms that rewarded sites for linking out.
Of course, it's too easy to read more into that than he was saying.
But just the fact a page links out with some useful, relevant anchor text alters the page itself. Would it have used that expression without the link? I've long believed there are intrinsic differences between content that links out and content that doesn't link out. Even if they are hard to describe, they are there.
Google's algorithm has evolved. It's no longer about how many links you can get; it's about the quality of the experience you provide for the user with your content, including the content you link to or that links to you. There are guidelines like making sure you fill out your meta description, using the main keyword in the slug, the page title, etc. But honestly? Stop trying to game or cheat the system and just create valuable content – that's how you do great Search Engine Optimization (SEO) now.