Is there a benefit for Search Engine Optimization (SEO) if I link out to relevant sites, even though they don't link back and I don't expect them to? They would not be competitors, but I am always concerned about sending someone away from my site.
For example, if I wrote an article on my website about historic homes in a specific city, I might link out to the city's website, which expands on that.
[filtered from 7 Answers]
Yes, it's an effective way to boost local relevance.
This is the exact strategy industry giants like Expedia and Travelocity use to establish local relevance for each city they go into.
Create a local landing page for the city you're targeting.
Create some posts that are hyper local. Talk about that city and interlink them.
Here are three examples for New York City (NYC) – NYC's Top 7 Attractions, The 5 Tastiest Italian Restaurants, 17 Reasons NYC is the Place to Live.
Mention specific attractions and places within the target city so Google can establish where you're talking about without having to repeat the target city in keywords.
From here link the articles to the city landing page and link out to relevant external sources.
Create a profile on the local chamber of commerce website. Sometimes there's more than one. In Miami, FL, for example, there are four different ones based on the neighborhoods within Dade County.
You can also reach out to local directories by city. Even if the link is nofollow, it still has value because it passes relevance, which helps you rank locally.
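As a sketch of what such a link looks like in the directory's markup (the URL and anchor text here are hypothetical):

```html
<!-- Hypothetical local-directory listing link marked nofollow.
     rel="nofollow" asks search engines not to pass ranking
     endorsement through the link, though the page it sits on
     can still signal local relevance. -->
<a href="https://example-directory.com/miami/your-business"
   rel="nofollow">
  Your Business Name – Miami, FL
</a>
```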
"I am always concerned about sending someone away from my site."
They are going to leave your site one way or another. Send them somewhere good so they'll come back for more.
Let's jump straight into it:
1. Google loves good outbound links as they add value for the readers.
2. Outbound links pass juice. That means the other pages of your site that are internally linked from that particular blog post might receive a little less juice.
Isn't that how Google works? If yes, how do you balance those things? What's your strategy?
Thanks in advance.
Martinez » Naimur Don't do it for Google. Don't do it for SEO. Do it because you want people to appreciate your site and recommend it.
Google is too complicated to try to second-guess and reverse engineer. Just focus on creating the kind of site you yourself would recommend to other people, if you found it in random surfing.
Be a helpful resource.
Raymond » Martinez Good advice, but I have to satisfy my curiosity. If I know it will help with Google juice, I feel less concerned about sending visitors to a site with more info and risking them not coming back. I'm trying to find the balance between doing that and elaborating/paraphrasing myself while giving attribution. It kind of depends on the content, I suppose.
Martinez » Raymond All Googlers have ever said about linking out is that some parts of their algorithms "reward" the practice. They've never indicated what that means.
I think following your curiosity is a good idea. You'll learn something from the experimentation, even if it isn't always what you hope for.
Naimur » Martinez Thanks so much. It was really helpful for me. I'll follow your advice for sure. Have a good day!
Kristine » Raymond It's really simple: just link to authority sites like government, education, or academic sites – no one's leaving your site to go there unless they want to check out your citation or resource.
And make sure you put in your citations on anything that you're writing about informationally.
Again, those are going to be academic or informational sites that your users are not likely to leave your site to go check out.
Think like you are writing a paper.
I always dig your inputs in this group, Michael – can tell you're speaking from experience and a long term/strategic perspective.
Daniel » Oliver It is wise to use some third-party citations to back up your article, but a few things. First, open those links with target="_blank" (plus rel="noopener" for security) so that no one is taken away from your site; instead, a new window opens up for the third-party page. This eliminates what would have looked like a bounce and keeps them on your site longer while they check out the third-party article. Also be aware it does allow your root domain's total link equity to be dispersed to some small degree, so it should also use the nofollow tag unless you are trading link equity with that third party. Crawl budget applies here too. You want to retain as much equity as possible, which is why you want to noindex as many pages as makes sense: pages that have no reason to be in the Search Engine Result Pages (SERPs). Same concept. It's about crawl budget and not diluting your site-wide link equity, reserving it for your most important pages instead. Go linking up third parties all over the place without a nofollow and you will hurt yourself in the ranks for sure, because you're giving the equity away without getting it in return. So that's it …😁
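As an illustrative sketch of the attributes Daniel describes combined on one link (the URL and anchor text are hypothetical):

```html
<!-- Hypothetical outbound citation link.
     target="_blank" opens the third-party page in a new tab;
     rel="noopener" stops the new tab from accessing window.opener;
     rel="nofollow" asks search engines not to pass link equity. -->
<a href="https://example.com/source-article"
   target="_blank"
   rel="noopener nofollow">
  See the original study
</a>
```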
Martinez » Daniel "Also be aware it does allow your root domain's total link equity to be dispersed to some small degree, so it should also use the nofollow tag unless you are trading link equity with that third party."
This is a common misunderstanding. You cannot hoard PageRank. The way it works is that it flows from document to document OR from document to index.
Whether you link out or not, it doesn't stay on the Website.
Another common misunderstanding is that using "nofollow" prevents PageRank from flowing out. Google changed that behavior in early 2008. They deduct PageRank from the page's outflow for every link, regardless of whether it uses "nofollow".
Daniel » Martinez I see. How about this example: say you have a hundred-page website and within two months add another 100 pages, but have no further off-site SEO equity to offset it; your ranks will decline, will they not? In other words, whatever equity was incoming is now diluted and dispersed amongst 200 pages instead of 100.
Martinez » Daniel There's really no such thing as "link equity". It's a pseudo-cool phrase for PageRank-like value. The problem is, we don't know which links pass value or how much. So when you try to create a formula for estimating where the value flows, you don't know if you're dealing with variables of zero value.
There are better things people can do with their time. That's why I talk about "managing crawl" (not crawl budget) and "crawl pathways".
You can create the channels you want visitors (people and crawlers) to use to find more content on your site. You can shape those channels to show those visitors what is important.
That's less precise than dividing link equity, but it's a more reliable way of optimizing for search and user experience.
Daniel » Martinez I'd like a deeper understanding of this. I have only ever done local SEO, but of late the sites I was working on had become big enough that they were beginning to rank internationally. I'm fairly sure, though not certain in what way, that having a really high URL count helps international SEO. I do know, however, that in general having a massive number of URLs requires quite a bit of management, thinking, and strategy in terms of how the site will be crawled, whether that's crawl management as you described or crawl budget as other people think of it; either way.
Martinez » Daniel Only the search engines control their crawlers. Anyone who thinks they are controlling a search engine's crawler is fooling themselves.
You manage CRAWL but not CRAWLERS. I'm not surprised people are easily confusing the two concepts.
You can tell crawlers not to fetch pages via "robots.txt". Good crawlers honor those directives, bad crawlers don't.
You can tell search engines not to index pages. Good search engines honor the request but they have to crawl the pages to see the request.
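As a sketch of the two mechanisms just described, with hypothetical paths: a robots.txt rule that blocks fetching, versus a noindex directive that only works if the page can still be fetched.

```text
# robots.txt — asks good crawlers not to fetch these paths at all.
# A page blocked here can never show its noindex directive.
User-agent: *
Disallow: /internal-search/
```

```html
<!-- Placed in the <head> of a page you want crawled but not indexed.
     The crawler must be able to fetch the page to see this request. -->
<meta name="robots" content="noindex">
```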
You can put more links on every page to help the crawlers find more pages, but while that helps crawlers discover content, the search engines say it makes it more difficult for them to figure out which pages are the most important.
You can put fewer links on every page, but then the search engines have trouble finding deeper content.
All you can do is create links between your documents in such a way that you feel user interest, search engine interest, and any other interest you care about is served about as equally and fairly as possible.
And then you make adjustments as you find the need to. That's what "crawl management" is all about: making choices about link placement and destinations. And it has absolutely nothing to do with "crawl budget".
Daniel » Martinez It's the usual "it depends" … because this will be very subjective, oriented to the site's industry/niche and the dev's intent as to what they are trying to accomplish. If, say, you are very much in a niche, then it's likely there is a lot of "low-hanging fruit" around; therefore, search engines would have a far easier time understanding what is relevant or not. In mainstream industries, however, the game would be completely different, I assume.
Martinez » Daniel It has more to do with the size of the site. Search engines *MAY* have some niche-specific crawling logic, but I doubt it affects most of the Web.
Daniel » Martinez Well, by low-hanging fruit I was indicating that a site would have an easier time regardless, because there's a lot of easy stuff to rank for. That's the distinction I was making when I said niche market. With such a niche-market site, search engines will be less prone to confusion because there aren't 100 million other pages on the net competing. But if you're in a mainstream industry, then you'd probably have to be more strategic and careful. And no, I don't think it's logical to simply always be super strategic and careful, because time equals money. So I think it's a valid distinction to make; it's just good old-fashioned keyword difficulty I'm talking about at the end of the day, I suppose, but wrapped in much deeper context. I like to talk about far-out theory. Ponder the possibilities.
Martinez » Daniel I see what you're saying. You're right.
So much good info here, thanks everyone.
This may satisfy you: Are Outbound Links Important?