AMA: Strategies for Keyword Research, and Concatenating Several Articles Into One Larger Piece

u/deleted

I'm an SEO professional, AMA!

Hey guys!

I know a lot of bloggers have questions about Search Engine Optimization (SEO), especially technical things like URL slugs, how to do keyword research, or what kinds of optimisations exist and how to go about them.

I may not be able to answer everything because I focus on technical SEO, but please fire away!

ModernKamikaze
What's your strategy in doing keyword research?

deleted
That's quite a broad question but I'll break it down into some pretty defined steps:
👉 We set out ranking intent with our clients. So for example, let's pretend they run a shoe eCommerce store: we'll check what they're currently ranking for, what can be improved, and then which new keywords they intend to rank for.
👉 If they don't know what they want to rank for we run their information through an Excel file with some super complex macros that generates a list of keywords from a single keyword. This is similar to Keyword Shitter (a great website) for generating keywords.
👉 All of this information is scraped from sites like SEMrush and other private systems. Once the keywords are generated we take them all and paste them into another file which scrapes the web for search-related data, deltas, projected growth, and organic keywords in the market.
👉 If the keyword "big toe" seemed to be a growing trend in search volume and was pretty low in competition, we'd mark it for our content writers to nail.

Of course this method doesn't work 100% of the time but we will always nail keywords that have good statistics anyway, so even if we don't hit that search growth boom we still have a good article aimed at a good keyword.
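The filtering step described above can be sketched in a few lines of Python (the field names, numbers, and thresholds here are made up for illustration; real data would come from SEMrush or a similar tool):

```python
# Illustrative keyword filter: keep keywords with growing search volume
# and low competition. All values below are invented sample data.
keywords = [
    {"kw": "big toe", "volume_growth": 0.30, "competition": 0.15},
    {"kw": "running shoes", "volume_growth": 0.02, "competition": 0.90},
    {"kw": "toe spacers", "volume_growth": 0.12, "competition": 0.40},
]

def worth_targeting(k, min_growth=0.10, max_competition=0.50):
    """Mark keywords whose search volume is trending up and whose
    competition is low enough for content writers to nail."""
    return k["volume_growth"] >= min_growth and k["competition"] <= max_competition

targets = [k["kw"] for k in keywords if worth_targeting(k)]
print(targets)  # → ['big toe', 'toe spacers']
```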

ModernKamikaze
Thank you for your in-depth explanation to my question.

deleted
Can I concatenate all of the content from a small website into one large piece of content, then post that on another website's blog, then 301 redirect the entire original site to the new blog post on the other site?
As the title states, would you call this an SEO hack?

Can you create a small niche focused site with landing pages of content, then merge that content, transfer it and redirect all the SEO juice to another site where that merged content will be as one blog post?

I know this is kind of hard to understand but I hope someone in Search Engine Optimization (SEO) can give their opinion.

deleted
Ok so what I understand from that is:
👉 You take all of the content from a small site (Site 1) and compress it into a single page document
👉 You post that single page document on another site (Site 2)
👉 You then want to create a 301 redirect Site 1 to Site 2

Fundamentally, it can be done.
First of all you would need access to both sites' .htaccess, which allows you to create 301 redirects. Content under two different domains is ranked separately, so the "duplicate" content on Site 1 and Site 2 will both rank. The consolidated piece on Site 2 might rank better for "abc", while the fragmented Site 1 may have articles that rank for more niche queries.
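For illustration, a whole-site 301 in Site 1's .htaccess might look like this (assuming Apache with mod_rewrite enabled; the domain names are placeholders):

```apache
# Site 1 .htaccess: permanently redirect every URL to the merged post on Site 2
RewriteEngine On
RewriteRule ^ https://site2.example/blog/merged-post/ [R=301,L]
```

Note this collapses every old URL onto one target; redirecting each old page to its matching section of the merged post would preserve more relevance.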

The difficulty will be, with the 301 redirected pages, ranking Site 1 content, but I'll leave that to you to figure out.

There will be problems once Google figures out that it's duplicate content from two different domains however. They don't know that they're both your site, and therefore it appears to them that either Site 1 has copied Site 2, or vice versa.

I'm afraid there's no way to really combat this, because Google will penalise either Site 1 or Site 2 based on which one has the better Domain Authority (DA). The lower DA website will be penalised for having the duplicate content.

Usually a canonical tag would be usable, but those are reserved for content internal to your own domain. This would not work across Domain 1 and Domain 2.

However,
What I have been playing with, which might be a solution to this, is using Structured Data Markup https://seospidre.com/structured-data-markup particularly JSON-LD. JSON-LD allows you to create a sort of "profile" for your website through a script in the <head>. One of the things you can state is "sameAs", which declares that other URLs or websites are related to yours, in a broad way.
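A minimal JSON-LD "profile" of that sort might look like this (it goes in the page's <head>; the name and URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://site1.example/",
  "sameAs": ["https://site2.example/"]
}
</script>
```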

Hopefully that helps!

followme
Rel canonical tags are only for internal use? Are you sure? That's how people syndicate content for news sites/podcasts etc
jaabathebutt
Yeah, I think Medium also uses Canonical tag if you're reposting your content on it
deleted
Ok so rel=canonical was originally developed to reduce duplication of content across the web, but it quickly backfired and I will explain why:

When content is published it comes with the rel=canonical tag in order to be identified as the "original page." This was meant for all domains, so when the rel=canonical was registered by Google it came with a timestamp. This allowed Google to find the "original" by identifying the first timestamp, and it failed for 3 key reasons:

So for this example let's say Site 1 published article X. Site 2, 3 and 4 copy it word for word onto their own sites. Site 1 has the first timestamp, with 2, 3 and 4 following.
👉 It assumes that all information is indexed simultaneously, but indexing requires crawling, which can take days or weeks to complete. Site 1 may have published the content first, but if Site 2 gets indexed first, their canonical tag is the first to be registered and boom, they're the originals.
👉 Changes to content result in an updated rel=canonical. If Site 1 was indexed first, but made changes after Site 2 and 3 were indexed, when Google comes back around a second time it will find that Site 2 is the REAL owner, which doesn't make sense. To combat this they tried a few things, but people would just spit out thousands of articles a day to have the files indexed and then make changes later. This would enable them to have the "original" stamp for thousands of common words and phrases simply by being first.
👉 A lot of information on the web overlaps. If Site 5 accidentally has a few similar sentences, Google could heavily penalise them for having duplicated content. Site 5 would never have done anything wrong, but would be sunk in a Search Engine Result Page (SERP) for real content they generated. Penalisation matters, but penalising a site on the premise that it may have copied info is not a justifiable action.

So what Google decided was to seriously enforce Digital Millennium Copyright Act Policy (DMCA) notices instead, and reserve rel=canonical for marking internal content as duplicated as a courtesy to Google.
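For reference, the canonical tag itself is a single line in the <head> of the duplicate page, pointing at the preferred URL (the URL here is a placeholder):

```html
<link rel="canonical" href="https://example.com/original-article/">
```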
jaabathebutt
Yeah, that explains the duplicate content concept. Thanks man :) I'm into SEO as well. We should stay in touch.


MrSwaby
Which services or tools should every site owner be paying for if they want to have great SEO on their websites?

deleted
Hmm there's a few ways you can answer this.

Would you be looking at tools for a Site Owner who plans to do his own detailed SEO?

Would you want tools external or internal to your Content Management System (CMS) / hosting platform?

I could also give you some tools that an Agency would use (but it might be a bit pricey without the economies of scale an agency has compared to a site owner).

MrSwaby

"In Terms of real, down to earth SEO, not considering User Interface (UI)/User Experience (UX),"

I'm mainly coming from the perspective of a small-time blogger with a few sites looking to do his own detailed SEO. And I mainly use WordPress or straight HTML5.

I suppose Agency tools would be too far out of the budget.
deleted
So after accidentally deleting quite a long post, here is the rehashed shorter version.

Paid tools: use SEMrush under the most basic package. $100/mo, but it's essential.

Free tools:
👉 https://serpstat.com – Variety of SEO tools
👉 https://lsikeywords.com – Latent Semantic Indexing (LSI) keywords
👉 https://smallseotools.com – Variety of SEO tools
👉 https://who.is – Find webmasters
👉 https://datastudio.google.com – Data Visualisation
👉 https://ranktrackr.com/ – Keyword Rank Tracker

Technical Tools:
👉 https://seospidre.com/structured-data-markup – Writing JSON-LD
👉 https://www.xml-sitemaps.com – Cleaning up your Sitemap
👉 https://www.ssl.com – Secure Sockets Layer (SSL) Certification
👉 https://developers.google.com/speed/pagespeed/insights/ – Web Load Speed
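On the sitemap point, a minimal clean sitemap.xml looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/first-post/</loc>
    <lastmod>2021-05-12</lastmod>
  </url>
</urlset>
```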
MrSwaby
Shame about the accidental deletion, but thanks, I appreciate the info.
cuamo
Why do you prefer SEMrush over Ahrefs?
deleted
Honestly it's up to preference. People will go on and on about the slight variation in their functionality but they're almost identical tools. I just like SEMrush.
cuamo
Well, for backlink analysis SEMrush is pretty much useless, as their index is worse even than Moz's. And since those tools ain't cheap, if I had to pay for just one, I'd pay for the tool that covers most of my needs… and that would be Ahrefs.

cats
Does working on a site with several topics make sense vs creating a niche one? I have a digital agency site with a blog that is all over the place, just because it allows me and several other people to polish our writing skills across multiple topics (coding, marketing, security, web design, etc) and helps to quickly figure out what niches and topics are on the rise and drive traffic.

But it's probably a bit confusing to the readers (content is not consistent with any niche, though all posts are about the web and are of great quality). 20 high-quality posts on a niche site are a great asset that you can already monetize, while with 20 posts on an agency site you just know more about how hot various topics are. Does it make sense to proceed like this? Or is it better to narrow down to one niche asap? Or should we use this website to keep figuring out hot topics and branch out niche sites for the best of them?

deleted
Ok so:

You don't need a multi-niche blog to be able to test ranking volatility or growth. Google Trends is a very simple tool for search growth measurement. SEMrush Sensor https://www.SEMrush.com/sensor/?db=MOBILE-ES&category= is a volatility measurement tool, which will let you know what regions / sites are experiencing algorithm updates.

In terms of real, down to earth SEO, not considering User Interface (UI)/User Experience (UX), writer preference, etc, it works like this:

Having a niche site allows better inter-linking of content, better Calls to Action (CTA), better on-page optimisation, and technical optimisation can be applied once to an entire domain. A niche site will improve your site's ability to garner external links, create Ecosystems https://seospidre.com/seo-ecosystems/, and generate awareness towards that specific DA. It will provide for better URL optimisation, slug matching, and "user research" to target your specific consumers. It will even help with Local SEO a tad, by identifying your users and their geographic locations. On the downside, single-niche sites can really suffer from some algorithm changes, and are super dependent on user interest. If people stop caring about abc, then you cease receiving traffic.

Having a multi-niche site allows:

Significantly better flexibility: if you see content shooting up, or you have a great idea for an article, you'd be capable of publishing it. This will increase your statistical probability of having a one-time hit article, rather than having lots of mid-range articles. Multi-niche sites struggle to garner external links, from the lack of awareness in specific areas. Your testing theory can occur through multi-niche sites, and is great for knowing when demand changes occur. For demand testing I recommend setting up a satellite website that focuses on one or two niches, so that you will know it is either shooting up in traffic (hence a demand change is occurring) or suffering and thus changing negatively. With satellite sites you can make on-the-go adjustments to your primary (multi-niche) site to accommodate the changes.

I believe this is very resource-dependent, and if you can afford to have a multi-niche site with one or two satellite sites then 100% do it. But if we considered everything equal with no satellite sites, I would still recommend having your content published on a single multi-niche site with good navigation and a great sitemap.

cats
Thank you! Very helpful.

deleted
Are there any attributes of Search Engine Optimization (SEO) that cater to a higher link click-through rate once a person has arrived on a page? For example, I'm aware keyword research will help drive traffic to a page (say "best grills for sale"), and assuming the page is ranked well on Google, once a person arrives on site is there anything (from an SEO perspective) that can be done to help move people from clicking the link in the Search Engine Result Pages (SERPs) to moving on to affiliate or internal site links more effectively?

deleted
What you're referring to here is ToS (Time on Site) / Bounce Rate Optimisation. There are specialists out there who will actually audit and consult on this particular thing.

While SEO doesn't actually touch on this, I've found that the style of your writing definitely impacts a client's retention rate. So for example, if you're selling something (as an affiliate), you don't come in with a "hey man, you need this product!"

What the more successful clients do is they teach the user about problems they have / don't have, and then subtly introduce solutions, one of which happens to be their own.

A good copywriter will be able to do this as well, but honestly this is not my area of expertise, sorry!

deleted
How can I outrank a site that's already ranking for my target keyword?

deleted
My friend, that's Search Engine Optimization (SEO) for you! Beating the competition out of something that they already rank for. Now there are hundreds of ways to do this, and this is what good SEO really is: pushing out your competition. So instead of rehashing tons of information you can find online, I'll give you a cool thing you can do to leverage their already existing SEO.

Now I'm afraid I'm on mobile right now so I can't hyperlink it, but the term is called Barnacle SEO. You effectively target their keywords and use their brand name in order to leverage yourself into their searches. What happens is you start sucking up 5%, then 10%, then 15% of their content searches.

You would run a Barnacle strategy alongside your standard content creation, off-page, and traffic acquisition schemes in order to hot rod your way into the top 10. From there it's about beating THEM and squeezing out the last 5-10% with technical optimisation.

Barnacle SEO: https://seospidre.com/barnacle-seo

wrath
oh and some more questions if you have the time 😅

we are struggling in answering some of that for ourselves and are just a little bit lost:

1. Instead of chronological order – I started to randomize our category pages every month – could Google penalize me for that? (e.g. https://madewithvuejs.com/charts)

2. Should I maybe add an updatedAt date somewhere on overview/category pages too? Or will Google ignore that?

3. We have obligatory "top-list" articles – is it good to add a year to them? Like "Top Components in <year>" – or better a spring/summer/autumn/winter edition, or even a monthly update?

3a. When I update an Article and update the Date of change, might Google see it negatively if the Update was rather small? Should I then not Update the date every time? Or is it fine to do so?

If this is too much pls ignore me and get onto the others :) – thank you so much for doing this Ask Me Anything (AMA) – only reading up on Search Engine Optimization (SEO) via google/blog posts is hard!

deleted
No worries!
👉 So I assume when you say "randomised category pages" that it means the content under each category is in a non-chronological order. If that's the case, then nope don't worry about it. Google will detect it but doesn't know what to do with that kind of information, it's not boosted nor penalised.
👉 An updated date on a category page as plain text may very marginally improve the time relevancy of your page. This is something that, if it did work, would improve your SEO very, very marginally.
👉 So adding a date to your top articles is a great idea and I would highly recommend it. The timestamp in this event would relate more to your users and what they would be searching for. Are you running something quite volatile where they will want to see weekly or monthly updates on content? Or is it something of greater grandeur where people don't really care when they see it? If the former, publish content more frequently with more precise times (like May <year>), and if it's the latter use larger periods like <year>.
👉 I would recommend updating the Changed date whenever you make a change, no matter how small. Google will detect this but won't penalise nor boost it. However, having a recent date will help with your page relevance!

Hopefully that answers all of your questions in enough detail :)

yonduudanta
I'm learning Web design currently, (HTML, Cascading Style Sheets (CSS), Bootstrap). My question is are there any tools or methods from Search Engine Optimization (SEO) that I can implement in my web design to offer my client a unique edge?

deleted
If you can learn SQL or User / Server interactions (like HTTP Response Codes) and network knowledge it will give you a tremendous edge.

The reason being, if you say "hey man, your critical file call is diluted, you've got 80 chains before it's finished," your client is going to say two things:
👉 wtf does that mean
👉 How do I fix it

If you really understand file calling, directories, and networks you will be able to explain it easily: "It means critical files for the loading of your website are being called after some peripheral stuff. For example, the 'backbone' of the site is being called after a few of the images that are lower on the page."

And then to fix it, you need to reorganise their theme directory, which requires a knowledge of HTML, CSS, and a bit of PHP.

So SQL + PHP would be a good next step for you
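As a tiny example of the HTTP-response-code knowledge mentioned above, here is how an audit script might bucket status codes (a sketch of the general idea, not any particular tool's logic):

```python
from http import HTTPStatus  # stdlib constants for HTTP status codes

def classify(code: int) -> str:
    """Bucket an HTTP response code the way a crawl audit would."""
    if 200 <= code < 300:
        return "ok"            # page served normally
    if 300 <= code < 400:
        return "redirect"      # e.g. a 301 passes link equity to its target
    if 400 <= code < 500:
        return "client error"  # e.g. 404s waste crawl budget
    return "server error"      # 5xx: a hosting/server problem

print(classify(HTTPStatus.MOVED_PERMANENTLY))  # 301 → redirect
```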

dudeblog
Hi, I run a cryptocurrency blog. I was previously on page one for lots of different searches. After the June algo update, my impressions and clicks went down over 90%. Others who run smaller blogs in the sector are also telling me they got hit hard. The interesting thing now is Google is mostly ranking crypto services providers with attached blogs on the top pages now even if they have little engagement, thin content and haven't been updated. I first wondered whether I'd been hit by an algo penalty as someone was spamming one of my blog posts with 1000s of low quality backlinks. It's hard to tell what the issue was. So I took a proactive approach; disavowed spam links and added terms and conditions, cookies policies (things which I maybe missed before that could damage trust). What other strategies could I use to restore my site to its past strong seo performance?

deleted
Ok so there would be two strategies depending on the cause:

Do you believe this was the result of black hat SEO, and thus penalisations have been incurred?

Do you believe this was just an algorithm update that re-ranked your content?

The first would be easier to recover from than the latter, because the latter will require a bit of testing on your front. The recent June updates were pumping up security and reliability of information, so I can imagine that domains within high volatility niches like cryptocurrency are at high risk.

Something you can do, which worked for one of our clients, was to take their single-niche crypto blog and add three more niches, making it semi-multi-niche. They were related topics (biotech, economics, and programming), but the diversity helped them pull out of a spiral.

That might work for you too IF you were only impacted by an algorithm update. At our agency we also run SEO Security where we scrub domains and remove all the penalties, so give me a shout if you want more info on that

dailypupp
I have researched keywords quite extensively and I am comfortable with that side of things, and I have a lot of content in development which is good quality. The site is 3 weeks old and ranking 20-100 in Google for low value keywords but the next group of articles will be targeting very valuable keywords with low competition. Is it worth paying for some backlinks to kick off these keywords? And if so what services do you recommend?

deleted
Paying for backlinks can be super hit and miss. Sometimes you get a great service that will drop 4-5 high Domain Authority (DA) links to your page, shooting you up. Other times you pay for 20-50 mid DA links which all turn out to be super fraudulent sites, and it will tank down your whole domain.

I never recommend buying links, for the sole reason of risk. Technically it shouldn't be done across the internet according to the Webmaster Guidelines, but I mean, it works, right? I don't have a particular service that I use for any of this; because of our clients' reach and diversity we can interlink our content between clients pretty well.

If you're interested in something a little bit more white-hat, drop me a message and I can see about how we get you involved in one of our SEO Ecosystems. https://seospidre.com/seo-ecosystems

eric
Last year I wrote an article and it was ranking very well. We were getting a lot of good targeted traffic. Now my article has been taken over by the competition. Recently I have updated the article with the latest information and bumped it up to 5k words with a good layout. I have created a table of contents to make it easier for users to find relevant information. I have updated the old article and published it. How long would it take to reflect on Google? I mean, my new content is the update on the older one. Did I make a mistake here? What is the best practice to gain back your previous position? Please help. 🙏

deleted
So what's happened to you is SEO amortisation. I'm actually busy writing an article about it as we speak!

Users are provided the most relevant response to their query. Which means sites are based on a "relevancy" scale. Amortisation refers to the depreciation of a non-tangible asset, so SEO amortisation is the loss of "relevancy" due to particular factors. One of the big factors here is time. This of course depends on your "amortisation rate" which is based on your content.

For example if you're running a news platform, and have specified that with Structured Data, or <footer> information, or however, then your content automatically amortises much quicker.
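As a toy illustration of that amortisation idea (the decay model and rates below are invented for the example, not anything Google publishes):

```python
import math

def relevance(initial, monthly_rate, months):
    """Illustrative 'SEO amortisation': relevance depreciating
    exponentially over time, with faster rates for news-like content."""
    return initial * math.exp(-monthly_rate * months)

# A news piece (high rate) vs. an evergreen guide (low rate) after 3 months:
print(round(relevance(100, 0.5, 3), 1))   # news-like content decays fast
print(round(relevance(100, 0.05, 3), 1))  # evergreen content barely moves
```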

Now because of recent algorithm updates, having a single article with 5k words will probably do more damage than good. Google is leaning now more towards smaller, concise content which provides specific answers for specific keywords.

It could also be a geographic thing, better competitor optimisation, higher DA's, maybe external links to that specific article. So for you to check, here's some things to quickly check about your competitors:
👉 Are they using different keywords than you are?
👉 How are their meta titles structured? Is there something you can use there?
👉 How do their meta descriptions differ from yours?
👉 Is their content laid out differently than yours?
👉 Run a few scans on their DA, age, and general content
👉 Check out if they've been running any illegal link-building schemes which you can report to Google
👉 What's your web load time vs their web load time?
👉 Do they have technical optimisation which you don't?
👉 Are they targeting SEO locally or taking a global shot?

Just some things that will allow you to compare what you have! Regardless of your competitors, our client sites have been ranking better for concise niche articles, so it might be worthwhile testing some broken-up pieces of content.

shakiredditor
Hi,

Thanks for your effort! :)

I have two questions:

(A) – This is about interlinking. I am writing a lot of sneezing-related articles on my site and I am thinking about taking every 10 of them and linking each one from the other 9, and doing that for every post. Will it hurt me?

(B) – When building links for your site, what metrics do you look at to make sure a site won't hurt your ranking?

Thanks again for this! =)

deleted
A) Internal links are super important! They help with a ton of SEO factors and will definitely improve your content ranking overall. Google seems to recognise up to around 12 internal links, and after that it doesn't matter so much anymore, so if you can hit around 12 internal links per post that's great.
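If you want to count your internal links per post, here is a quick stdlib-only Python sketch (the sample HTML and hostname are made up):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Count internal vs. external <a href> links in an HTML page."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative URLs and same-host URLs count as internal links
        if host == "" or host == self.site_host:
            self.internal += 1
        else:
            self.external += 1

html = '<a href="/post-2">next</a> <a href="https://other.example/">ref</a>'
counter = LinkCounter("my-blog.example")
counter.feed(html)
print(counter.internal, counter.external)  # → 1 1
```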

B) In order to check the validity of external links, check out their site and ensure the following:
👉 They have SSL certification (HTTPS, the little lock icon in the top near their URL)
👉 They have a variety of content
👉 Check their comments per post, make sure it's not filled with spam
👉 Check if you can create account profiles, and if you can, check a few of the profiles to make sure they're not randomly generated

A thing I've noticed is that if a competitor is trying to take you out with black-hat SEO by sinking your ranking, you can tell because they're always targeting a specific page of your site. For example you'll have a few hundred new random backlinks coming from a variety of sites, all pointed towards your home page: "/".

If you see a sudden increase in backlinks, all pointed to a single page on your site, and you THINK they're fraudulent, it's best to "disavow" them, and make sure that Google knows you're not affiliated with them. Check out SEO Security actions here
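For reference, Google's disavow file is a plain text list with one entry per line (the domains and URLs here are placeholders):

```
# Disavow everything from a spammy referring domain
domain:spammy-links.example
# Or disavow a single page
https://another-site.example/bad-page.html
```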

littlelion
Link building: what's the best approach to it?

In the past, I've focused on writing guest blog posts for high authority websites but would like to diversify my link building strategy.

TIA!

deleted
Blog posting is a good place to start.

Google has made algorithm adjustments to improve the ranking of sites with infographics, internally generated graphs, and pictures. Coincidentally, it also happens to be the kind of content that replicates the most quickly throughout the web, ranking just below sensational media.

So what most people don't know is that if you're the original publisher of an image, it always includes a small tag which links it to your site. Not necessarily in the sense that you can click on it and it hyperlinks to yours, but more of an "indicator" to Google that this image has been reused, and comes from xyz source.

That small tag there actually counts quite a lot for image ranking, which occurs entirely separately from website ranking! Image ranking also spills over to website ranking (marginally) and therefore just by having good infographics that get copied over onto other sites, can already begin to boost your DA by loads.

There's other strategies like buying them, using Web 2.0 sites, expired domains, creating accounts on other sites and linking them back to yours, and guest blogging (which you're already doing).

However, the image thing is very unheard of and carries great results, give that a shot!

littlelion
Thank you!!!! Honestly, I've never heard of the image thing you mentioned so that's definitely something I'm going to try.

In terms of an indicator, do you mean something like including your logo on the image or adding your domain as a text overlay onto the image?

Thanks again!
reigorius

So what most people don't know is that if you're the original publisher of an image, it always includes a small tag which links it to your site. Not necessarily in the sense that you can click on it and it hyperlinks to yours, but more of an "indicator" to Google that this image has been reused, and comes from xyz source.

I'm a little confused. Do you, as the blog owner and creator of the image, create a link to your own site? Or do you hope that when your image gets used, someone will tag your site? Also, where does that tag go, and what does it look like?
deleted
So since RankBrain (an AI snippet in Google's code) was added late <year>, Google got pretty good at image recognition. When an image is first uploaded to the web, and indexed, Google remembers the file name, and breaks the image into what I believe is 4×4 pixels.

So with your https://website.com/pictures/file.jpg, and your semi-recognisable 4×4 pixel image, Google moves onto the next site! If it indexes a similar file name, so https://website2.com/pictures/file.jpg it says, "hey didn't I just see that?" So it goes and quickly compares the 4×4 pixels of the new image to the other image. If they match then Google makes a small footnote that says "this image comes from site 1 under file.jpg." Over time these can accumulate into quasi-backlinks, just like social links, to your site.
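The downscale-and-compare idea can be sketched as a crude "average hash" (this illustrates the general perceptual-hashing technique only; it is not Google's actual algorithm):

```python
def average_hash(gray, size=4):
    """Crude perceptual hash: downsample a square grayscale image
    (given as a list of rows of 0-255 values) to size x size by block
    averaging, then threshold each cell against the overall mean.
    Similar images yield similar bit strings."""
    n = len(gray)
    block = n // size
    cells = []
    for by in range(size):
        for bx in range(size):
            vals = [gray[by * block + y][bx * block + x]
                    for y in range(block) for x in range(block)]
            cells.append(sum(vals) / len(vals))
    mean = sum(cells) / len(cells)
    return "".join("1" if c >= mean else "0" for c in cells)

# 8x8 test image: left half dark, right half bright
img = [[0] * 4 + [255] * 4 for _ in range(8)]
print(average_hash(img))  # → 0011001100110011
```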

There is no identifying information in the <meta> information of the image itself that allows you to do this. Now I'm pretty confident this is still in use due to the recent updates focusing on information privacy, Digital Millennium Copyright Act Policy (DMCA), and ownership.
reigorius
Great stuff! So basically you don't need to watermark or put a banner on the bottom of your image for Google to recognise the origin. You just need to be the first to get indexed, right?
deleted
Correct, but I would include it in the event you need to make a claim, or just for branding purposes. A lot of high Domain Authority (DA) sites don't watermark or anything, in order to encourage content reposting; that way they build themselves up every time it happens.


