Unique Keyword Research

Dolman
What percentage of searches online are not being reported by Google Keyword Planner (GKP), GKP Forecaster, Google Search Console, or any other keyword research tool?
Google says 15% of the searches it sees each day are unique, and keywords averaging fewer than 10 searches per month typically show no search volume at all.
So… let's say that after you've completed your keyword research you end up with a list of 500 keywords that are searched a total of 10,000 times per month.
What percentage of keywords and search volume do you estimate is missing from your research as a result of the unique searches and the under-10-searches-per-month keywords being hidden from you?

Paul
There's no way of telling what percentage, because there's no way of knowing which keywords (that you may rank for), or how many of them, average under 10 monthly searches.
To get a percentage you would need the WHOLE figure, including figures for the keywords averaging under 10, and those figures aren't available, so that percentage is impossible to calculate.

Dolman
Not necessarily.
When completing the keyword research for a topic recently, I was able to find 636 unique keywords related to the topic (excluding geo-modified terms) using a number of different keyword research tools (UberSuggest, Ahrefs, KeywordKeg, Term Explorer) and processes (competitor keyword research, Google suggest, etc.).
When collecting the keyword research data (search volume, Cost-per-click (CPC), etc.), KeywordKeg only showed a search volume of at least 10 searches per month for 387 keywords.
If the 249 keywords without search data are searched an average of 5 times per month, that would mean the data is missing about 1,250 monthly searches.
In this case, the total search volume for the 387 keywords is 515,390 searches per month, so the missing 1,250 searches don't represent a large portion (less than 0.5%) of the total searches.
But… when you look at the pattern in the keywords, you find lots of missing keywords…
In this case, I was able to create 2,490 different keyword formulas which represent the different ways people search when searching for these keywords.
In some cases, the keyword formulas have 2 variables and in others they have 6 variables. In some cases, the column with the variable has 1 variation and in others there are 46 different variations.
After running the script to insert all of the variables into the formulas and create the search query output, the total number of keywords found is in the tens of thousands.
Assuming those 20,000+ missing keywords are searched an average of 5 times per month, you're now talking about the missing 100,000 searches representing a much larger portion (20%) of the total search demand for these keywords.
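To make that expansion step concrete, here's a minimal sketch in Python. The formula and variable lists below are hypothetical stand-ins, not Dolman's actual data; a real run would use the 2,490 formulas, each with 1-46 variations per variable:
```python
from itertools import product

# Sketch of the formula-expansion step described above. The formula and
# the variable lists are hypothetical examples for illustration only.
formula = "{modifier} {product} for {audience}"
variables = {
    "modifier": ["best", "cheap", "top rated"],
    "product":  ["educational toys", "learning games"],
    "audience": ["toddlers", "2 year olds", "preschoolers"],
}

slots = list(variables)  # dict insertion order is preserved in Python 3.7+
queries = [
    formula.format(**dict(zip(slots, combo)))
    for combo in product(*(variables[s] for s in slots))
]

print(len(queries))   # 18 queries from a single 3-variable formula
print(queries[0])     # best educational toys for toddlers

# Back-of-the-envelope estimate used above: ~20,000 unreported keywords
# at ~5 searches/month each is ~100,000 monthly searches the tools miss.
print(20_000 * 5)     # 100000
```
Run across thousands of formulas, the Cartesian product is what pushes the output into the tens of thousands of queries.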
That's a large portion of the data being hidden from webmasters by Google, which doesn't report it in its keyword research and forecasting tools (Google Keyword Planner (GKP), GKP Forecaster, Google Search Console (GSC)) nor in its analytics tool (Google Analytics).
It's not like Google isn't aware of these searches. They're more than willing to allow their Google Ads advertisers to bid on these keywords when someone searches them… so why hide the information?
Let the webmaster decide if it's worth their time, effort, and energy to target a keyword that's searched 60 times per year or less.
Paul
We have no way of getting exact figures for ALL search queries averaging under 10 a month.
Dolman » Paul
Correct… but as we've already been discussing in this thread, the figures Google does report aren't exact either, so why hide only the ones with fewer than 10 searches per month?
Paul
If it's fewer than 10 a month, maybe it's just not worth the resources it would take to report all those stats.
Paul » Paul
You may be right; that (economizing machine resources) might explain why Google decided some time back to move to volume buckets instead of exact numbers and, in the process, now shows no volumes for <10 average searches per month.
This does not mean these long tail search terms aren't worth pursuing. Let's first remember that these are monthly averages; if you are competing in a seasonal niche, there could be terms that log ~100 searches in the month that matters and then drop off. Also, as Dolman and others note, Ggl has indicated (and anyone who tracks their traffic knows) that there are indeed more searches for most terms than what is reported. Even the tool providers that incorporate clickstream data seem to underestimate reality (in some niches at least). If nothing else, these terms' very appearance in keyword/tool queries (via 'Related' and 'People Also Searched For') is Ggl telling us that there IS search volume.
What I don't know, Dolman, is whether your hypothesis holds up when you create new mash-up keywords with a merge tool or whatever (eg prepending "best", "cheapest", etc). In theory, yes, there could be even more long tail volume in aggregate, but do you think Ggl is not just under-reporting volumes but also excluding entire tranches of keywords (and that your exercise would backfill these lost keywords)?
Dolman » Paul
Not sure if Google would truncate an entire cluster of keywords (i.e. the group of keywords that start with "best" or end in "reviews"), but they're definitely under-reporting the total search demand.
While Google might provide data (inaccurate or not) for 100 keywords related to a topic, there are hundreds, even thousands, of long tail keywords (less than 10 searches per month) that go completely unreported by any of these tools (either research or web analytics)… the only question is: How much of the search demand is Google not reporting?
Would it just be the same 15% of searches they say are unique, or, as I am asking here… is it closer to 30-50% of the total search demand?
Since Google can't be expected to predict the future and provide data on keywords that haven't been searched yet, I don't think any webmaster would expect them to provide data on the 15% of searches that are completely unique. But that's not the case when we're talking about the 15-35% of search demand they are aware of but choose not to report.
Paul » Dolman
I think you are scouting down a really interesting rabbit hole. Maybe it's worth clarifying some stuff here (we may not know these answers but share thoughts plz)…
– Doesn't "unique search" mean that one human being (or bot) searched just once (or statistically equal to once) for that term?
[this may be defined somewhere…idk]
– Wouldn't that term STILL show up in 1) Auto-complete, 2) Related Search or 3) People Also Searched For?
[we don't know when/if Ggl 'drops' words from the db but keeping them 'forever' wouldn't cost much in machine cycles]
– What makes you think there are actual search terms that Ggl excludes from 1/2/3 above and/or their APIs?
[all the standard keyword tools (excluding LSI Graph-type tools) pull all Ggl term results (tho not all volume) from the same source]
– Sounds like you think that Ggl excludes some terms from their API *because* they declare them low/no volume, yes?
[consensus agrees volumes are under reported but I have not heard of actual search terms going missing]
To your main Q, there's likely no way to know (15% vs 50%). You might want to read up on "clickstream" (3rd party) data, which some tools use to supplement search volume data.
I love the long tail. There's definitely gold in them thar hills. But I've contented myself with the logical conclusion that there IS more volume than reported. How much? Dunno…

Dolman » Paul
A search term is only unique the first time it is searched. While it might not be searched often afterwards, any subsequent searches will not be unique, since the term will already appear in the database of search queries Google tracks. In many cases, the unique search will be a long tail variant of another long tail search Google has already seen, which is why their RankBrain algorithm is reported to be so good at addressing those searches: it has thousands or tens of thousands of closely related examples to draw from.
Since these keywords were never searched before, Google may not even show auto-complete suggestions when the query is being entered (at least not toward the end of the query), or have any Related or People Also Ask examples to include in the Search Engine Result Page (SERP).
When someone searches for the 2019 Tesla or the newest version of the iPhone, or the first time someone or something is given a name or identity (a newly found asteroid), or the first time something newsworthy brings someone or something new into the spotlight (a new celebrity or product), Google might not have a frame of reference or data set to refer to and wouldn't be able to provide the additional suggestions and related searches they do for more well-known search queries.
As far as Google hiding searches is concerned, their own website analytics tool bundles multiple search queries together into a single 'not provided' result instead of reporting them individually. And if you look at the patterns among the keywords you can find using Google Keyword Planner (GKP) or any other keyword research tool, you'll find lots of gaps and missing data.
For example, if you research "educational toys" you'll find lots of search demand related to age and a list of search query variants similar to the following:
educational toy for 1 month old
educational toy for 3 month old
educational toy for 6 month old
educational toy for 12 month old
educational toy for 1 year old
educational toy for 1.5 year old
educational toy for 2 year old
educational toy for 3 year old
Even though the keyword tools might not provide any data on them, there are going to be searches for these variants too (a quick way to enumerate such gaps is sketched after this list):
educational toy for 4 month old
educational toy for 5 month old
educational toy for 7 month old
educational toy for 8 month old
educational toy for 9 month old
…
educational toy for 36 month old
educational toy for 2.5 year old
educational toy for 3.5 year old
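As referenced above, here's a minimal Python sketch of enumerating those gaps. The `reported` set is a hypothetical stand-in for what a keyword tool actually returned; the point is just to diff the full age-based formula against it:
```python
# Sketch: enumerate the full age-based keyword formula and diff it
# against the variants a keyword tool actually reported. The `reported`
# set below is a hypothetical stand-in for real tool output.
reported = {
    "educational toy for 1 month old",
    "educational toy for 3 month old",
    "educational toy for 6 month old",
    "educational toy for 12 month old",
    "educational toy for 1 year old",
    "educational toy for 1.5 year old",
    "educational toy for 2 year old",
    "educational toy for 3 year old",
}

# Every plausible value of the age variable in this formula.
ages = [f"{m} month" for m in range(1, 37)] \
     + [f"{y} year" for y in (1, 1.5, 2, 2.5, 3, 3.5)]

full_pattern = {f"educational toy for {age} old" for age in ages}

missing = sorted(full_pattern - reported)
print(f"{len(missing)} of {len(full_pattern)} variants unreported")
```
With these toy numbers, 34 of 42 variants come back with no data at all, which is roughly the 10:1 ratio described below.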
In terms of raw numbers, the keyword variants (not search volume) that aren't being reported in GA or GKP are going to far exceed the variants they do report, at a ratio of at least 10:1 in most cases.
Since there are so many ways to say the same, or a similar, thing, the question is how the number of people searching the long tail variants (in aggregate) compares to the number of people searching the more common short tail terms.
If, on average, a niche, topic, or keyword cluster has 100 short tail keywords that are searched a total of 100,000 times and 1,000 long tail keywords that are searched a total of 50,000 times, then a full third of the total search demand (50,000 out of 150,000 searches) is data Google is hiding from webmasters.
Paul » Dolman
This all seems logical and thanks for the detailed reply.
Re unique terms, the cases of brand new or created product names seem like good examples, and it makes sense that they would not appear in auto-complete, etc. Statistically speaking, the list of unique combinations (of otherwise common words) will shrink over time, but point taken and it's interesting to consider potential 'first mover' strategies around these.
Wrt the "hiding" phenomenon, thanks also for clarifying. Yes, that Google bundles similar terms in GA and Google Keyword Planner (GKP) is widely acknowledged and frustrating. It's not clear whether this is also for conservation of machine cycles or some more insidious reason, but it obviously happens.
>Question: do we know for a fact that the API the 3rd party keyword tools use exactly reflects this same 'bundling'? Haven't you seen cases where GA 'groups' certain terms but the keyword tools show more detail (keyword results) for that same cluster?
[I have not done this analysis so I don't know the answer.]
Regardless, in short I entirely agree that the potential for capturing additional aggregate volume from both these hidden 'micro-tail' terms and from 'synthetic keywords' (I just made those up… maybe they'll catch on!) is something worth serious consideration. When you include the strong consensus view that Ggl also uses natural language analysis when assessing on-page content (so, over time, exact match keywords become less impactful), it's interesting to consider how the algos will interpret such novel and unique on-page terms and word combinations.
Given that one could go crazy mashing up every possible combo only to end up with many nonsensical results, perhaps it's best to use the keyword tools to look beyond Ggl. I have gotten good results from Amazon queries, but the other Search Engines (SEs) often yield some new combinations. It's also interesting to try the non-English-dominant SEs (Yandex, etc) for English terms, as ESL searchers approach queries differently.
>Another Question: Have you ever tried taking a positive result (previously searched) from another search engine that did NOT appear in GA or GKP (apparently bundled) and then explicitly looking for volume in GKP (w/campaign activated, of course)?
Fascinating topic!
Dolman » Paul
Not specifically re: positive result from other search engine.
With the keyword research process I use, I'll research multiple sources to find the different keyword patterns people use when searching and use them to create different keyword combinations using suffixes, prefixes, variants, etc.
During that process, I uncover not only the searches Google has already seen, but also the searches they're likely to see, based on the patterns of other searches (exactly what RankBrain does).
After creating the different keyword combinations, inputting the different variables into them, and creating the unique search queries, I can input them into Google Keyword Planner (GKP) or whatever other tool to get search volume estimates, but the vast majority of the keywords will come back with a 0-10 or 0 search volume, and I won't know which ones Google has seen before and which ones they haven't.
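As a rough illustration of that bucketing problem, here's a Python sketch. The volume numbers are made up for illustration (a real GKP export is a CSV with far more rows and columns); the point is separating keywords with usable volume data from the ones that come back as 0 or 0-10:
```python
# Sketch: bucket keyword-tool output by reported volume. These numbers
# are hypothetical stand-ins for a real GKP/keyword-tool export.
volumes = {
    "educational toys": 12_000,
    "best educational toys for toddlers": 90,
    "educational toy for 7 month old": 0,   # comes back as 0 or "0-10"
    "educational toy for 2.5 year old": 0,
}

reported_total = sum(v for v in volumes.values() if v >= 10)
hidden = [k for k, v in volumes.items() if v < 10]

# Same assumption as earlier in the thread: ~5 searches/month for each
# keyword the tools report as 0 or 0-10.
est_hidden = len(hidden) * 5
total = reported_total + est_hidden
print(f"reported: {reported_total}, estimated hidden: {est_hidden} "
      f"({est_hidden / total:.2%} of the estimated total)")
```
Scale the hidden bucket up to the tens of thousands of expanded keywords and the hidden share stops being a rounding error.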
In many cases, the keyword combinations are long tail variations of searches Google is already aware of, and/or they follow patterns Google is already answering, so these keywords can be bundled into the same keyword strategy, content strategy, and Search Engine Optimization (SEO) strategy. But there isn't a good way to predict the impact of ranking for those searches until the page is optimized for the topic cluster and actually shows up when someone searches any of those keywords in Google.
It would be nice if Google were more concerned with the accuracy and completeness of the information they provide to webmasters, so webmasters could make better decisions when helping Google provide Search Engine Result Pages (SERPs) for the search queries they're asked to address, especially the ones which don't occur as often as the short tail keywords they're more familiar with.
Why make someone go through the process of researching keywords and building a website to address those search queries with incomplete information, only to have to figure out a way to fill in the gaps once they're generating website visitors (even if they could, since many keywords in GA show as 'not provided')?
Maybe it's because Google Ads advertisers are more than happy to pay for those long tail clicks. And if Google happens to do a crappy job with a Search Engine Result Page (SERP) for a search query they've never seen before, or don't see very often and don't have enough data to figure out the best SERP to display, the people searching those keywords will be more likely to click on the ads instead of the organic results?
Paul » Dolman
Your last note cuts close to the bone, I think. I doubt they *want* bad organic results, however they are definitely pushing advertisers hard (as is FB) to use the AI-driven (vs keyword-based) ad campaign systems. Cutting down on the supply of hard keyword data is one way to compel marketers and advertisers to give the new, shiny toy a try. But this same trend (AI and machine learning) may also bring a benefit to us, the hard-working, organic traffic miners.
As we know, pages that rank for certain targeted keywords also rank for many more related keywords that may not even appear on the site. It's generally acknowledged that this is Ggl getting ever-better at not just matching searches to the words on the page and simple synonyms, but also 'learning' which terms are semantically related and then ascertaining optimal patterns, weighting, etc by tracking the on-site behavior of visitors. In short, I would say that if your synthetic keywords (never searched) still pertain to the topic and, in fact, positively contribute to the overall usefulness then there may be value beyond the explicit search volume (that they may or may not bring), in the form of incremental context added to the page (and increased relevance, as determined by the algo's 'brain').
Dolman » Paul
=================================================
In short, I would say that if your synthetic keywords (never searched) still pertain to the topic and, in fact, positively contribute to the overall usefulness then there may be value beyond the explicit search volume (that they may or may not bring), in the form of incremental context added to the page (and increased relevance, as determined by the algo's 'brain').
=================================================
BINGO!
Take this sample set of keywords and text:
bouncy castle
jumpy castle
bounce house
jump house
bounce house for birthday party
inflatable water slide for wedding
kids bounce house
toddler inflatable water slide
inflatable rental
bounce house rental with generator
"These days kids seem to have more energy then ever and when you get 10 or more of them together for a birthday party, wedding, or whatever other occasion, you want a place for even toddlers to jump and bounce that lets them have a fun time without having to worry about their safety.
Since we know you want your event to be fun and safe, all of our bouncy houses, jumpy castles, and inflatable water slides are inspected before setup and take down, rated for safety by ___ and 100% guaranteed to will keep everyone entertained for hours, even if you need to add a generator to make the fun last.
Our staff is trained to properly inspect and setup your inflatable rental for your event and help keep your guests safe while they bounce, jump, slide, and scream with delight."
=================================================
Written for bots, humans, and conversions…
In reality, these keywords wouldn't be grouped together and used to rank a single URL, since this content would be better suited for a landing page or home page that covers these topics broadly and links into the pages that do target the topically relevant keyword clusters. But the point is the same from the standpoint of using keyword research to develop a keyword strategy which supports your content strategy, which then supports your SEO strategy (which keywords to optimize, and where to optimize them for on-page and off-page SEO), and Google doesn't make it any easier when 15-40% of their searches aren't being reported.
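To make "written for bots, humans, and conversions" slightly more concrete, here's a crude coverage-scan sketch in Python: simple word matching over an abbreviated version of the copy above, nothing like Google's actual language analysis, just a rough editorial aid for checking which keyword vocabulary the copy touches:
```python
import re

# Sketch: check how much of each target keyword's vocabulary the copy
# covers. Simple word matching only; treat it as a rough editorial aid.
keywords = [
    "bouncy castle", "jumpy castle", "bounce house", "jump house",
    "bounce house for birthday party", "inflatable water slide for wedding",
    "kids bounce house", "toddler inflatable water slide",
    "inflatable rental", "bounce house rental with generator",
]

# Abbreviated version of the copy above.
copy = ("kids ... for a birthday party, wedding ... toddlers to jump and "
        "bounce ... bouncy houses, jumpy castles, and inflatable water "
        "slides ... add a generator ... setup your inflatable rental ... "
        "bounce, jump, slide, and scream")

words = set(re.findall(r"[a-z]+", copy.lower()))

for kw in keywords:
    kw_words = re.findall(r"[a-z]+", kw.lower())
    covered = sum(w in words for w in kw_words)
    print(f"{covered}/{len(kw_words)} words covered: {kw}")
```
The imperfect matches (e.g. "castle" vs. "castles") are part of the point: the copy weaves in variants naturally instead of stuffing exact-match keywords.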


James
I believe a bigger problem is that the search volumes you are getting are wildly inaccurate with regard to genuine searches.
You will see that a search term gets 1,000 searches per month, but in reality probably half of those are SEO users trying to rank for the search term, plus all of the bots from search-tracking tools that check those search terms multiple times per day.

Dolman
Regardless of the number of bots and disingenuous searches, the search volumes reported differ even among Google's own tools – Google Keyword Planner (GKP) says 1,000 searches, GKP Forecaster predicts 1,500, and Google Search Console (GSC) shows 2,000 search impressions – let alone any non-Google source for keyword research.
While I completely agree that all of the above will skew the real data, even if those numbers were all off by 25% (i.e. there are really 1,250 searches instead of the 1,000 being reported), I still don't see how that misreported data amounts to more lost data than the hundreds and thousands of once-a-month searches and never-before-seen unique searches Google is hiding from webmasters.
James » Dolman
I have an affiliate site that sells chairs from Amazon.
The website is top three for quite a few search terms that get a few thousand searches per month.
The search term that brings me the most traffic and makes me the most money allegedly gets 20 searches per month.
That about sums up the whole issue
Dolman » James
EXACTLY. Google knows how many times every keyword has ever been searched. Google has a list of every keyword that's ever been searched using its search engine. Google can probably even report the number of searches by geographical location… but they don't.
What is the point of hiding that information from webmasters, especially when it's their own data (GA)?
What is the point of providing confusing and inaccurate search volume from your own tools?
What happened to "Don't be evil?"
Clint
They removed that from their motto

GSmith
About 10-20 percent of searches online don't even find their answer! So I believe you can say Google does not report all the searches!

Dolman » GSmith
Completely agree and I think the percentage of hidden search demand might be double that.
