Users Feel These Tools Help Lift Click-Through Rate (CTR) from SERPs

Discussion 3: Users Feel These Tools Help Lift Click-Through Rate (CTR) from SERPs
James Dooley 👑
I am running tests on every Click-Through Rate (CTR) bot and tactic there is for click-through rate.
Please can you comment what you feel is the best tool for this?
No need to mention running paid ads, Microworkers, Mturk, Serpclix or Serpempire because they've all been tested.
I want CTR bots, or to hear from people who own click-through manipulation software.
Get tagging people or share links to tools please!!
I will post the roundup once all have been tested with the results.
[filtered from 57 💬🗨]


I haven't used this one, but a friend of mine has and he loves it.
Traffic Bots – Quality Traffic and Ranking of Your Site in Top

James Dooley 👑✍️ » Petkov
cheers. Do you know the owner by any chance? Or does your friend?
I've also got plans to invest in some of these companies, so I want to speak with the owners where possible.
If not that's cool but be great if you knew 👍
Petkov » James Dooley 👑
I will ask my friend if they know each other.
James Dooley 👑✍️ » Petkov
perfect. Hope you're keeping well? And look forward to your response.
You still killing it with SAPE pal?
Petkov » James Dooley 👑
yep, still killing it with SAPE. ☺️
James Dooley 👑✍️ » Petkov
What's the main use people have for it? Direct to money site or as tier two?
And where do you order your service? I've used you plenty of times and sometimes get asked, so I'll share your order URL 👍
Petkov » James Dooley 👑
mainly to money sites. I would say 90% of clients use them on money sites. Thank you, and I appreciate it.


James Dooley 👑✍️ » Gary
just googled it “FOLLOWING CTR BOT” but can't find it 🤣😂🤣
Gary » James Dooley 👑
I am playing around with this one but not really dug into this much.
Free SERP checker – google ranking check |
James Dooley 👑✍️ » Gary
this is for tracking your ranking isn't it? Does it also do CTR manipulation?
Gary » James Dooley 👑 i meant 🙂 same thing lol
James Dooley 👑✍️ » Gary
haha aaahhhh ok, yes they were crap when I tested them. I used them for a while. Here was my review on them:
I moved from Serpclix to Serpempire as they were better.
But tbh I'm looking elsewhere as both have restrictions in their software.
Gary » James Dooley 👑
I am playing around on a dead site with a couple. The one someone tagged above looks class, but expensive.
James Dooley 👑✍️ » Gary
depends on the results it brings you. If it ranks your website it could be awesome value for money. Which one are you thinking looks class? The one?
Traffic Bots – Quality Traffic and Ranking of Your Site in Top
Gary » James Dooley 👑
Yes that one, looked through the Demo videos and like the look of it.

Hi Ravi, can we ask our Python guy to create one for testing purposes? Do run it via different browsers/Google TLDs/delays and rotating proxies. Share the results with James Dooley and myself too, please. This could be interesting and a lot better than Microworkers, I guess.

Do include in the script the features covered in this too:
Traffic Bots – Quality Traffic and Ranking of Your Site in Top
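Since the ask above is for a Python test script, here is a minimal sketch of just the rotation logic, assuming nothing about the real implementation: the proxy addresses, TLD list and delay window are hypothetical placeholders, and the block only builds the request parameters rather than performing the search.

```python
import itertools
import random

# Hypothetical placeholders – substitute your own rotating proxies and TLDs.
PROXIES = ["http://proxy1:8080", "http://proxy2:8080", "http://proxy3:8080"]
GOOGLE_TLDS = ["google.com", "google.co.uk", "google.de", "google.com.au"]

proxy_pool = itertools.cycle(PROXIES)  # round-robin proxy rotation

def build_search_request(keyword):
    """Assemble parameters for one simulated search: rotated proxy,
    random Google country domain, and a human-ish delay before acting."""
    tld = random.choice(GOOGLE_TLDS)
    proxy = next(proxy_pool)
    delay = random.uniform(8.0, 45.0)  # seconds to wait before searching
    url = "https://www.{}/search?q={}".format(tld, keyword.replace(" ", "+"))
    return {"url": url, "proxies": {"http": proxy, "https": proxy}, "delay": delay}
```

A real harness would hand these parameters to different browsers, per Rohin's requirement, rather than fetching directly.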
James Dooley 👑✍️ » Rohin
hey matey. I've tested this but google are on the ball with click / mouse movement. Be awesome to bang heads and see how it can be done though.
Ravi » James Dooley » Rohin
I am already testing a bot that I got made to manipulate CTR + brand keywords for Google My Business (GMB). This tool does a new sequence every time it runs… sometimes it will open the reviews, sometimes press the call now button, occasionally go to the website, and a few other things… the time it keeps the browser open is also different every time… but it's still in development. Will keep you posted, James, if I see any value coming out of this test.
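Ravi's bot isn't public, so the action names, weights and dwell range below are purely illustrative, but the "new sequence every run" behaviour he describes can be sketched as a weighted random plan:

```python
import random

# Illustrative GMB actions and weights – not Ravi's actual values.
ACTIONS = {
    "open_reviews": 0.30,
    "press_call_now": 0.15,
    "visit_website": 0.35,
    "view_photos": 0.20,
}

def plan_session(max_actions=3):
    """Return a different action sequence and dwell time on every run."""
    n = random.randint(1, max_actions)
    names = list(ACTIONS)
    weights = [ACTIONS[name] for name in names]
    sequence = random.choices(names, weights=weights, k=n)
    dwell_seconds = random.uniform(45, 300)  # keep the browser open a varying time
    return sequence, dwell_seconds
```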

Роман Морозов (Userator)

but I have never seen any positive results, though
James Dooley 👑✍️ » Olesia
ok do you know anyone from USERATOR?
And what tests did you run on it?
Just search, click your site and engage?
Was there any method you followed like the percentages used depending on your position?
Did these searches log in your search console?
Did you see reduced bounce rate and increased time on site?
And was USERATOR definitely using Google search, or were they using the built-in browser URL bar that defaults to Google search (as this 100% does not work as well)?
Love your feedback on this 👍
James Dooley 👑✍️ » Olesia
here is why I don't like most CTR bots –
It can't be in the built in browser search for this to work (even though this still shows in search console).
Olesia » James Dooley 👑
they've started offering Google not so long ago – I have not tested that yet. They invented that pickup-SEO method where guys approach girls and ask them to search for and click a particular search result from their phones 🙂
Meaning, I have not seen positive results from other people, and I didn't test on my own properties for that reason.
James Dooley 👑✍️ » Olesia
okay I'll test them out. Thanks 👍

I don't think bots will do the job. You first need a browser with all footprints in check (Canvas, WebRTC etc.) and then clean residential proxies. I now have 200 accounts I manage with regular, normal usage. If you want, we could test it out for CTR.

A fair proportion of Chrome browser users logged into established Google accounts is probably another check they do to screen out bot manipulation.
Yep, when you are logged in, Google gives you much more credibility.
Luke » Pablo
true true, wouldn't it be nice if someone could create a decentralised system for distributed computing on peoples genuine footprints, controlled by interaction through the hardware or driver level to disguise any and all automation with an open and transparent operating procedure ensuring the end user was in control of how their digital footprint was used 😛
James Dooley 👑✍️ » Luke
can you do this? And do you have any tools for CTR? You'd be the man to create one 💪
James Dooley 👑✍️ » Pablo
Is there any service that provides what I'm looking for without me needing to get the proxies etc.?

Tony – Abbas Ravji
Kelly At Creationshop
is building one right now.
Boost Click Through Rate (CTR), Increase Time on Site & Reduce Bounce Rate – Google Rank Brain

James Dooley 👑✍️ » Tony
Have you used it, and what did you think of it? Craig Campbell 👑 used it, but I think he's looking for other CTR bots as it's not good enough.
Boost Click Through Rate (CTR), Increase Time on Site & Reduce Bounce Rate – Google Rank Brain
Tony » James Dooley 👑
I agree with Craig. Kelly is working on something better

FB Ads in the zip code has been key for ranking lifts – In terms of micro workers, just take out a Craigslist ad or fb ad in the area you want to focus in and pay them accordingly… CPC for a lawyer is $35, I am getting a "CPC" for $1.25 w/ dwell rate of over 6 minutes and the client is paying a monthly retainer over $2k…

Can you expand on all that? As far as I understand you drive paid traffic; where do you send the user, and how do you "make" them click the specific result from the SERP?

Matt Diggity 👑
Hit me up. I have some non-public tactics

James Dooley 👑✍️ » Matt Diggity 👑
messaged you pal
Eddie » Matt Diggity 👑
I would like to know more about this. Thanks

I've used Panda Bot with some good results.

It worked for some time, now it does not.
Ryan » Pablo
I checked recently and they had very few browsers online
Pablo » Ryan
Yeah, and he stole all my credits, 1 year of premium. That software is useless now.
Ryan » Pablo
Same. Lost a lot of credits with them with barely any notice.

I've seen a few decent CTR bot programs on CodeCanyon that look good compared to the web-based public ones.

This may satisfy you: a Share – How I Created SEO Content or High Quality (HQ) Content

Discussion 2: Is Click Through Rate (CTR) a ranking factor in SEO?
Is Click Through Rate (CTR) a ranking factor in 2020?
13 👍🏽 1 💟 14
[filtered from 55 💬🗨]


It doesn't make sense in 2020 just as it didn't before.
Content is the one competing on Search Engine Result Pages (SERPs), not someone's (in)ability to craft good, clickable metas. So no, I'd be very surprised if it would ever be a ranking factor for algorithms.
Of course, good SERP appearance opens up more chance for Engagements to demonstrate their role in rankings.
Low CTR at visible positions happens either if the SERP features of the result are not written well, the nearby competing results are much more appealing, or there is low trust in the domain for some reason.
Optimizing CTR is fine-tuning of Search Engine Optimization (SEO), because it happens AFTER ranking factors pushed the result to be visible for anybody to click it.
Never was for general Web search.
They use it for personalization and Pay-Per-Click (PPC).

You mean organic personalization?
Martinez » Adrian
I mean they *MAY* change the search for you personally as you click on stuff and change queries. They call that "personalization".
But if someone else comes along and runs the same queries they see something different, and their results may change in a different way – based on what they click on or don't click on.
Personalization is sometimes called "contextual search". It's also influenced by device, location, time of day, language, and other signals.
Adrian » Martinez
So in a way it does affect rankings :)) Don't you think that stats from multiple 'personalized' results across the web over time affect the general results as well? In an incognito tab, let's say?
Martinez » Adrian
If by "rankings" you mean what everyone is likely to see, no, Click Through Rate (CTR) doesn't affect them.
If by "rankings" you mean what 1 out of 3 billion people see then, yes, CTR *CAN* affect rankings (temporarily and it's not guaranteed to happen).
If you really feel like that is significant, go with it.
Adrian » Martinez
I still don't understand why Google wouldn't use this very relevant statistic.
You say 1 out of 3 billion but it's not quite like that. It's more likely 1 out of whatever number of searches a particular keyword has.
And if a keyword has 100 searches and 99 users keep clicking the second result… that definitely says something about the result.
I've seen this bot called Pandabot roaming around, with people using it because it works. It's mostly related to CTR. The bot uses localized IPs to click your site, and then the overall rankings get a boost.
Martinez » Adrian
"I still don't understand why Google wouldn't use this very relevant statistic."
Because every Web marketer in the universe believes it should be manipulated.
Adrian » Martinez
So if everyone does it anyway… why not use it anyway? xD It's the same as if nobody used it.
Martinez » Adrian
No, it's not the same. They're not interested in rewarding spammy marketing techniques. They want to deliver the best possible search results (by their measures, not yours or mine) to their searchers. They may suck at it but using Click Through Rate (CTR) would make their search results even worse.
I knew someone whose company ran a clickbot network in the 1990s. They easily manipulated DirectHit's "organic" search results for any client who paid them enough. For 1-2 years DirectHit was used by many search engines because they foolishly believed it was a good ranking methodology.
They found out the hard way that CTR is just not a reliable ranking signal.
Adrian » Martinez
But wasn't there that slideshow from a Google official or some engineer proving that Click Through Rate (CTR) is a metric? There was another one saying "we switch the results" at some point, if I recall correctly.
Sure they won't officially state it's a ranking factor but it does affect search results, that's my take on it. Today everything's personalized. So if it affects personalized results… it affects everything.
Back in the 1990s the algo wasn't so advanced… blended into today's complex algo, CTR might only very slightly impact results, and it could be verified whether it's fake or not.
You are clutching at straws. Google conducts experiments using click data. That is not the same as a ranking signal.

Martin Splitt said: it is not a ranking factor but we use it in our A/B tests. Google rolls 1000s of updates regularly and this affects only parts of the search queries at first. They check user metrics like CTR for those AB tests to verify the quality of those updates. But Martin explicitly accented that it is not a ranking factor you can optimize for.
But hey, that is just Google's statement 🙂 We all know that if they "admitted" to it being a factor, SEOs would be sending bots all over the place.

They already do, there are a lot of tools that do this, and they work wonders it seems


I haven't tested in 2020. I did tests back in 2016 and before, and at that time I found that it most probably did have an effect. But I did not only test with the exact match itself; I also used variations and keyword plus brand name. I believe (I have no evidence) that it's CTR plus a low "jump back to Google and search more" rate that is the key. There also shouldn't be too many clicks on any given Search Engine Result Page (SERP) on any given day. Consider the approximate number of searches per month, and adjust it down a bit. The page also ought to rank on page 2 or higher. Higher CTR seemed to increase the ranking somewhat. But I haven't ever done a fully isolated test, as I haven't had the chance or prioritized it.
Regarding what Google says, I don't trust them when they talk about ranking factors. 😉
My theory would suggest .. maybe.
For Click Through Rate (CTR) to be a factor, there would have to be statistical significance in the data captured.
So the term would have to have enough search volume to build some kind of significance.
That's further complicated by – if you're in position 30 – you'll get little traffic unless it's a term with huge volume.
So Imo you can discount all terms which have low volume or those which you rank poorly for. There's not enough data.
How about terms with large volumes which you rank well for? They're certainly possible.
But again, mathematically, all you can test for is whether it's a higher or lower ctr for other sites at your position. It's going to take a long time to build that data up to significance.
So … to me the answer is still "maybe".
For user data, if I were Google, I'd be more interested in the "quick bounces" from users – those who go onto your site and bounce off almost immediately. That data could be used much more reliably, imo.
That would be a clear sign that it's not related to the term.
I imagine ctr is somewhere in the machine learning system though as I think Google will try to predict the ctr you'll receive, and that's useful data to use in order to rank sites.
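The significance argument above can be made concrete with an exact binomial tail test; the 10% baseline CTR and the 22-clicks-from-120-impressions figures are invented for illustration:

```python
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# If the baseline CTR at your position were 10%, how surprising would
# 22 clicks from 120 impressions be? A small p-value means the observed
# CTR is unlikely to be baseline noise.
p_value = binom_sf(22, 120, 0.10)
```

At low search volumes the tail probability stays large, which is exactly the commenter's point: low-volume terms can't accumulate enough data to reach significance.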
Yes it's THE ranking factor.
Optimise on page and off page all you want but if Click Through Rate (CTR) and Conversion Rate Optimization (CRO) on your page are off in comparison to everyone else you won't rank.
Don't try and manipulate it with bots, Google is very good at bot detection.
Remember they make their money from Pay-Per-Click (PPC); with people trying to run bots to spend their competitors' entire ad budgets, Google knows what a fake click looks like.
Instead do the following
1. If it's an informational query I am ranking for, does my page answer the question?
If it's a transactional query, do visitors do what I want them to do when they visit the page: buy, click, shop or visit another page on my site?
2. Is my title tag optimised for CLICKS as well as Search Engine Optimization (SEO). You should be running tests to get the best optimised title tag.
"Is CTR a ranking factor in 2020?"
Again – it was NEVER a Google ranking signal for the general Web search results. It never will be.
They use it for the Pay-Per-Click (PPC).
They use it for personalization.
It's a dirty, noisy signal that a lot of people attempt to manipulate but even more importantly:
[1] Most listings never receive any clicks
[2] Searcher click patterns change over time
Because they cannot collect click data for most listings anyway it's useless as a signal.
All the SEO tests and experiments in the world will not change this.

Discussion 1: Updated: CTR Manipulation in Google and Youtube Search | Every algorithm made by humans can be bypassed or bent by other humans
"Every algorithm made by humans can be bypassed or bent by other humans."
First of all… thank you for all your likes and comments. I really appreciate your interest, and I truly believe the value comes from helping each other in communities like this one in SEO Signals Lab.
Disclaimer: I got contacted by a bunch of you, along with a few people from the group who warned me not to talk about this publicly, for a few reasons…
I used a few different pieces of software to test Click Through Rate (CTR) manipulation/crowd-search, and at the end of the day there is still room for improvement in each one of them, but they worked. Some of them had a horrible interface and buggy back-ends and were overall a pain in the ass to use, but hey… they got the work done.
I won't name which software I used because I don't endorse any of them, but I will talk about strategy and what worked for me personally.
I specialize in Youtube Search Engine Optimization (SEO) and video marketing, but I found out this can be used with both Google and YouTube for higher Search Engine Result Page (SERP) positions
(with slight differences).
I'll try to keep it as basic as possible, so more people can understand what I am talking about.
Let's start with Google, shall we?
How it all started:
So everybody who's doing Search Engine Optimization (SEO) is aware of RankBrain and all the goodies Google updates every year to create a better User Experience (UX) for the end user.
The one factor that interested me the most is User Interaction.
Now this one is pretty logical to me and before I even start talking about CTR manipulation, let me say this:
1) You need to have KILLER content with superb quality on your site you are trying to rank.
2) Do content research, find the best keywords and create content around them.
3) On-page SEO.
I can't stress enough how important this is.
Take a peek at Matt Diggity's On-page guide, or maybe there is some kind of guide here in the group.
4) The domain shouldn't be brand-spanking new. Let it age a little and start progressively with link building, so it doesn't raise an alarm in Google HQ. 😉
5) Start a link building campaign; there needs to be some traction with your site.
Start slow if the site is new, and use link building velocity to spread links through the whole month.
6) Drip-feed social signals over 30 days when doing crowdsearch.
Not a blast; drip-feed.
It looks legit through the whole month when we start sending traffic to the site.
1) I personally used it in combination for Local SEO and affiliate sites.
It worked beautifully and it still does.
The monthly searches for the keywords I tested (for now) didn't exceed 5,000 searches per month.
2) Also it can drastically reduce bounce rate if you have a problem with that.
TIP: This method won't skyrocket you from Page 10 to Page 1 on Google.
The best you can do is try to come as close to at least page 3 and then you have a realistic shot to come to the first page for your desired keyword.
1) The majority of these service providers offer big numbers:
up to 1,000,000 visitors (usually around 10k-50k) to your site for relatively cheap.
This is the first problem, because there is a manipulation threshold for every keyword.
That means if your desired keyword is searched approx. 7-9k times per month, the worst thing you could possibly do is send double or even more traffic to your site.
Google picks up all the searches around that keyword and builds data from them, but if you try to send an unusual amount of traffic to rank faster, big G will pick it up, and you don't want to create any kind of suspicious activity here and risk a penalty.
2) The other thing is you don't know the quality of traffic, because not all traffic is equal in Google's eyes.
I'll explain that below.
3) Usually people send traffic through 1 'direct' URL, which is a huge mistake and looks super-unnatural.
4) When a 'visitor' comes to the site, they usually sit dead on top of the site for some time and then bounce; no movement.
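Point 1 above (stay under the keyword's manipulation threshold) can be captured in a tiny heuristic; the 25% ceiling is my own illustrative number, not a figure from the post:

```python
def safe_monthly_visits(monthly_searches, current_ctr=0.0, max_share=0.25):
    """Cap simulated visits at a fraction of the keyword's real search
    volume, scaled down by the CTR you already earn organically."""
    return int(monthly_searches * max_share * (1.0 - current_ctr))
```

For a keyword with 8,000 monthly searches this caps the bot at 2,000 visits a month, well under the "double the volume" blast the author warns against.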
Now, I tested a lot of services from various marketplaces on dummy sites, and to be honest, there are very few that are actually beneficial to your site.
So I tried to do this by myself.
In order to execute this my way, I needed software that can "store" virtual personas (like having real people waiting for my command), so I reached out to a VA and ordered a few thousand profiles to be created in Excel, including full NAP (full name, address, phone) + Google accounts so they are logged in (around 60% of the accounts).
After that, I started examining some patents Google holds regarding User Interaction; most of the ones I found date from '99 to 2005.
A lot of people are worried about canvas fingerprints that could leave a trail that something is being manipulated in the Search Engine Result Pages (SERPs)
(like what kind of font, browser, desktop resolution etc. the user is using to browse through Google to reach the desired website).
I never had any problem with this as long as you don't send big batches of traffic to your site, but just in case you are worried (and for future Google updates):
I used iMacros (programmed pre-recorded movements) so 'visitors' come to the site through various browsers (Chrome, Mozilla, Opera…), and some browsers allow you to install add-ons that cover/mix your Canvas
(there are a bunch of them, just Google it), so you are secure and you don't have to worry about that.
The iMacros paths I used drive traffic to sites using:
1) Organic traffic – visitors search through pages on Google with your target keyword until they find your URL, and then they click on it.
2) Direct traffic – the most popular one; they type your URL in the search bar and go to your site.
3) Referral traffic – they click through a link on your social media profile, Private Blog Network (PBN) or any kind of external link pointing to your money site (extracting maximum link juice from that link if it's do-follow, because there is traffic flow through that link).
4) Pogo-sticking search – the visitor bounces through a few higher-ranked sites until they reach yours (this signals to Google that the sites ranked above yours are not what they are searching for).
* All the links used in the pathway need to be indexed in Google.
* For organic traffic, the targeted site needs to be in the first 3 pages of Google.
* Approx. 45% of traffic comes through referral, 30% through direct, and the rest through organic search.
Now of course, visitors land on the desired URL, they browse, click and do all the good pre-recorded stuff Google values, and then they bounce off after 5-7 min (dwell time), or as long as you want them to stay.
I personally make sure around 20% of logged-in Chrome users return to the desired URL within 2 weeks, sending Google strong signals about the site and climbing even higher in the Search Engine Result Pages (SERPs). (Use sticky residential proxies for this; more about it below.)
* Send visitors over a 30-day period
(always drip-feed to look normal)
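The 30-day drip-feed can be sketched as a schedule generator; the ±30% day-to-day jitter is an assumption, since the author only says to avoid blasting:

```python
import random

def drip_feed_schedule(total_visits, days=30, jitter=0.3, rng=random):
    """Spread total_visits across `days` with random day-to-day variation,
    so the traffic arrives gradually instead of in one blast."""
    weights = [rng.uniform(1.0 - jitter, 1.0 + jitter) for _ in range(days)]
    scale = total_visits / sum(weights)
    daily = [int(w * scale) for w in weights]
    # Hand the rounding remainder back out one visit at a time.
    remainder = total_visits - sum(daily)
    for i in range(remainder):
        daily[i % days] += 1
    return daily
```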
The main reason Google penalizes people doing this is that, in order for it to succeed, you need to use real residential IP addresses, which come with a price.
And yes, if using this method for Local SEO, try to get 0 Spam Residentials that are Geo-targeted from real ISP.
(Meaning, if you are in Germany, using German residential proxies, your Search Console will show 'real-as-it-gets' traffic from Frankfurt, Berlin etc., depending on where the proxies are located in Germany.)
Those visitors go through your site, scroll, interact, click through inner pages, on links and all the good stuff…and everything is recorded by big G.
It doesn't get any better than this and I hope you managed to follow me through the text…
I probably forgot to address some things here… but I want to hear from you in the comments below.
If you'd like me to write another method similar to this, Like this post.
P.P.S. If you read till the end, kudos to you!
P.P.P.S. Because of some people's wish, this post will self-destruct in a few days. 🙂
64 👍🏽 5 💟 70

