u/BangCrash
Speeding up indexing of 2000 pages?
I run a home services site, and about 6 weeks ago we rolled out 2000 highly targeted location pages covering individual suburbs and the different services within each suburb, with each service+suburb combination being an individual page.
The creation of the pages went smoothly enough, and Google has seen the updated sitemap and already indexed a lot of the pages. Each page has 600-1000 words of unique-ish content (spun text, but high quality, hand carved spun text).
The indexing went great at the start, with 400+ pages getting indexed every 3-7 days, but the rate has since tapered off to around 10-40 pages every 3-7 days. So with 700+ pages still to index, it's going to take nearly 5 months to finish.
Google Search Console (GSC) reports these pages as Discovered – currently not indexed, saying it will crawl the pages but is doing so slowly to avoid overloading my site/server.
But at 30 pages a week it's not even close to overloading the capacity of my site.
So my question is: how do I speed up the indexing of bulk pages? Is there a trick I can use, or do I just need to wait for the Google crawler to do its thing?
Edit: I'll explain the spun text bit because people are getting confused. I paid a very good copywriter to write 1000 words of exceptional content for one of my location pages. This text was tested on live pages and was ranking very, very well. I then took this copy and ran it through a professional spintax platform to generate the nested spintax of all possible versions of the word/sentence/phrase structure. After that, our editor went through the spintax version to manually check and confirm quality, grammar, and structure.
This is not a $5 post spun from Fiverr. This is multi-hundred-dollar copy.
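For anyone unfamiliar with spintax, here's a minimal sketch of how nested {a|b} groups expand. This is purely an illustration, not the platform we actually used:

```python
# Minimal sketch of nested spintax expansion (illustrative only; the
# actual generation was done on a commercial spintax platform).
import random
import re

# Matches an innermost {a|b|c} group, i.e. one with no nesting inside it.
INNER = re.compile(r"\{([^{}]*)\}")

def spin(text: str) -> str:
    """Resolve innermost {a|b|c} groups until no spintax remains."""
    while (match := INNER.search(text)):
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]
    return text

print(spin("{Fast|Reliable} {gutter cleaning|roof repair} in {Parramatta|Ryde}."))
```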
These are a few things I'd start with and monitor:
• Improve your internal linking
• Improve your website architecture (no page more than 3 clicks from the homepage; see the sketch after this list)
• Submit an updated sitemap weekly
• Monitor your log file
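For point 2, here is a minimal sketch of how you could verify click depth with a breadth-first crawl from the homepage. The start URL is a placeholder, and a real run would need rate limiting and robots.txt handling:

```python
# Minimal sketch: breadth-first crawl from the homepage to measure click
# depth. BFS guarantees each page is recorded at its minimum depth.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start = "https://example.com/"          # placeholder homepage
domain = urlparse(start).netloc
depth = {start: 0}
queue = deque([start])

while queue:
    url = queue.popleft()
    if depth[url] > 3:                  # already past the 3-click rule; stop expanding
        continue
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == domain and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

over = [u for u, d in depth.items() if d > 3]
print(f"{len(over)} pages are more than 3 clicks from the homepage")
```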
1 & 2 are done
3 – why am I submitting the exact same sitemap weekly? (The pages and structure aren't changing at all)
4 – what log file are you referring to?
Lopsi
You're looking for a "trick" that just won't help you in the end.
There is very little practical difference between having 2000 low quality pages that Google has seen and refused to index, and having 2000 low quality pages that you've somehow forced into the index.
Neither will see any traffic and neither will generate any revenue.
I'm not sure what I'd do in your shoes now that you've already launched this model, but if I were starting fresh I'd have picked a more manageable number than 2000 and seen how that went first. It would likely have been better to start with a few dozen pages targeting your highest priority/most profitable services or areas.
If you'd started with 20 pages you could have used real content, had money left over to link build to them directly, and not risked being seen as a spam site after an overnight rollout of 2000 pages of low-end stuff without the brand signals to carry it through to rankings/earnings.
The site is 5 years old. And already ranking well.
10 pages targeting short-tail, highly relevant, high-traffic keywords, all ranking well.
Around 100 hand written blog articles all with internal linking.
This strategy isn't for the major keywords. This is picking up heaps of location specific keywords that have super easy keyword difficulty and basically zero competition.
Edit: I should also add that the new pages that have been indexed are generating traffic with huge user intent (because of the super specific location-service that the user is searching for) along with generating enquiries and actual paid bookings.
So clearly the pages ain't shit if customers are booking and paying.
Edit 2: I actually did 30 hand written high value local-service pages 2 yrs ago, like you suggested, and these did exceptionally well. That was actually a catalyst for rolling out many more suburb-service pages.
The redirects from those original service-suburb pages to the new ones are working well. And the new additional suburbs and services pages are ranking and generating traffic.
ecommerceoptimizer
Hand carved spun content: what a lovely new description of shit. Your post answers your own question. Indexing is slowing because Google is choking on the shit you are trying to feed it. Expect it to slow down more and more. Your site is a "your money your life" site; you are trying to force feed it shit and it is choking.
Search for and download Google's Search Quality Rater Guidelines and focus on the sections that discuss E-A-T. These are not ranking factors, but by making sure your site has all of the suggested components that impact E-A-T, your other pages will rank better.
But this also means the hand carved spun content must be consigned to the garbage can and pages with meaningful, helpful content posted instead.
Next, Google the term topical relevance and learn about it. This is very important, because as a service provider, if you want to rank high now and in the future, you must start establishing your business as a topic expert and not just another fly-by-night company with spun text on its site.
Lol. You sound like you would rather I produce 2000 pages of identical content cut and pasted without any variation.
This strategy is for targeting super specific low difficulty keywords with almost zero competition.
Google is gonna prefer a tailored, unique page over a competitor's duplicate page that only has a unique H2.
If that's what you think my point was, you missed it 100%. Google and Bing have both recently stated that they can pick out Artificial Intelligence (AI) and spun content a mile away and give it next to no authority. Still, some people are able to rank with garbage when their garbage is the best available. Anyone wanting to rank better has very little to do to take it away from you. Say whatever you want; Google is already telling you what I and others have told you.
The sad part is that suburb specific unique content is extremely simple to do, and it would give Google some motivation to do a little more for you. Right now Google sees your site and says to itself: oh f*ck, more of this, why am I wasting bandwidth indexing this stuff when there are millions of sites with valuable content that I could be indexing instead? I'll just index a little bit less this time. Good luck.
BangCrash (OP)
Do you want to point me in the direction of suburb specific unique content at scale?
ecommerceoptimizer
Creating the content is on you, but it isn't, or shouldn't be, that difficult. Many times service pages get more traffic than the homepage because they are the homepage for that area. What makes a suburb different from its neighbors besides just the name? Landmarks, school teams, and other long tail location specific info.
So it is to your advantage to make the effort to not make it spun cookie cutter shite, and instead add location specific info and assets: a picture of a landmark optimized with the filename and alt text, info about local sporting teams, local phone numbers, or town services. Especially if the service area name matches the name on what you add.
It doesn't need to be rocket science, but trying to run it as a bulk cookie cutter job is part of your issue. Start with the best service areas and work backward. You can half-ass it and bulk produce it, which might work, but it makes you vulnerable to the person that does make the right effort.
BangCrash (OP)
Most of that is already done.
The spun text I was talking about previously is around 600 words.
On top of that I'm scraping Wikipedia for suburb specific text and adding 200-300 words of that (see the sketch below).
Images with suburb specific file names and alt text.
Google maps for the location of the suburb.
Specific suburb name and zip code.
Interlinking to nearby suburbs, and interlinking to the other services in the suburb.
Along with linking back to the major short tail keywords.
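To give an idea of the Wikipedia step, here's a minimal sketch assuming the public Wikipedia REST summary endpoint. This is an illustration, not my actual scraper, and the suburb name is a placeholder:

```python
# Minimal sketch: pull a short suburb description from the Wikipedia
# REST summary endpoint (returns JSON with an "extract" field).
import requests

def suburb_summary(suburb: str) -> str:
    """Return the lead extract from the suburb's Wikipedia article."""
    url = ("https://en.wikipedia.org/api/rest_v1/page/summary/"
           + suburb.replace(" ", "_"))
    resp = requests.get(url, headers={"User-Agent": "suburb-pages-demo/0.1"},
                        timeout=10)
    resp.raise_for_status()
    return resp.json().get("extract", "")

print(suburb_summary("Parramatta"))  # placeholder suburb
```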
ecommerceoptimizer
Hmm, interesting, because the signal Google is sending is that it's not high quality.
I saw your other comment about the keywords having low competition. Given that, I'd say it is worth looking at the big picture. Do you have a Better Business Bureau (BBB) rating, and is it good? They report to Google. Are there complaints about your company elsewhere on the web? In the same vein, are there comments elsewhere on the web praising you? Do you have positive Google My Business (GMB) reviews?
Your site is a Your Money or Your Life (YMYL) site. Are you familiar with E-A-T and the best practices around it, the different components they want on your site? If not, search for the Search Quality Rater Guidelines; I know there are links on searchenginejournal. It's a long read; focus on the Expertise, Authoritativeness, Trustworthiness (E-A-T) examples and all of the different things it suggests your site include.
How old is the site? I'm assuming you have the whole local citation thing done: Yelp, Angie's List, and all of them?
How about site structure, speed, and tech stuff? Ahrefs' free site audit is great for that.
Are you utilizing page schema on those pages to drill down clearly for Google before they even crawl the page?
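To illustrate, the JSON-LD on a service+suburb page can look roughly like this. This is only a sketch with placeholder business details, built in Python here just to show the structure:

```python
# Rough illustration of the kind of JSON-LD a service+suburb page can
# carry. The schema.org types are real; the business details below are
# placeholders, not any real site's markup.
import json

page_schema = {
    "@context": "https://schema.org",
    "@type": "Service",
    "name": "Gutter Cleaning in Parramatta",   # placeholder service+suburb
    "areaServed": {
        "@type": "Place",
        "name": "Parramatta",
        "address": {"@type": "PostalAddress", "postalCode": "2150"},
    },
    "provider": {
        "@type": "LocalBusiness",
        "name": "Example Home Services",
        "telephone": "+61 2 0000 0000",
    },
}

print(json.dumps(page_schema, indent=2))
```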
I'll look at it if you send me a link in chat, if you want. I tend to agree with you about having more and better text than competitors. That is why I'm thinking this is indicative of a different issue.
BangCrash (OP)
Not based in the USA so BBB isn't relevant.
Our Google My Business (GMB) listing has a 4.7 star rating. Not enough reviews atm, around 30; I'm planning to work on getting more next year, but 30 should be enough for now.
Site is 5 yrs old. Citations were done back in the early days, and I've not looked for a while, but I'm pretty sure all the bases are covered. Definitely all the major ones. Name, Address & Phone Number (NAP) is consistent across everything.
Site structure changed 6 weeks ago when these new location pages were set up. Specifically, the structure was changed to allow future growth into new industries and new cities/metro areas. However, redirects were a massive focus, as pages were ranking really well and I wanted that to continue under the new structure. Pretty confident all the redirects are correct.
Schema is done… I think. I'm a bit of a newbie to schema, but using Rank Math Pro, schema was set up for the whole site. Google Search Console (GSC) says the Frequently Asked Question (FAQ) schema is correct for the location pages, and from running schema tests it seems to be correct enough.
Expertise, Authoritativeness, Trustworthiness (E-A-T) is something I've ignored, tbh. I figured in my specific niche it wasn't super relevant, as it's not a cool or sexy or even talked-about niche. This may be worth a revisit, however.
Thanks for the offer, I'll PM you the URL. Very much appreciated.
BangCrash (OP)
As I've previously mentioned in another comment, the keywords I'm chasing with this strategy are almost zero-competition keywords with very low difficulty.
The fact that my pages are 1000 words of similar-but-different text should land me higher than competition that is 500 words of exactly the same text.
CustomerService
Try running a few URLs through the URL Inspection tool and request indexing.
The status you mentioned means Google knows about the URL, but hasn't indexed it. Testing the live URL in URL inspector will show you if there's any technical reason delaying indexing.
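If you have a lot of URLs to check, the Search Console URL Inspection API exposes the same data programmatically (note it's read-only, so it can't request indexing). A minimal sketch, assuming OAuth is already set up; the token, site, and page URLs below are placeholders:

```python
# Minimal sketch: batch-check index status via the Search Console
# URL Inspection API. The token and URLs are placeholders.
import requests

ACCESS_TOKEN = "ya29.placeholder"   # OAuth 2.0 token with the Search Console scope
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def coverage_state(page_url: str, site_url: str) -> str:
    """Return GSC's coverage state string for one URL."""
    resp = requests.post(
        ENDPOINT,
        json={"inspectionUrl": page_url, "siteUrl": site_url},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    result = resp.json()["inspectionResult"]["indexStatusResult"]
    return result.get("coverageState", "unknown")

for url in ["https://example.com/service/suburb-1/",
            "https://example.com/service/suburb-2/"]:
    print(url, "->", coverage_state(url, "https://example.com/"))
```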
BangCrash (OP)
I've had a look already and it's not a specific technical issue.
From the Learn more button, it basically says that Google is aware of the pages and will index them in time, but isn't doing them all now because it's concerned that doing so would overload the website.
The URL inspector says they are fine and there's nothing to stop them being indexed. Having said that, I've not manually requested indexing for them, but from all I've seen they should index no problem.
CustomerService
Did you check the live URL? That's where you'll see issues if they exist. If there are no issues (remember to check the rendering), then request indexing. Do that for a few and track the indexation status to see if it makes a difference. If it works, then you have a tedious job to do. If it doesn't, then I'm not sure what to do.
Discoverability isn't the issue; they're discovered via XML sitemaps. So is Google just being slow, or is there a hitch?
When I've done this I've discovered that most pages have been indexed and the coverage report is out of date, but requesting indexing typically clears up any laggards.
Edit: Are these pages JavaScript-heavy? Google has to visit twice and render the JS to index the page, and this is done "as resources allow".
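A quick way to sanity check this is to see whether the page copy is present in the raw HTML before any JS runs, since that's what the first indexing wave sees. A minimal sketch, with the URL and phrase as placeholders:

```python
# Minimal sketch: check whether a page's main copy appears in the raw
# HTML (pre-rendering). URL and sample phrase are placeholders.
import requests

url = "https://example.com/service/suburb-1/"
phrase = "our gutter cleaning team in Parramatta"   # a sentence from the page copy

html = requests.get(url, timeout=10).text
if phrase.lower() in html.lower():
    print("Copy is in the raw HTML; no JS rendering needed to see it.")
else:
    print("Copy not in the raw HTML; it is likely injected client-side by JS.")
```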