Help needed. Google stopped indexing new pages.
Hello. We have 2 issues and hope someone can provide insight.
(1) Google stopped indexing new pages about 10 weeks ago. New posts show in the Google Search Console (GSC) Coverage report under "Discovered – currently not indexed," and a growing backlog of posts is accumulating there. Other search engines are picking up new content as expected.
(2) URL Inspection also does not seem to be working for any request. For example, we requested a crawl of an indexed URL on 8/31, and it still shows a last crawl date of 8/26 (today is 9/17). The same is true for every request.
Previously we had no issues. We tried asking John Mu for advice on Twitter, but got no response. Any thoughts on where to look, or where to get help? Thank you for any suggestions.
Maybe crawl budget issues? How large is the site?
Under 200 pages
For such a small site it would be difficult to call this a bottleneck.
Question: have you been actively monitoring your GSC notifications and fixing all errors? If yes, it could be a couple of different things.
The first is that you're requesting indexing for low-value content. The second is that your website is relatively new, and indexing can sometimes take longer, even after an initial assessment period during which indexing occurs rapidly.
If neither of the above applies, Google is not immune to errors, and reaching out to Google Search Console (GSC) support via the question-mark icon can sometimes address issues. Google will often not reply but will fix issues it can identify. If it deems your content new or low value, you have to be patient, monitor for and fix any GSC issues rapidly to gain credibility, and keep working on your content's value.
Provide your URL and I'll share more…
Thank you for the answer. The site has been up for a while, with 12k+ users last month. Google previously indexed the site rapidly, as expected, but now posts sit in "Discovered – currently not indexed." Two example URLs that perform on other search engines are here (published this week) and here (published 6+ weeks ago).
When reaching out to GSC support, is there something we should say in the submission to get the right eyeballs on the issue? Thanks.
Thanks for passing along the URLs; I enjoyed the look and feel of your site! I did a quick audit and can see a potential reason the pages are not being indexed. The website is built with Squarespace, and this is negatively impacting performance. This could be why pages are not being indexed, as performance scores are very low. If you previously had no indexing issues, Google may be putting more emphasis on performance scores and may now be limiting pages that don't reach a specified performance level to discovery only.
Google will not assist you with this. I can assist with improving performance scores but you'd have to take me on as your SEO guy!
Sometimes it takes time to index a URL. In this case, what we can do:
👉 Request indexing for the URL repeatedly
👉 Add these URLs to the sitemap
👉 Use social media crawlers to help with indexing
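For the sitemap step above, a minimal sitemap entry looks like the fragment below (the URL and date are placeholders; substitute your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- placeholder URL for a new post that should be indexed -->
    <loc>https://www.example.com/blog/new-post/</loc>
    <!-- optional: last modification date, W3C date format -->
    <lastmod>2021-09-17</lastmod>
  </url>
</urlset>
```

Make sure the sitemap itself is submitted in GSC under Sitemaps so Google knows where to find it.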
Yes, thank you; there is a current sitemap and there are social accounts.
Check the canonical tag, robots.txt, meta robots, nofollow attributes in internal links to such pages, Core Web Vitals (CWV), and server response time.
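Two of those checks (robots.txt and meta robots) can be sketched offline with the Python standard library. The robots.txt rules and HTML below are placeholders; substitute your own site's files:

```python
from urllib import robotparser
from html.parser import HTMLParser

# 1. robots.txt: is Googlebot allowed to fetch the page?
#    (hypothetical rules; in practice, fetch your live /robots.txt)
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])
print(rp.can_fetch("Googlebot", "https://example.com/blog/new-post/"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/"))          # False

# 2. meta robots: does the page carry a noindex directive?
class MetaRobots(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

# placeholder HTML; in practice, feed the page's real source
page = '<html><head><meta name="robots" content="index,follow"></head></html>'
p = MetaRobots()
p.feed(page)
print(p.noindex)  # False -> no noindex, so meta robots is not the blocker
```

If `can_fetch` returns False or `noindex` is True for the affected URLs, that would explain the missing pages; otherwise these two causes can be ruled out.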
Yes, thank you, these are fine. CWV shows "needs improvement," as do all Squarespace sites, but that should not prevent crawling of new content?
For the pages that are not getting indexed, try the things mentioned below:
👉 Internal linking (use indexed pages and point internal links at the non-indexed pages).
👉 Check mobile-friendliness (use the Mobile-Friendly Test or test the live URL in Google Search Console (GSC)).
👉 Check your robots.txt (are you blocking any resources like CSS/JS? Check that in the robots testing tool).
👉 Use social media and post your links (Twitter is good).
I have written a blog post on this Google indexing issue based on my experience. Do check it out; it may help.
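As one very rough self-check for the mobile-friendliness point: a page without a viewport meta tag will usually fail Google's Mobile-Friendly Test. A minimal sketch, using placeholder HTML in place of a real page's source:

```python
from html.parser import HTMLParser

class ViewportCheck(HTMLParser):
    """Flags whether the page declares a <meta name="viewport"> tag,
    a basic prerequisite for responsive (mobile-friendly) rendering."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "viewport":
            self.has_viewport = True

# placeholder HTML; in practice, feed the affected page's real source
page = ('<html><head>'
        '<meta name="viewport" content="width=device-width, initial-scale=1">'
        '</head><body></body></html>')
checker = ViewportCheck()
checker.feed(page)
print(checker.has_viewport)  # True -> basic responsive signal present
```

This is only a coarse proxy; the Mobile-Friendly Test and the live-URL test in GSC remain the authoritative checks.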
Yes, thank you. Internal linking is standard practice, pages are mobile-friendly, and all Squarespace sites use the same robots.txt. Many of the pages in question are already on Twitter, but we'll double-check. Thx
25 out of 189 pages are not indexed. I used IndexCheckr to get this result. Does that match with what GSC reports?
I can send you the list of non-indexed pages. If they are recently added pages this may just be a question of time.
Thanks, we are focused on the URLs accumulating under "Discovered – currently not indexed" in the GSC Coverage report. The issue started a couple of months ago; before that, URLs were typically indexed quickly. Note, the most recent URL was crawled (previously normal behavior), which is a positive sign, but there is no change to the curious backlog. Also note that 90% of our traffic is organic.
Again, Google is the only search engine having issues crawling these pages. Other than Google's documentation, there is no meaningful insight into the "Discovered – currently not indexed" designation. Google says it "wanted to crawl the URL but this was expected to overload the site." Evaluating this is outside our capabilities; how can this be verified?
Issue #2 above: has anyone else experienced GSC URL Inspection not working? It certainly seems related to issue #1, as it started at the same time. URL Inspection always functioned as expected before; now requests have no effect. Thanks for any ideas.
To check indexing, go to Google and type "site:" followed by the domain name. Mine gives "site:twaino.com". Then click "Search", and the results page will show the pages from your website that have been indexed.