Indexing issues with Google Search Console? Here's why

I’ve had a few internal conversations with fellow SEO specialists on the team about clients having challenges with page indexing through Google Search Console.

So I thought this would make a great topic to post on the forums about.

This is a relatively common issue from what I’ve experienced, so if you’re having these problems, know that you’re not alone.

Let’s address the basics - what is indexing? Simply put, it’s Google finding your website, crawling it, and adding it to its library (AKA the Google Index).

When someone searches for “homes for sale in Jacksonville,” let’s say, Google will know your site focuses on that topic and may show it in the search results.

So why might you experience indexing problems?

1. Your website is new - simply put, it may take Google a few days or even weeks to index your site.
2. Poor website structure - if your site’s structure is haphazard, crawlers may struggle to find all of its pages.
3. Robots.txt or noindex directives - robots.txt tells search engine crawlers which pages they may crawl, and a noindex tag tells them not to index a page. If you’re not sure about these, speak to a developer or your support team.
4. Slow load times or poor mobile optimization - these are key ranking factors for Google, and poor performance here can also hold back crawling and indexing.
5. Too few backlinks - a lack of backlinks (from reputable sources) can also cause indexing issues.
6. Low quality or duplicated content - If you have heaps of duplicated or thin content, Google may opt to not index it.
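To make #3 concrete: a single stray rule in robots.txt can block your whole site from being crawled. Here’s a minimal sketch of the dangerous pattern to look for (your own file will differ):

```
# robots.txt at the site root
# "Disallow: /" blocks ALL crawlers from the ENTIRE site -
# sometimes left over from a staging or development setup.
User-agent: *
Disallow: /
```

Separately, a `<meta name="robots" content="noindex">` tag in a page’s `<head>` (or an `X-Robots-Tag: noindex` HTTP header) tells Google not to index that specific page, so it’s worth checking for those too.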

From my experience, more established websites often run into #6 as the major issue when it comes to indexing.

In any case, make sure to submit your sitemap through Google Search Console to give the right pages the best possible chance of being indexed.
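For reference, a sitemap is just an XML file listing the URLs you want crawled. A minimal sketch (the domain, URL, and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/homes-for-sale/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

You can submit it under the Sitemaps report in Search Console, or point crawlers at it with a `Sitemap:` line in robots.txt.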

With all of this being said, if you found this helpful or if you have a specific question, I’d love to chat!


Hey @WesMartin - I’ve been talking with @jhnyguitar about his site: he’s having some struggles getting pages to index and rank well, and we were hoping you might have some helpful tips.

One of the main things he’s brought up is this message he’s getting for a lot of pages in his Google Search Console: “Discovered - currently not indexed.”

Another thing he’s asked me to look at is why he’s not ranking well for “Indian Canyons Neighborhood Palm Springs”. A couple of competitors are ranking well for that, but he’s sitting around position 21 and believes it should be doing much better.

I’m happy to provide additional information; I’m just not sure exactly what you’d need to dive into some of the troubles @jhnyguitar is running into.

Let us know if you have any thoughts.



Hey @JayPhee, thanks for reaching out - fantastic questions, by the way.

Without getting heavily into the details about the specific pages on @jhnyguitar 's website, I remember seeing quite a few pages in his sitemap. This could contribute to #6 on my list about duplicated content - even if it’s not verbatim, Google may skip indexing pages it deems “too similar.”

I know Google also allocates a crawl budget to every website, so you might be bumping up against that as well.

Not every page on a website needs to rank, so intentionally noindexing pages that you’re certain don’t need to rank would be a good start - just to free up some of that crawl budget.

Secondly, a few of the clients I’ve worked with had a good portion of blog content overlap, causing keyword cannibalization. This is where consolidating overlapping blog posts into guides and 301-redirecting the old posts to them might also make sense. Depending on the number of posts in question, this is a time-consuming process, but I’ve seen it work remarkably well when done correctly.
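As a hedged sketch of the redirect step (Apache syntax; the paths and domain below are made up - adjust them to the actual old posts and the consolidated guide’s URL):

```apache
# .htaccess - permanently redirect old overlapping posts
# to the consolidated guide, passing along their ranking signals
Redirect 301 /blog/old-post-a/ https://www.example.com/guides/consolidated-guide/
Redirect 301 /blog/old-post-b/ https://www.example.com/guides/consolidated-guide/
```

The 301 (permanent) status is what tells Google to transfer the old pages’ equity to the new URL rather than treating it as a temporary move.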

Because it’s a relatively well-established website, I don’t think you’d be running into any problems with #1 on my list about it being “too new.”

For content that you need to rank well, backlinks are likely going to be a factor - so running a backlink campaign, which, again, is time-intensive, can be 100% worth it.

Which brings me to ranking for “Indian Canyons Neighborhood Palm Springs.”

I’m sure you know this, but you can reverse-search in Google to see which of a site’s pages rank for a specific keyword, in order, using a query like: site:yourdomain.com “keyword”

Doing that with “Indian Canyons Neighborhood Palm Springs” shows no specific results, but “Indian Canyons Palm Springs” as the keyword brings up a handful of blogs that mention that secondary keyword. That tells me the website mentions that neighborhood but doesn’t have any ranking content specific to it.

That suggests an easy next step: create some written content specific to that neighborhood and link it back to the Indian Canyons landing page. Because those two keywords are quite similar, if rankings grow for one of them, there’s likely to be some carry-over to the other.

From my research in Ahrefs, many of the pages ranking high for that keyword don’t have many backlinks, if any, which should make it relatively easy to begin ranking better for it.

I just threw a lot at you, so feel free to let me know if you’d like me to clarify or continue any of my thoughts - thanks again, Jason.

Hi @JayPhee,

I came across this article from Search Engine Journal that reminded me of our conversation earlier this week.

Here’s the TLDR:

This article was written by Google’s John Mueller.

Mueller suggests two main reasons for this issue: server capacity and overall website quality.

Server capacity refers to a website’s ability to handle extensive crawling, especially for large sites.

Overall website quality, on the other hand, is a measure of the site’s content, layout, design, speed, and other factors.

Mueller emphasizes that improving the quality of a website is more beneficial than just reducing the number of indexable pages.

I hope this helps!