Solved: Blogspot Posts Not Indexing? The Critical Robots.txt Mistake (Update #6)
We have a major breakthrough in The Blogspot Experiment. If you are currently tearing your hair out because your Blogspot posts are not indexing, stop what you are doing and read this update immediately.
If you've been following along, you know I've been waiting for Google to index my new posts. I thought it was just the "Google Sandbox" period. I thought I just needed to be patient.
I was wrong.
It turns out I made a classic beginner mistake that was effectively blocking Google from properly understanding my site. If your posts are stuck at "URL is not on Google" and your Search Console data is full of "N/A" values, the problem isn't your content. It's your settings.
The Symptoms: "N/A" Everywhere
I went to Google Search Console > URL Inspection and inspected one of my recent blog posts. I expected to see the standard "Crawled - currently not indexed" status, which usually just means you need to wait.
Instead, I saw something much worse:
URL is not on Google
Referring page: None detected
Last crawl: N/A
Crawled as: N/A
Crawl allowed?: N/A
Indexing allowed?: N/A
Seeing "N/A" for Crawl allowed? is a huge red flag. It means Googlebot tried to check my instructions (the robots.txt file) and failed completely.
What is Robots.txt and Why Does It Matter?
Before we fix it, you need to understand what went wrong. The robots.txt file is the "Gatekeeper" of your website. Before Googlebot looks at a single blog post, it must look at this file first. This file tells the bot: "You are allowed to enter here" or "Stay away from there."
If this file is blocking access, broken, or blank, Googlebot either gets turned away at the door or gets no guidance at all. This is one of the most common technical reasons for Blogspot posts not indexing.
For a deep dive into how this file works, you can read Google's official introduction to robots.txt.
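To make the "Gatekeeper" idea concrete, here is a minimal sketch in Python (standard library only) of the same check a crawler performs before it fetches a page. The post URL is a made-up example; swap in one of your own.

from urllib import robotparser

# A crawler reads this file before it touches any actual page.
robots = robotparser.RobotFileParser("https://theblogspotexperiment.blogspot.com/robots.txt")
robots.read()  # download and parse the live robots.txt

# Hypothetical post URL -- replace with a real post from your blog.
post_url = "https://theblogspotexperiment.blogspot.com/2025/01/example-post.html"

# The gatekeeper question: is Googlebot allowed to crawl this URL?
print(robots.can_fetch("Googlebot", post_url))

If that answer comes back False, no amount of great content will get the page indexed.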
The Cause: The "Custom Robots.txt" Trap
I immediately went to my blog's robots.txt address to check the file: https://theblogspotexperiment.blogspot.com/robots.txt.
It was completely blank.
This is bad. A blank robots.txt file gives search engines no directives at all. It also meant my Sitemap wasn't being declared, which explains why GSC said "No referring sitemaps detected."
The Mistake: In my eagerness to set up the blog "perfectly," I followed some bad advice online. I went into Settings > Crawlers and indexing and enabled "Enable custom robots.txt"... but I hadn't configured it correctly. I essentially put up a "Keep Out" sign without realizing it.
The Solution: Reset to Default
If you are facing the issue of Blogspot posts not indexing, the solution is often to do less, not more.
I immediately went back to settings and turned OFF "Enable custom robots.txt".
As soon as I did that, Blogspot restored the default, correct file. Now, when I visit my robots.txt link, I see the perfect code:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /
Sitemap: https://theblogspotexperiment.blogspot.com/sitemap.xml
This default code does a few critical things:
Disallow: /search keeps Google out of the auto-generated search and label result pages, which aren't real content.
Allow: / tells Google "You are allowed to crawl everything else on this site."
Sitemap: explicitly tells Google where to find my sitemap, the machine-readable list of my posts.
You can verify these rules yourself with the short script below.
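Here is a rough sketch (Python standard library only) that feeds the default rules above into a parser and confirms that a post URL is allowed while the /search pages stay blocked. The post URL is hypothetical.

from urllib import robotparser

default_rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://theblogspotexperiment.blogspot.com/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(default_rules.splitlines())  # parse the rules without any network request

base = "https://theblogspotexperiment.blogspot.com"
print(parser.can_fetch("Googlebot", base + "/2025/01/example-post.html"))  # True: posts are crawlable
print(parser.can_fetch("Googlebot", base + "/search/label/seo"))           # False: search pages are off-limits
print(parser.site_maps())  # the declared sitemap location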
The Fix: How to Check Your Blogspot Settings
Do not touch the "Custom robots.txt" setting unless you are an advanced SEO expert. Blogspot's default settings are already exactly what the vast majority of blogs need.
Here is how to fix it:
Go to Blogger Dashboard > Settings.
Scroll down to the Crawlers and indexing section.
Ensure "Enable custom robots.txt" is gray (OFF).
Ensure "Enable custom robot header tags" is also OFF (unless you specifically need to Noindex a page).
What I Did Next (And What You Should Do)
Now that the door is officially "open" for Googlebot, I need to invite it back in and solve this Blogspot posts not indexing problem once and for all.
Request Indexing: I went back to Google Search Console, inspected the URL again, and clicked "REQUEST INDEXING." Now that the robots.txt is fixed, Google should be able to crawl it successfully.
Wait: It might take a few days for Google to refresh its cached copy of my robots.txt file and crawl the posts. While you wait, you can confirm your posts actually appear in the sitemap (quick check below).
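A rough sketch of that sitemap check (Python standard library only; note that Blogspot's sitemap.xml may be an index file that points to further sitemap pages, so this simply prints every <loc> entry it finds):

from urllib.request import urlopen
from xml.etree import ElementTree

SITEMAP = "https://theblogspotexperiment.blogspot.com/sitemap.xml"

with urlopen(SITEMAP) as response:
    tree = ElementTree.parse(response)

# Print every <loc> entry, whether it points at a post or a child sitemap.
namespace = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
for loc in tree.iter(namespace + "loc"):
    print(loc.text)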
The Lesson: Sometimes, "optimizing" everything leads to breaking things. In Blogspot SEO, the default settings are often your best friend.
I will report back in the next post on whether this fixed the "N/A" errors and finally got us indexed!
This post is part of a live, public case study: The Blogspot Experiment.


