Part of the series: WordPress Pre-Launch Technical Checks
One of the most frustrating situations after launching a WordPress site is realizing that search engines are still being told not to index it.
The site is live, everything looks correct, the client has approved it… and yet nothing appears in search results. When that happens, it’s usually not a complex issue. It’s something small that slipped through.
In many cases, the cause is a leftover meta robots configuration from the staging phase.
What the meta robots tag actually does
The meta robots tag lives in the HTML head and tells search engines how they should treat a page.
A typical staging configuration looks like this:
<meta name="robots" content="noindex, nofollow">
This prevents indexing and stops search engines from following links. During development, that’s usually intentional.
The issue is when that same instruction remains active after the site goes live.
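The kind of leftover directive described above can be detected programmatically by scanning a page's HTML for a robots meta tag. Here is a minimal sketch using only Python's standard library; the class and function names are my own, not anything from WordPress:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())


def has_noindex(html: str) -> bool:
    """Return True if any robots meta tag in the HTML contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.directives)
```

You would feed `has_noindex` the HTML of a live page (fetched however you prefer); a `True` result on a production URL is exactly the leftover staging configuration this article is about.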
Why this happens so often
Most WordPress projects go through at least one staging environment. Indexing is blocked on purpose while the site is unfinished.
The problem is that when everything moves to production, not all those settings get reviewed again.
Everything works, so it’s easy to assume everything is ready. Meanwhile, the site is quietly telling search engines to stay away.
What tends to go wrong
WordPress indexing setting still active
WordPress includes a built-in option under Settings → Reading, "Discourage search engines from indexing this site". If it remains enabled after launch, it applies a noindex directive across the entire site.
SEO plugins keeping staging configuration
Some plugins allow indexing to be blocked during development. After migrations or domain changes, those settings can stay active without being obvious.
Template-level directives
In some setups, robots tags are controlled at template level. If those conditions are not updated, certain pages may remain non-indexable.
Mixed or conflicting signals
Sometimes it’s not a single issue, but a combination. For example, WordPress allows indexing, but the page outputs noindex, or the sitemap includes URLs that are blocked elsewhere.
These situations can be surprisingly hard to spot at a glance.
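One way to make those mixed signals visible is to cross-check the sitemap against each page's robots directive. The sketch below assumes you have already collected the directives (from the meta tag or an X-Robots-Tag header) into a dict; the function name and the example.com URLs are illustrative placeholders:

```python
def find_conflicts(sitemap_urls, robots_by_url):
    """Return sitemap URLs whose robots directive contains 'noindex'.

    sitemap_urls: iterable of URLs listed in the sitemap.
    robots_by_url: dict mapping each URL to its robots directive string,
    e.g. {"https://example.com/": "noindex, nofollow"}.
    """
    conflicts = []
    for url in sitemap_urls:
        directive = robots_by_url.get(url, "").lower()
        if "noindex" in directive:
            conflicts.append(url)
    return conflicts
```

A URL that appears in the sitemap (an invitation to crawl) while carrying noindex (a refusal to index) is precisely the kind of conflicting signal that is hard to spot by eye.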
A quick check before launch
At this point, a simple review is usually enough. View the source of the homepage and a few key pages, look for the meta robots tag in the head, and confirm that indexing is allowed where it should be.
It only takes a couple of minutes, but it avoids that moment of confusion after launch.
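One caveat for that review: noindex does not only travel in the HTML head. It can also arrive as an X-Robots-Tag HTTP response header, which a view-source check will miss entirely. A minimal sketch of a header check, assuming you have the response headers as a plain dict (names here are illustrative):

```python
def header_blocks_indexing(headers: dict) -> bool:
    """Return True if an X-Robots-Tag response header contains 'noindex'.

    Header names are matched case-insensitively, since servers vary
    in how they capitalize them.
    """
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    return False
```

Checking both the meta tag and the response header covers the two places a leftover staging directive usually hides.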
Why this belongs in a repeatable process
Meta robots issues don’t break anything visually. The site works, pages load, everything seems fine.
But they directly affect whether the site can be discovered. That’s why it makes sense to include this check as part of a standard launch process.
Where PreFlight fits in
PreFlight focuses on reviewing these kinds of technical details before a WordPress site is delivered or published. It’s about catching small misconfigurations before they become real problems.
If you want to run a quick check before launch, you can start here: https://preflightstandard.com/
Final thought
Launching a site while it still says “do not index me” is more common than it should be.
A quick review of meta robots settings is a small step, but it helps make sure the site is actually ready to be found.