Has Google been deindexing your pages recently?
This can be a frustrating issue for website owners, but it’s usually a fixable one. Here, we’ll take you through the basics of indexing and why it’s important, plus how to diagnose and repair issues if you find your pages have been removed.
What is deindexing?
When a page is deindexed, it has been removed from the list of pages that Google serves in its search results.
The key thing to understand here is that when you search Google, you’re not searching the entirety of what’s available out in the wilds of the internet. Instead, you’re searching the list of pages that Google has included in its index.
Our article on how search engines work goes into more detail on how this happens, but in short, Google crawls your website and collects information about the type of content you offer and the search intent that content might serve. If it deems your page useful or high enough quality, it will add the page to its immense database.
And, conversely, Google may remove your page from that database. This is deindexing, and the result is similar to adding a “noindex” meta tag to the page. (Note that blocking a page in robots.txt stops Google from crawling it, but doesn’t by itself remove it from the index.)
What is the effect of a page being deindexed?
If your page has been deindexed, it will not appear in the search results. Google will no longer retrieve it from its database for relevant searches.
This will result in a loss of organic traffic to the page and may also be reflected in your keyword tools if it was ranking for key queries.
Why would Google deindex a page?
There are numerous reasons Google might choose to deindex a page. Some are intentional actions taken by website owners, while others result from algorithm updates or potential violations of Google’s guidelines.
Here are a few common reasons:
Manual Actions
If Google’s team believes a page violates their guidelines (like using manipulative tactics to boost rankings), it might manually remove the page from its index.
Noindex Tags
Web developers and SEO professionals can use a “noindex” meta tag to tell Google not to index a particular page. This could be used for pages that are in progress or those not meant for public viewing. If this tag is mistakenly applied to a page you want to be indexed, it can lead to unintentional deindexing.
Technical Issues
Server errors, incorrect robots.txt files, or issues with your website’s sitemap could lead to deindexing. These technical glitches might communicate to Google that a page shouldn’t be indexed, even if that’s not the intent.
Low-Quality Content
Google strives to deliver the best possible content to its users. If a page provides thin, duplicated, or irrelevant content, it might be flagged and subsequently deindexed.
Algorithm Updates
Google frequently updates its algorithm to improve search quality. Sometimes, these updates might affect how certain pages are indexed, leading to unexpected deindexing.
Expired Domains
If a domain expires or isn’t renewed, the pages associated with it can be deindexed.
Blocked by Web Host or CMS
Sometimes, the platform you’re using to host or manage your website might block Google from crawling your pages. This can lead to them being deindexed.
How to find page indexing issues on your website
Google Search Console
Check the Indexing section in Google Search Console to get detailed insights about which pages are indexed and which may have errors. Keep in mind that some of the items flagged here may be fine rather than actual errors, but it’s important to review each section. Sometimes recently posted blogs or other important pages are lurking under “Discovered – currently not indexed” or “Crawled – currently not indexed”, for example.
For any page listed, you can use the URL Inspection Tool for more details on its status and why the page may not have been indexed. If it should be indexed and is eligible, you can request indexing to add it to the queue.
Site: Search Operator
By entering “site:yourdomain.com” in Google’s search bar, you can view all indexed pages from your site. This is a quick way to spot key pages missing from the search results.
Robots.txt File
Review this file by appending /robots.txt to your domain name, e.g. https://www.domain.com.au/robots.txt. This file provides important directives to search engines about which pages or directories to avoid. Always double-check to ensure no essential pages or directories are inadvertently blocked.
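To make this concrete, here’s a minimal, hypothetical robots.txt; the paths are placeholders, not recommendations:

```text
# Illustrative robots.txt — all paths are placeholders
User-agent: *
Disallow: /admin/
# A broad rule like the next one could accidentally block an important section:
Disallow: /blog/

Sitemap: https://www.example.com/sitemap.xml
```

A stray Disallow line like the /blog/ one above is exactly the kind of inadvertent block worth checking for.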
Meta Robots Tags
Web pages can include robots meta tags in their HTML. It’s important to ensure that no crucial pages contain the “noindex” tag, which tells search engines not to index that particular page. This accidental noindexing sometimes happens through incorrect settings during website setup and configuration.
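If you want to spot-check pages in bulk, a short script can flag any HTML that carries a robots “noindex” meta tag. This is a minimal sketch using only Python’s standard library; the sample HTML is made up for illustration:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if (attr_map.get("name") or "").lower() == "robots":
                self.directives.append((attr_map.get("content") or "").lower())


def has_noindex(html: str) -> bool:
    """Return True if the page's robots meta tags include a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)


# Hypothetical page markup for demonstration
sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(sample))  # True
```

You could feed this function the HTML of each important URL (fetched however you like) and list any pages that unexpectedly carry the directive.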
Canonical Tags
These tags inform search engines which version of a page to treat as the primary one. Ensure they’re used appropriately so you don’t inadvertently signal that a page is duplicate content that shouldn’t be indexed.
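A canonical tag sits in the page’s head section and looks like this (example.com is a placeholder):

```html
<!-- Declares this URL as the primary version of the page -->
<link rel="canonical" href="https://www.example.com/blue-widgets/">
```

If this tag points at a different URL than the page it appears on, search engines may index that other URL instead.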
Redirects
Redirects guide visitors from one URL to another. If you change a URL, set up a redirect (usually a permanent 301 for a URL change) so that, while the old address is still in the index, users who click the link in search will still reach the page. Redirected URLs themselves are usually deindexed over time, and the new URL will be evaluated on its own content.
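For example, on an Apache server a permanent redirect can be declared in .htaccess like this (the paths are hypothetical, and the exact mechanism depends on your server or CMS):

```apache
# .htaccess — the 301 status marks the move as permanent
Redirect 301 /old-page/ https://www.example.com/new-page/
```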
XML Sitemap
Your XML sitemap acts as a roadmap for search engines, detailing which pages should be crawled and indexed. Make sure it’s both up to date and correctly formatted, and remember to submit it to search engines like Google.
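A minimal, well-formed sitemap looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-11-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/indexing-guide/</loc>
    <lastmod>2024-10-20</lastmod>
  </url>
</urlset>
```

Every URL listed should be a live, indexable page; sitemaps full of redirected or noindexed URLs send mixed signals.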
Page Load Time
A page that loads slowly may not be crawled or indexed as efficiently by search engines. Always prioritise optimising your web pages for speed.
Content Quality
High-quality, unique content is favoured by search engines. Ensure your pages offer value to avoid issues with thin or duplicated content that might not be indexed.
Tips for fixing indexing issues
The specific repairs depend on why a page has been deindexed, but here’s a quick checklist for you.
Address Google Search Console Alerts
This may seem like an obvious one, but it’s a key place to get insights into why pages are not indexed. For essential pages, request a manual reindex through the URL Inspection Tool once you’ve made any necessary technical SEO or content updates.
Optimise Your Technical Configuration
If you’re unsure where to start, contact a local SEO agency to help you diagnose the issues. They can check:
- Robots and Tags: Your SEO manager can adjust your robots.txt file to unblock essential pages and remove any unintended “noindex” tags. They can also review your canonicalisation settings to make sure there aren’t any inconsistencies.
- Sitemap and Redirects: An SEO manager can also help refresh and resubmit your XML sitemap after significant website changes, and streamline redirects so they’re direct and point to the intended, live pages.
- Server and Page Load: Your SEO manager can also coordinate with a developer to improve server response times and address server errors that could be causing indexing issues.
Boost Content Quality
Audit and ensure your website’s content is unique, relevant, and high-quality. Address thin or duplicated content by merging, improving, or removing it. You can also work with a content marketing agency to develop a long-term content strategy that provides the kind of optimised, high-quality SEO content that Google will index and serve in search.
Prioritise Mobile Optimisation
With mobile-first indexing, ensure all pages are mobile-friendly using responsive design and testing across different devices.
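At a minimum, responsive design depends on a viewport meta tag in each page’s head section:

```html
<!-- Without this tag, mobile browsers render the page at desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```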