If Google doesn’t index your website, then you’re pretty much invisible. You won’t show up for any search queries. Given that you’re here, I’m guessing this isn’t news to you. So let’s get straight down to business. This article teaches you how to fix any of these three problems:

  1. Your entire website isn’t indexed.
  2. Some of your pages are indexed, but others aren’t.
  3. Your newly published web pages aren’t getting indexed fast enough.

But first, let’s make sure we’re on the same page and fully understand this indexing malarkey.

What is crawling and indexing?

Google discovers new web pages by crawling the web, and then they add those pages to their index. They do this using a web spider called Googlebot. Confused? Let’s define a few key terms.

  • Crawling: The process of following hyperlinks on the web to discover new content.
  • Indexing: The process of storing and organizing crawled pages in a vast database called the index.
  • Web spider: A piece of software designed to carry out the crawling process at scale.
  • Googlebot: Google’s web spider.
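
To make “crawling” concrete, here’s a toy crawler sketched in Python using only the standard library. The start URL and page limit are illustrative assumptions; a real spider like Googlebot also respects robots.txt, deduplicates at enormous scale, and renders JavaScript.

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, queue its links, repeat."""
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load
        parser = LinkParser()
        parser.feed(html)
        # Resolve relative links against the current page's URL
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen

print(crawl("https://example.com"))  # hypothetical starting point

The loop simply fetches a page, collects every link on it, and repeats with the links it hasn’t seen yet. That’s all “following hyperlinks to discover new content” means.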

When you Google something, you’re asking Google to return all relevant pages from their index. Because there are often millions of pages that fit the bill, Google’s ranking algorithm does its best to sort the pages so that you see the best and most relevant results first.

The critical point I’m making here is that indexing and ranking are two different things.

How to get indexed by Google

Found that your website or web page isn’t indexed in Google? Try this:

  1. Go to Google Search Console.
  2. Navigate to the URL inspection tool.
  3. Paste the URL you’d like Google to index into the search bar.
  4. Wait for Google to check the URL.
  5. Click the “Request indexing” button.

This process is good practice when you publish a new post or page. You’re effectively telling Google that you’ve added something new to your site and that they should take a look at it. However, requesting indexing is unlikely to solve underlying problems preventing Google from indexing old pages. If that’s the case, follow the checklist below to diagnose and fix the problem.
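
If you’d rather check index status from a script than click through the UI, Google’s Search Console API includes a URL inspection endpoint. Below is a minimal sketch, assuming you already have an OAuth 2.0 access token for a verified property; the token and URLs are placeholders, and this endpoint only reports status (it can’t request indexing).

import json
from urllib.request import Request, urlopen

# Assumptions: ACCESS_TOKEN is a valid OAuth 2.0 token with Search Console
# access, and siteUrl matches a property you have verified.
ACCESS_TOKEN = "ya29.your-token-here"  # placeholder
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

payload = json.dumps({
    "inspectionUrl": "https://example.com/new-post/",  # the page to check
    "siteUrl": "https://example.com/",  # use "sc-domain:example.com" for domain properties
}).encode("utf-8")

req = Request(ENDPOINT, data=payload, headers={
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
})
with urlopen(req) as resp:
    result = json.loads(resp.read())

# The verdict sits under inspectionResult.indexStatusResult in the response
print(result["inspectionResult"]["indexStatusResult"]["coverageState"])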

Here are some quick links to each tactic, in case you’ve already tried some:

  1. Remove crawl blocks in your robots.txt file
  2. Remove rogue noindex tags
  3. Include the page in your sitemap
  4. Remove rogue canonical tags
  5. Check that the page isn’t orphaned
  6. Fix nofollow internal links
  7. Add “powerful” internal links
  8. Make sure the page is valuable and unique
  9. Remove low-quality pages (to optimize “crawl budget”)
  10. Build high-quality backlinks

1) Remove crawl blocks in your robots.txt file

Is Google not indexing your entire website? It could be due to a crawl block in something called a robots.txt file.

Look for either of these two snippets of code:

User-agent: Googlebot
Disallow: /

User-agent: *
Disallow: /
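
Both rules tell crawlers (Googlebot in the first case, every crawler in the second) not to crawl any page on the site. The fix is to delete the rule or leave its value empty, which allows everything. For example, this hypothetical robots.txt permits full crawling:

User-agent: *
Disallow: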

A crawl block in robots.txt could also be the culprit if Google isn’t indexing a single web page. To check if this is the case, paste the URL into the URL inspection tool in Google Search Console. Click on the Coverage block to reveal more details, then look for the “Crawl allowed? No: blocked by robots.txt” error.
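
You can also test a URL against a live robots.txt yourself. Python’s standard-library urllib.robotparser applies robots.txt rules, so a quick sketch (the URLs are placeholders) shows whether Googlebot is blocked. Note that Python’s parser follows the original robots.txt spec, so results can differ from Googlebot’s for rules that use wildcards.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # hypothetical site
rp.read()  # fetch and parse the live robots.txt

page = "https://example.com/blog/my-post/"
print(rp.can_fetch("Googlebot", page))  # False means blocked by robots.txt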

2) Remove rogue noindex tags

Google won’t index pages if you tell them not to. This is useful for keeping some web pages private. There are two ways to do it:

  • Meta robots tag: a noindex directive placed in the page’s HTML, e.g. <meta name="robots" content="noindex"> inside the <head> section.
  • X-Robots-Tag: the same directive sent as an HTTP response header, which also works for non-HTML files such as PDFs.

If either of these shows up on a page you want indexed, remove it.
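
To spot a rogue noindex quickly, fetch the page and check both places. Here’s a rough sketch using only Python’s standard library; the URL is a placeholder, and the string matching is deliberately naive (a real check would parse the HTML properly):

from urllib.request import Request, urlopen

url = "https://example.com/missing-page/"  # hypothetical page to check
req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urlopen(req) as resp:
    header = resp.headers.get("X-Robots-Tag", "")
    body = resp.read().decode("utf-8", errors="replace").lower()

if "noindex" in header.lower():
    print("noindex sent via the X-Robots-Tag HTTP header")
if "noindex" in body and "robots" in body:
    print("possible noindex meta tag in the HTML")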

3) Include the page in your sitemap

A sitemap tells Google which pages on your site are important, and which aren’t. It may also give some guidance on how often they should be re-crawled.

Google should be able to find pages on your website regardless of whether they’re in your sitemap, but it’s still good practice to include them. After all, there’s no point making Google’s life difficult.
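
For reference, a sitemap is just an XML file that lists the URLs you want crawled. A minimal example (the domain and dates are placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/my-post/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>

Submit it in Search Console under Sitemaps, and reference it from robots.txt with a Sitemap: https://example.com/sitemap.xml line so crawlers can find it on their own.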
