CodeSteps


Google Search Console – New Coverage issue detected for site

You may see this message after you have submitted your website to Google for indexing. It indicates that some issues were identified while Googlebot was attempting to crawl URLs from your website. In this article, I am going to explain how to fix this issue.

You may receive a message like the one below from Google if it encounters any issues while crawling your website.

New Coverage issue detected for site <<your-site-name>>

To owner of <<your-site-name>>,

Search Console has identified that your site is affected by 1 new Coverage related issue. This means that Coverage may be negatively affected in Google Search results. We encourage you to fix this issue.

New issue found:
Submitted URL has crawl issue

[Fix Coverage issues]

Step 1. Log in to Google Search Console.

This article assumes you are using the newer version of Google Search Console.

Step 2. Select the property where you are seeing the issues, and click on the “Coverage” link under the “Index” group.

Coverage details will be displayed on the page. Scroll down and look for the details of the errors.

Step 3. You will see the “Submitted URL has crawl issue” error displayed in the Details section.

Click on the error to see more details. Search Console will open a page with more details: affected pages, examples, etc.

Google Search Console – Index Coverage

Step 4. On the “Submitted URL has crawl issue” page, you can see the number of pages that have issues under the “Affected pages” section. The list of affected pages is shown in the “Examples” section.

We need to go through each of these affected pages, verify what the crawl issue is, and fix it. Let’s select one from the list; you can then repeat the same process for the rest of the affected pages.

Step 5. Once you have selected an affected page from the list, you will see a list of tools that can help you fix the issue.

Google Search Console – Submitted URL has Crawl issue

Most crawl issues are related to blocking. If you have a rule in the robots.txt file that blocks the page(s), Googlebot will fail to fetch and crawl them. So first, we verify whether there are any blocking rules in the robots.txt file.
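As a hypothetical illustration (the path below is made up), a robots.txt rule like this would stop Googlebot from fetching anything under /blog/:

```
User-agent: *
Disallow: /blog/
```

Any submitted URL matching a Disallow rule for Googlebot (or for all agents, `*`) will fail to crawl.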

Step 6. Click on the “TEST ROBOTS.TXT BLOCKING” link to open the tester.

Currently, this opens in the old version of Search Console.

Verify the rules in the robots.txt file. If any blocking rule is applied, remove it (if required) and click on the SUBMIT button to apply the changes.

Click on the TEST button to re-verify with the applied changes. If there are no issues with the robots.txt file for the URL, you will see an ALLOWED message next to the URL box at the bottom of the tester page.
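If you prefer to check this outside Search Console, Python’s standard-library `urllib.robotparser` applies the same allow/block logic as the tester. This is a minimal sketch; the sample rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice you would fetch your
# real file, e.g. RobotFileParser("https://your-site/robots.txt").read()
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# can_fetch() answers the same question as the robots.txt tester:
# is this user agent allowed to crawl this URL?
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

If `can_fetch` returns False for a submitted URL, that Disallow rule is the likely cause of the crawl issue.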

Step 7. Another verification we can do from the “Submitted URL has crawl issue” page: click on the “VIEW AS SEARCH RESULT” link to verify whether the URL is appearing in the search results.

If you see the URL in the search results, it should be fine. Otherwise, either the URL is not indexed, your website is not allowing access to it (restricted permissions on the URL), or the URL itself doesn’t exist.

If the URL is not indexed, we will fix this in the next steps; the other issues you need to fix on the website itself.

Once these are fixed, you will see your URL appear in the search results.
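The causes above (missing URL, restricted permissions, unreachable site) all show up as HTTP status codes, so a quick check from your own machine can help narrow things down before touching Search Console. This is a rough sketch using Python’s standard library; the function names and the mapping of codes to verdicts are my own, not part of any Google tool:

```python
from urllib import request, error

def crawl_status(url, timeout=10):
    """Return the HTTP status code for the URL, or None if unreachable.
    Googlebot needs a successful response to crawl the page."""
    req = request.Request(url, method="HEAD")
    try:
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as e:
        return e.code        # e.g. 404 (not found) or 403 (restricted)
    except (error.URLError, OSError):
        return None          # DNS failure, timeout, etc.

def verdict(status):
    """Map a status code to the likely cause described above."""
    if status is None:
        return "URL unreachable - check the server or DNS"
    if status == 404:
        return "URL does not exist"
    if status in (401, 403):
        return "access restricted - fix permissions on the site"
    if 200 <= status < 300:
        return "URL is reachable - if missing from results, request indexing"
    return f"unexpected status {status}"
```

For example, `verdict(crawl_status("https://your-site/your-page"))` distinguishes a missing page from a permissions problem in one call.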

Step 8. Now we will verify with the “FETCH AS GOOGLE” tool. Click on it, and Search Console will open the Fetch as Google page.

Currently, this opens in the old version of Search Console.

Step 9. Now click on the “FETCH” button to fetch the URL. Search Console will display the fetch status in the list.

If the fetch is successful, it will show “Complete” as the status; otherwise, it will show “Partial”. If the URL is not indexed, it will show the “Request for Indexing” button to submit the URL for indexing. You can submit the URL for indexing here, or follow the approach we discuss in the next step.

If there is any error, it will be shown under the status; for example, you will see a “Not found” message if the URL does not exist.

Step 10. Go back to the “Submitted URL has crawl issue” page, and click on the “INSPECT URL” link.

You will see the message, “Retrieving data from Google Index”. Once it is done, it will show the result on the “URL inspection” page.

If the URL is already indexed, it will show the “URL is on Google” message, and you do not need to do anything further.

If you see the “URL is not on Google” message, you need to submit it to Google for indexing. Before submitting, test whether the URL is working by clicking the “TEST LIVE URL” button. It will start with the message “Testing live URL”. Once testing is done, it will display the results in the “LIVE TEST” tab.

If you see the “URL is available to Google” message, the URL exists and we can submit it for indexing.

If you see the “URL is not available to Google” message, you need to work on your website to fix this issue. If you do not want to index this URL, you can ignore it.

If the URL is not on Google, go back to the “GOOGLE INDEX” tab, from where you can submit it for indexing (if you want). Click on the “REQUEST INDEXING” link to submit the request to Google.

You will see the message, “Indexing requested”.

Remember that if the URL is not indexed, it doesn’t necessarily mean the URL has errors. The steps above cover how to fix the errors when there are any.

Once all errors are fixed for all the affected pages, you won’t see crawl issues for your website.

Paul
