
Crawl Coverage Report

Analyse Googlebot's crawl activity without log files


Published: 10th July 2025

We've introduced a brand new feature to our tool for our Scale and Company customers: the Crawl Coverage report.

At Indexing Insight we’ve built an alternative to Google log file analysis, which allows our customers to understand Googlebot activity on key pages.

What is this new report?

The new Crawl Coverage report automatically groups pages into timeline buckets based on when Googlebot last crawled the page.

Crawl Coverage Report in Indexing Insight

What you can do with this new report:

This report is ideal for SEO teams who don't have access to log file data but want insight into how Googlebot is crawling their website.

Find it under Crawling → Crawl Coverage in your projects.

How does the report work?

The report groups pages into time buckets based on the Last Crawl Time from the URL Inspection API.

How the Crawl Coverage Report works in Indexing Insight

Google Search Console data can tell you exactly when Googlebot's primary crawler last crawled a page. This result is shown in both the URL Inspection tool and API.
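To make this concrete, here is a minimal sketch of reading that value out of a URL Inspection API response. The `sample_response` dict below is a hypothetical, trimmed-down example of the response shape; the `lastCrawlTime` field is returned by the API as an RFC 3339 timestamp.

```python
from datetime import datetime

# Hypothetical, trimmed-down sample of a URL Inspection API response.
# The real response contains many more fields under indexStatusResult.
sample_response = {
    "inspectionResult": {
        "indexStatusResult": {
            "lastCrawlTime": "2025-07-01T08:30:00Z",
        }
    }
}

def extract_last_crawl_time(response: dict) -> datetime:
    """Pull lastCrawlTime out of an inspection response and parse it."""
    raw = response["inspectionResult"]["indexStatusResult"]["lastCrawlTime"]
    # fromisoformat() before Python 3.11 rejects the trailing "Z",
    # so normalise it to an explicit UTC offset first.
    return datetime.fromisoformat(raw.replace("Z", "+00:00"))

crawled = extract_last_crawl_time(sample_response)
print(crawled)  # 2025-07-01 08:30:00+00:00
```

Once parsed into a timezone-aware datetime, the value can be compared against the current date, which is what the next section builds on.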

However, the problem is that the raw Last Crawl Time is hard to analyse on its own.

At Indexing Insight we use the Last Crawl Time date to create our own metric, Days Since Last Crawl, so you know exactly how long it has been since Googlebot crawled a page. This metric is calculated for every page URL pulled using the URL Inspection API.

We then use the Days Since Last Crawl metric to group pages into time buckets. This makes it easier for SEO teams to identify trends from Googlebot's own crawl data.
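The two steps above, computing Days Since Last Crawl and then bucketing, can be sketched as follows. The bucket boundaries here (0-7, 8-30, 31-90, 90+ days) are illustrative assumptions, not the exact buckets Indexing Insight uses, and the page URLs and dates are made up for the example.

```python
from datetime import datetime, timezone

# Illustrative bucket boundaries in days; the actual buckets used by
# Indexing Insight may differ.
BUCKETS = [(7, "0-7 days"), (30, "8-30 days"), (90, "31-90 days")]

def days_since_last_crawl(last_crawl: datetime, now: datetime) -> int:
    """Whole days elapsed since Googlebot last crawled the page."""
    return (now - last_crawl).days

def crawl_bucket(days: int) -> str:
    """Map a Days Since Last Crawl value onto a time bucket label."""
    for limit, label in BUCKETS:
        if days <= limit:
            return label
    return "90+ days"

# Made-up example data: last crawl times pulled per URL.
now = datetime(2025, 7, 10, tzinfo=timezone.utc)
pages = {
    "/pricing": datetime(2025, 7, 8, tzinfo=timezone.utc),
    "/blog/old-post": datetime(2025, 3, 1, tzinfo=timezone.utc),
}

for url, last_crawl in pages.items():
    d = days_since_last_crawl(last_crawl, now)
    print(url, d, crawl_bucket(d))
```

Grouping by the bucket label rather than the raw date is what makes trends visible: a growing "90+ days" bucket is easy to spot, while a column of timestamps is not.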

Learn more about the Crawl Coverage Report in our support documentation.


Join Indexing Insight to log in and start using this right away! Or book a demo for one of our team to take you through the tool.