
I made a Google Sheet that does change detection for you based on two Screaming Frog crawls. I’ll tell you why that’s important.

Two problems frequently come up for SEOs, regardless of whether we're in-house or external.

  1. Knowing when someone else has made key changes to the site
  2. Keeping a record of specific changes we made to the site, and when

Both can sound trivial, but unnoticed changes to a site can undo months of hard work, and, particularly on large e-commerce sites, it's often necessary to keep updating internal links, on-page text, and external plugins in search of the best possible performance. That doesn't just go for SEO; it applies just as much to CRO and Dev teams.

Keeping a record of even just our own changes can be really time-consuming, but without one we're left relying on memory of what we did and when, particularly when we spot a pattern in traffic or rankings and want to know what might have caused it.

These are people problems. When we can't rely on other teams to loop us in on their planned changes, that needs to be fixed at a team level. When we don't have a system for recording the changes we make, that's understandable, particularly for smaller keyword or linking tweaks, but compare ourselves to a Dev team: a record of changes is exactly the kind of thing we'd expect them to build into their process. At the end of the day, if we don't keep track of what we're doing, it's because we either don't have the time or don't have the commitment to a process.

We shouldn’t really be trying to fix people problems with tools. That said, people problems are hard. Sometimes you just need a way of staying on top of things while you fight all the good fights. That’s exactly what this is for.

This is a way to highlight the changes other teams have made to key pages, so you can quickly take action if needed, and to keep track of what you’ve done in case you need to undo it.

As a completely separate use-case, you can also use this sheet to check for differences between different versions of your site. Say, for the sake of argument, that you need to know the difference between the mobile and desktop versions of your site, or your site with and without JavaScript rendering, or even the differences between your live site and a private developer version you’re about to release. There are tools that offer change detection and cover some of the functions of this sheet, but I really like the flexibility this offers to check for changes between versions as well as over time.

Screaming Frog Change Detection Google Sheet.
Access for free.


What sites is this good for?

This sheet is for anyone who needs an idea of what is changing on a fairly large number of pages but can’t afford to pay for big, expensive change detection systems. It’ll work its way through around 1,000 key pages.

That said, 1,000 key pages stretches further than you would think. For many small sites, that'll more than cover all the pages you care about, and even larger e-commerce sites get the vast majority of their ranking potential from a smaller number of category pages. You would be surprised how big a site can get before more than 1,000 category pages are needed.

That 1,000 URL limit is a guideline; the sheet can probably stretch a bit further than that, it's just going to take quite a while to process all of the formulas.

So what changes does it detect?

This Google Sheet looks at your “new crawl” and “old crawl” data and gives you tabs for each of the following:

What’s that about keyword change detection?

On many sites, we’re targeting keywords in multiple places at a time. Often we’d like a clear idea of exactly what we’re targeting where, but that’s not always possible.

The thing is, as we said, your pages keep changing – you keep changing them. When we update titles, meta descriptions, and H1s, we’re not checking every page on the site to confirm our keyword coverage, so it’s quite easy to miss that we’re removing some moderately important keyword from the site completely.

Thanks to a custom function, the Google Sheet splits all of your title tags, meta descriptions, and headings into their component words and finds any that, as of the newest crawl, have either been newly added to the site or removed from it completely.
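To make that concrete, here's a minimal sketch of the kind of custom function involved, written as a Google Apps Script custom function. The function name, signature, and ranges are illustrative assumptions rather than the sheet's actual implementation:

```javascript
/**
 * Illustrative sketch only: split two ranges of on-page text (titles, meta
 * descriptions, headings) into lowercase words and return the words that
 * appear somewhere in the old crawl but nowhere in the new one.
 *
 * Example cell usage (ranges are assumptions):
 *   =WORDS_REMOVED('old crawl'!B2:D1000, 'new crawl'!B2:D1000)
 */
function WORDS_REMOVED(oldRange, newRange) {
  // Collect the unique words found anywhere in a 2D range of cells.
  function toWordSet(range) {
    var words = {};
    range.forEach(function (row) {
      row.forEach(function (cell) {
        String(cell)
          .toLowerCase()
          .split(/[^a-z0-9']+/)
          .filter(function (w) { return w.length > 0; })
          .forEach(function (w) { words[w] = true; });
      });
    });
    return words;
  }

  var oldWords = toWordSet(oldRange);
  var newWords = toWordSet(newRange);

  // Keep only the words present in the old crawl but absent from the new crawl.
  var removed = Object.keys(oldWords).filter(function (w) {
    return !(w in newWords);
  });

  // Return a single column so the results spill down the sheet.
  return removed.map(function (w) { return [w]; });
}
```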

It then looks the freshly removed words up against Search Console data to find all the searches you were getting clicks from before, to give you an idea of what you might be missing out on now.
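Again as a rough sketch, assuming a Search Console export with queries in one column and clicks in a parallel column, that lookup could work something like this (the function name and tab layout are hypothetical, not the sheet's actual formulas):

```javascript
/**
 * Illustrative sketch only: given a removed word, a column of Search Console
 * queries, and a parallel column of clicks, total up the clicks that came
 * from queries containing that word.
 *
 * Example cell usage (ranges are assumptions):
 *   =CLICKS_FOR_WORD(A2, 'Search Console'!A2:A, 'Search Console'!B2:B)
 */
function CLICKS_FOR_WORD(word, queryRange, clicksRange) {
  var target = String(word).toLowerCase();
  var total = 0;

  queryRange.forEach(function (row, i) {
    var query = String(row[0]).toLowerCase();
    // Simple whole-word match so e.g. "cat" doesn't count clicks for "category".
    var matched = (' ' + query + ' ').indexOf(' ' + target + ' ') !== -1;
    if (matched && clicksRange[i]) {
      total += Number(clicksRange[i][0]) || 0;
    }
  });

  return total;
}
```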

The fact that it’s checking across all your pages means you don’t end up with a bunch of stopwords in the list (stopwords being: it, and, but, then, etc.), and you don’t have to worry about branded terms being pulled through either. It’s very unlikely that you’ll completely remove your brand name from all of your title tags and meta descriptions by accident, and if you do, that’s probably something you’d want to know about.

How do I use it?

Start by accessing a copy of this Google Sheet so you can edit it. There are step-by-step instructions in the first tab, but broadly all you need to do is:

  1. Run a Screaming Frog crawl of all the pages you want to detect changes on
  2. Wait a bit (like a couple of weeks) or crawl the mobile, JavaScript, or dev version right away for comparison
  3. Run another SF crawl of the pages you want to detect changes on
  4. Export the internal_all report for both crawls and paste them into the “old crawl” and “new crawl” tabs respectively
  5. Wait a bit (like 30 minutes)
  6. Check the results tabs for changes (a rough sketch of the kind of comparison involved follows this list)
  7. (Optional) Import Search Console data to give “value lost” information for keywords you removed.
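For step 6, the comparison running behind the results tabs boils down to a per-URL diff between the two crawl tabs. Here's a minimal Apps Script sketch of that idea for titles only; the tab names, column positions, and the function itself are assumptions, not necessarily how the sheet does it:

```javascript
/**
 * Illustrative sketch only: compare the title column between the "old crawl"
 * and "new crawl" tabs, keyed by URL, and log any differences. This assumes
 * the URL is in column A and the title in column B, which may not match
 * your internal_all export.
 */
function logTitleChanges() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var oldRows = ss.getSheetByName('old crawl').getDataRange().getValues();
  var newRows = ss.getSheetByName('new crawl').getDataRange().getValues();

  // Index the old crawl's titles by URL for quick lookup (skipping the header row).
  var oldTitles = {};
  oldRows.slice(1).forEach(function (row) {
    oldTitles[row[0]] = row[1];
  });

  // Walk the new crawl and flag any URL whose title no longer matches.
  newRows.slice(1).forEach(function (row) {
    var url = row[0];
    var newTitle = row[1];
    if (url in oldTitles && oldTitles[url] !== newTitle) {
      Logger.log(url + ' | "' + oldTitles[url] + '" -> "' + newTitle + '"');
    }
  });
}
```

The sheet itself works through formulas rather than a script (hence the wait in step 5), but the logic is the same: match pages by URL, then compare field by field.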

What do you think?

I hope you find this useful! I was really surprised by what Google Sheets was able to achieve. Is there anything you think I’ve missed? Anything you would change?
