Basking in the Traffic Gains: Our Content Refresh Early Detection System
I’ve made two posts this year about content refreshes. One was about traffic recovery, and the other was about refresh identification. Both of them were, admittedly, fairly wonky. I got lost in my SQL statements and graphs and invited you to come along for that ride with me, if you dared.
Not today, though. Today is the day that I heed the demand, “In English, poindexter!”
I’m going to show off the easy button for refreshing content and realizing substantial traffic gains. I can now do this because I’ve turned my queries and graphs into a dead simple, prioritized list of content to refresh. You can explore it for yourself here (click “refresh candidates”), and this is what it looks like.
This is a screenshot from the alpha offering of our content performance monitoring dashboard, which I announced back in June. (Beta coming soon!) The dashboard here features our content lab and community site, Make Me a Programmer.
If you’re unclear on what a content refresh is or why you should do it, let me explain. A content refresh involves making updates to an existing post or article on your site. As for why you should do it, let me present an actual anonymized field study of the impact on traffic (refreshes executed on this group of URLs at the red dot).
If you’re sold on the concept, I’ll spend the rest of the post explaining how we help you do it, using our tooling and refresh-candidate identification methodology.
Understanding the Setup
First things first. While I promised not to get into the weeds of the data, it is worth understanding where the information is coming from. I’m creating this refreshes view using four tools:
- Google Analytics 4.
- Our proprietary Airtable traffic modeling base.
- An (awesome) tool called Panoply to ingest the data into BigQuery.
- Looker Studio (though we’re considering moving to Metabase for beta) to display the data.
The main reason to understand this is so you know what access you’d need to grant us if you want us to set this up for you (mainly to your analytics tool).
Understanding the Heatmap
With that out of the way, let me walk you through what you’re looking at.
- URL is the URL of the potential refresh candidate that we’re talking about.
- Primary Keyword is the main keyword (singular) you created the URL to rank for.
- Rank is the most recent position in search for that URL and keyword combination.
- Last Changed is the last time you modified the content at that URL in some fashion.
- Projected is the amount of traffic our proprietary model projects you should earn for the primary keyword.
- Peak Monthly is the most traffic the URL has ever received in a calendar month.
- Last Month is the traffic to the URL in the last 30 days.
- Potential Gain is the greater of two differences: projected traffic minus last month’s traffic, or the URL’s peak monthly traffic minus last month’s traffic. But more to the point, it’s a ballpark of how much traffic you could realize by properly executing a refresh (see the query sketch below).
And finally, the slider control at the top is a filter. In the screenshot above, where it’s set to 6, it filters out any URLs that have been modified in the last six months.
So what you’re looking at is basically a prioritized to-do list for refreshing content. It shows you anything that has had a chance to perform since its last modification but is either declining in traffic or has serially underperformed against expectations. And we present it in descending order of the difference between expectations and reality.
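To make that prioritization concrete, here’s a minimal sketch of what the underlying query could look like in BigQuery-flavored SQL. To be clear, the table and column names (content_metrics, last_month, and so on) are hypothetical stand-ins rather than our actual schema.

```sql
-- A minimal sketch of the refresh-candidate view, with hypothetical
-- table and column names standing in for the real BigQuery base.
SELECT
  url,
  primary_keyword,
  rank,
  last_changed,
  projected,
  peak_monthly,
  last_month,
  -- Potential gain: the larger shortfall, vs. projection or vs. the URL's own peak.
  GREATEST(projected - last_month, peak_monthly - last_month) AS potential_gain
FROM content_metrics
-- The slider from the screenshot: hide anything modified in the last six months.
WHERE DATE_DIFF(CURRENT_DATE(), last_changed, MONTH) >= 6
  -- Skip URLs performing at or above expectations.
  AND GREATEST(projected - last_month, peak_monthly - last_month) > 0
ORDER BY potential_gain DESC;
```

The slider in the dashboard simply parameterizes that six-month threshold.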
The Gap in Most Refresh Processes
Now that you know what you’re looking at, let’s talk a little about the why. Why is this so effective?
To understand that, consider that there are four main reasons to refresh content, the first three of which are related to SEO.
- Underperformance: you wrote an article to rank, and, due to the mercurial nature of “the algorithm,” it just never ranked for some reason.
- Decline: you wrote a banger a while back, it rocketed to #2, and it earned you tons of traffic. But it’s been losing traffic lately.
- Limited Shelf Life: you rounded up the best ChatGPT use cases for 2024 and will need to update every year.
- Editorial: there is outdated information in the content and that matters for a distribution channel you have in mind.
With those reasons in mind, let’s consider when and how most companies would notice that a URL on their site met one of these conditions.
- Underperformance: they wouldn’t.
- Decline: they wouldn’t, unless the URL accounted for most of their traffic or they were deep-diving on a site-wide traffic falloff.
- Limited Shelf Life: they maintain a list of articles that need to be updated each December.
- Editorial: they were preparing to distribute the article.
To overcome the clear gap here, SEO-savvy content ops people will establish a content refresh protocol that they go through periodically. That might involve mass updates to old content on a rolling basis. Or it might be more surgical, like using a rank tracker to identify URLs ranking in middling positions and refreshing those articles.
Filling the Gap and Reaping the Rewards
But crucially, a rote approach with limited data has two major gaps:
- Content that simply never performed (but could) is a blind spot.
- Noticeable loss of rank for an article’s primary keyword typically happens after months, or even years, of gradual traffic decline.
Our dashboard addresses both gaps.
Non-performing content will show up in the view once it’s had a chance to make some noise out of the gate. In other words, if you target a keyword with an article today and it underperforms, you’ll see it in that view six months from now instead of consigning it to the dustbin of history. You can then evaluate what went wrong and try to course-correct.
This has two benefits: one obvious and one subtle. The obvious one is that sometimes just taking a second crack at the article is all it needs to start ranking. The subtle benefit is that regularly evaluating underperformers can help you stop creating underperformers.
As for the gap around declining articles, you’ll start to see them in this view as soon as they’ve declined from a traffic peak. In the early stages of a decline, you’re typically losing traffic for long tails (non-primary keywords you happen to rank for), and a simple touch-up refresh of the content tends to be very effective and thus high-leverage.
Traffic decay is a natural part of the lifecycle of an SEO-minded post, but it’s also a highly preventable one. Think of early detection of traffic decay as the equivalent of having credit monitoring. You can intervene against SERP interlopers immediately, before things get out of hand. This creates a formidable moat around your articles and their positions.
Avoiding Waste and Risk
On the flip side of all of this, the dashboard view also steers you away from URLs you’re better off leaving alone and not futzing with.
New content can take up to a year to hit its peak ranking, depending on your site’s domain authority. Figure it’s about half of that if you’ve comprehensively refreshed content, and it’s maybe a quarter for a touch-up refresh. So if you modify a piece of content today and then get jumpy and decide to modify it again in a week, you’re probably wasting your time.
The refreshes view prevents that by automatically dropping the URL from the list for a time after you change it. A watched URL never boils. Or something.
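If you’re rolling your own version of this, one way to encode those cooldown windows is a CASE expression keyed to the type of change. Here’s a sketch, using the same hypothetical schema as before plus a refresh_type column that is pure illustration (the alpha dashboard doesn’t distinguish refresh types; it just uses the single slider):

```sql
-- Hypothetical cooldown filter: hide URLs still inside their
-- "wait and see" window. refresh_type is illustrative only.
SELECT url
FROM content_metrics
WHERE DATE_DIFF(CURRENT_DATE(), last_changed, MONTH) >=
  CASE refresh_type
    WHEN 'touch_up'      THEN 3   -- ~a quarter of the year-to-peak window
    WHEN 'comprehensive' THEN 6   -- ~half of it
    ELSE 12                       -- brand-new content: up to a year to peak
  END;
```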
The view also helps you by hiding URLs that are gaining traffic and by putting very gentle declines at the bottom of the priority ladder. The risk-mitigation logic here boils down to the simple, folksy aphorism “don’t mess with success.”
Some standard refresh processes (including our own historical process) address this by not targeting URLs that rank in the top three for their primary keyword. And while I do think that’s a smart heuristic, we’ve learned that it has a fatal flaw: we’ve seen plenty of URLs keep ranking well for their primary keyword while hemorrhaging 50% or more of their traffic.
Losing that much traffic is hardly success. You should mess with it.
What to Expect
To date, we’ve been using the refreshes view of the dashboard, and the dashboard in general, as enablement for client delivery. This precludes me from showing anything other than the shape of completely anonymized data. But we’ve been doing a lot of this across a lot of clients.
This detection and execution methodology tends to produce short-term gains of 50% to 300% in traffic to the URLs in question. Sometimes the refreshed URLs even exceed their prior combined peak traffic. And the gains tend to be fairly durable.
I’m hoping to do an exercise in refreshes for hitsubscribe.com and makemeaprogrammer.com and share some actuals, for better or for worse. But given that I’m not exactly made of time, I might not get around to that unless I receive a decent outpouring of interest.
It’s also very much worth pointing out that the overwhelming majority of my data, thus far, is from Hit Subscribe detecting and executing the refreshes. I have very limited data on traffic for self-serve alpha users to this point.
So, caveat emptor if you use our dashboard and self-serve on refreshes. Though again, with enough interest, I can write up a post on DIY content refreshes for you non-fulfillment alpha folks.
Finally, if you’re interested in more information about the content performance monitoring offering while it’s still a free alpha (before it becomes a paid beta), feel free to inquire at dashboard@hitsubscribe.com.
Interested in More Content Like This?
I’m Erik, and I approved this rant…which was easy to do since I wrote it. If you happened to enjoy this, I’ve recently created a Substack where I curate all of the marketing-related content I create on different sites.
Totally free, permanently non-monetized, and you’re welcome to sign up. Click here or fill out the form below: