The story is familiar. A brand runs radio for a few months, doesn't see a clear line between spots and sales, and starts asking whether the budget could work harder somewhere else. Digital is measurable. Radio is harder to track. The obvious conclusion is to shift the money.
A quick-service restaurant brand went through exactly that process. Short-term attribution testing hadn't made a clear case for radio. The plan was to cut it. Before pulling the budget, they brought in media analytics firm Seeda to run a 60-day media mix modeling engagement. What came back reversed the decision entirely.
What the Model Actually Found
Seeda integrated data across every channel in the media plan — Meta, Google Ads, AM/FM radio, influencers, CRM, and point-of-sale systems — into a unified model. The goal was to isolate how much revenue each channel was actually contributing, after accounting for seasonality and everything else happening at the same time. The model validated at 97% accuracy.
The cost-per-transaction numbers were not what the brand expected going in:
Radio delivered a $3.60 return on every dollar spent, ranking third highest among all channels measured — ahead of paid social, display, TV, and outdoor. Google Ads had a higher raw ROI number at lower spend levels, but the model identified the core problem with that comparison: Google's ROI only looks strong because most of that spend is capturing demand that radio and other brand channels already created.
Why Short-Term Testing Missed It
The reason the original testing hadn't made a case for radio is the same reason short-term attribution consistently undercounts radio across industries. Radio's effects persist for up to three weeks after a campaign ends. Weekly or monthly attribution windows close before the full impact registers. A spot that ran on Monday may influence a purchase decision made 18 days later. That purchase gets credited to whatever touchpoint was last before checkout, not to the channel that built the awareness in the first place.
Media mix modeling accounts for this lag. It measures cumulative transaction impact as spending increases over time, not just the immediate response window. That's the difference between what the brand saw in short-term testing and what the model found.
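The lag effect described above is easy to see with a toy simulation. The sketch below is purely illustrative — the decay rate and window lengths are assumptions for demonstration, not figures from the Seeda study — but it shows why a weekly attribution window structurally undercounts a channel whose response tails off over three weeks.

```python
# Illustrative sketch: why a short attribution window undercounts a
# lagged channel. All numbers (decay rate, window lengths, totals)
# are hypothetical, not from the Seeda study.

def adstock_response(total_conversions: float, decay: float, days: int) -> list[float]:
    """Spread a campaign's conversions across `days` with geometric decay."""
    weights = [decay**d for d in range(days)]
    scale = total_conversions / sum(weights)
    return [w * scale for w in weights]

# Suppose a radio flight ultimately drives 100 incremental purchases,
# tailing off over three weeks (the persistence window cited above).
daily = adstock_response(100.0, decay=0.85, days=21)

captured_7d = sum(daily[:7])   # what a weekly attribution window sees
captured_21d = sum(daily)      # what a lag-aware model sees

print(f"7-day window credits:  {captured_7d:.0f} of 100 purchases")
print(f"21-day window credits: {captured_21d:.0f} of 100 purchases")
```

Under these assumed numbers, the weekly window credits only about 70 of the 100 purchases the campaign actually drove — the remaining ~30% land after the window closes and get attributed elsewhere, which is exactly the false negative the article describes.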
The Audience Alignment Problem With TV
The model also surfaced something the brand hadn't quantified before. MRI-Simmons audience data showed that heavy AM/FM radio listeners closely mirror the profile of frequent QSR customers: younger, employed, more likely to have children, and high vehicle usage. They're the audience.
Heavy TV viewers, by contrast, underindexed on QSR consumption. The brand had been spending on a channel whose heaviest users weren't their best customers, while underinvesting in the channel whose audience matched their buyers most closely.
What the Campaign Produced
Across the optimization window, marketing-contributed revenue rose by approximately 30%. The brand grew sales 5% year over year through a seasonally challenging period — with radio spending retained and increased, not cut.
What This Means for Local Advertisers
The QSR in this study is in Australia. The measurement principles are universal. The same dynamic plays out in every market where a local business has tried radio, judged it on short-term attribution, and moved the budget to something that felt more trackable.
Short-term attribution tools are built for direct response. Radio is a brand channel. Measuring it with the wrong tool produces a false negative. The question isn't whether radio generated a click — it's whether the people who heard it became customers. That's a different measurement question, and it requires a different methodology to answer.
For a Treasure Valley business running radio on any of our stations, the implication is straightforward: if you've run radio and concluded it didn't work based on immediate click or call tracking, you may have the right result from the wrong measurement. The business that ran Hank FM through the fall and saw 43% more inbound calls didn't know that would happen going in — they committed to the medium long enough for it to register.
Want to talk about measurement before you run a campaign?
We can walk you through what attribution looks like for businesses in your category and what timelines are realistic for seeing results.
Talk to a rep →

Source: Seeda Media Mix Modeling case study, published by Cumulus Media / Westwood One Audio Active Group®, March 16, 2026. Audience data: MRI-Simmons. Model accuracy: 97% validation. Cost-per-transaction figures reflect a 2-month measurement horizon.