A quick-service restaurant brand ran radio for a few months, didn't see a clear line between spots and sales, and planned to cut it. Before pulling the budget, they brought in media analytics firm Seeda to run a 60-day media mix modeling engagement. What came back reversed the decision entirely.

What the Model Actually Found

Seeda integrated data across every channel in the media plan — Meta, Google Ads, AM/FM radio, influencers, CRM, and point-of-sale systems — into a unified model. The goal was to isolate how much revenue each channel was actually contributing, after accounting for seasonality and everything else happening at the same time. The model validated at 97% accuracy.
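The mechanics of a unified model like this can be sketched in a few lines: regress revenue on per-channel spend plus a seasonality term, and read each channel's contribution off the fitted coefficients. This is a toy illustration, not Seeda's actual model; every number in it is invented, and a production media mix model adds adstock, saturation curves, and Bayesian priors on top of this skeleton.

```python
import numpy as np

# Toy media mix model: regress weekly revenue on per-channel spend
# plus a seasonality term. All figures are made up for illustration;
# they are not the numbers from the Seeda engagement.
rng = np.random.default_rng(0)
weeks = 52
spend = {
    "radio":  rng.uniform(5_000, 15_000, weeks),
    "search": rng.uniform(8_000, 20_000, weeks),
    "social": rng.uniform(4_000, 12_000, weeks),
}
season = 1 + 0.2 * np.sin(2 * np.pi * np.arange(weeks) / 52)

# Assumed "true" revenue-per-dollar effects, used only to simulate data.
true_beta = {"radio": 3.6, "search": 2.0, "social": 1.1}
revenue = (
    20_000 * season
    + sum(true_beta[c] * spend[c] for c in spend)
    + rng.normal(0, 5_000, weeks)
)

# Design matrix: intercept, seasonality, one column per channel.
X = np.column_stack([np.ones(weeks), season] + [spend[c] for c in spend])
beta, *_ = np.linalg.lstsq(X, revenue, rcond=None)

for i, c in enumerate(spend):
    print(f"{c}: ${beta[2 + i]:.2f} revenue per $1 spent")
```

Because the regression controls for seasonality and the other channels simultaneously, the recovered coefficients approximate each channel's isolated contribution, which is exactly what short-term channel-by-channel tracking cannot do.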

The cost-per-transaction numbers were not what the brand expected going in:

- AM/FM Radio: $11 cost per transaction
- Google Ads: $20 cost per transaction
- Meta: $58 cost per transaction

Radio delivered a $3.60 return on every dollar spent, ranking third highest among all channels measured — ahead of paid social, display, TV, and outdoor. Google Ads had a higher raw ROI number at lower spend levels, but the model identified the core problem with that comparison: Google's ROI only looks strong because most of that spend is capturing demand that radio and other brand channels already created. That distinction between creating demand and converting it is what standard digital attribution — which gives all the credit to the last ad someone clicked before buying — is structurally unable to see.
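The structural blind spot in last-click attribution can be shown with a toy simulation. Assume (purely for illustration, with invented probabilities) that radio creates awareness in some listeners and that most buyers search before purchasing: last-click hands search the credit for every one of those sales, while a simple on/off comparison reveals how many sales radio actually caused.

```python
import random

# Toy simulation of last-click attribution vs. incremental lift.
# All probabilities are invented for the sketch.
random.seed(1)

def simulate(n=100_000, radio_on=True):
    search_credited = 0
    sales = 0
    for _ in range(n):
        aware = radio_on and random.random() < 0.30  # radio builds awareness
        base = random.random() < 0.02                # baseline purchase intent
        if aware or base:
            sales += 1
            if random.random() < 0.60:               # most buyers search first
                search_credited += 1                 # last-click credits search
    return sales, search_credited

with_radio, search_clicks = simulate(radio_on=True)
without_radio, _ = simulate(radio_on=False)

print("sales last-click credits to search:", search_clicks)
print("sales radio actually caused:", with_radio - without_radio)
```

In the toy numbers, search gets credit for most sales even though removing radio would eliminate the bulk of them, which is the demand-creation-versus-conversion distinction the model surfaced.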


Why Short-Term Testing Missed It

Radio's effects persist for up to three weeks after a campaign ends. Weekly or monthly measurement windows close before the full impact registers. A spot that ran on Monday may influence a purchase decision made 18 days later — credited to whatever touchpoint was last before checkout, not to the channel that built the awareness.

Media mix modeling accounts for this lag. It measures cumulative transaction impact as spending increases over time, not just the immediate response window. That's the difference between what the brand saw in short-term testing and what the model found.
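The standard way models capture this lag is an adstock transformation: each week's effective media pressure carries forward a fraction of the previous week's. A minimal sketch, with a made-up decay rate chosen so the carryover fades over roughly three weeks:

```python
import numpy as np

# Geometric adstock: this week's pressure = this week's spend plus a
# decaying fraction of last week's pressure. The 0.65 decay rate is an
# assumption for illustration, not a figure from the case study.
def adstock(spend, decay=0.65):
    out = np.zeros_like(spend, dtype=float)
    carry = 0.0
    for t, s in enumerate(spend):
        carry = s + decay * carry
        out[t] = carry
    return out

# Four weeks on air, then the campaign ends.
spend = np.array([10.0, 10.0, 10.0, 10.0, 0.0, 0.0, 0.0, 0.0])
pressure = adstock(spend)
print(np.round(pressure, 1))
```

Pressure stays positive for weeks after spend drops to zero, which is why a measurement window that closes at the end of the flight undercounts the channel.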

The Audience Alignment Problem With TV

The model also surfaced something the brand hadn't quantified before. MRI-Simmons audience data showed that heavy AM/FM radio listeners closely mirror the profile of frequent QSR customers: younger, employed, more likely to have children, and high vehicle usage. They're the audience.

Heavy TV viewers, by contrast, underindexed on QSR consumption. The brand had been spending on a channel whose heaviest users weren't their best customers, while underinvesting in the channel whose audience matched their buyers most closely.

What the Campaign Produced

Across the optimization window, marketing-contributed revenue rose approximately 30%. The brand grew sales 5% year over year through a seasonally challenging period, with radio spending retained and increased, not cut.

- ~30% lift in marketing-contributed revenue during optimization
- +5% year-over-year sales growth through a challenging seasonal period

What This Means for Local Advertisers

The QSR in this study is in Australia, but the measurement principles are universal. The same dynamic plays out along the fast-casual corridors on Eagle Road and Fairview Ave every lunch hour, and in every market where a local business has tried radio, judged it on short-term tracking, and moved the budget to something that felt more trackable.

Short-term measurement tools are built for direct response. Radio is a brand channel. Measuring it with the wrong tool produces a false negative. The question isn't whether radio generated a click — it's whether the people who heard it became customers. Modern radio measurement approaches — from branded search monitoring to multi-touch platforms — are more accessible than most local advertisers realize.

For a Treasure Valley business running radio on any of our stations, the implication is straightforward: if you've concluded radio didn't work based on immediate click or call tracking, you may have the right result from the wrong measurement. The business that ran Hank FM through the fall and saw 43% more inbound calls didn't know that would happen going in — they committed to the medium long enough for it to register.

Want to talk about measurement before you run a campaign?

We can walk you through what measurement looks like for businesses in your category and what timelines are realistic for seeing results.

Talk to a rep →

Source: Seeda Media Mix Modeling case study, published by Cumulus Media / Westwood One Audio Active Group®, March 16, 2026. Audience data: MRI-Simmons. Model accuracy: 97% validation. Cost-per-transaction figures reflect a 2-month measurement horizon.