
Marketing Mix Modeling Fireside Chat with Labelium: Key Takeaways

Marketers today face a paradox: they have more data than ever before, yet measuring true marketing impact has never been more challenging. With privacy regulations tightening and third-party cookies fading away, traditional measurement methods are becoming unreliable. How can marketers ensure their data-driven decisions remain effective?

That’s exactly what we explored in a recent fireside chat featuring Kristin Wozniak, Data and Insights Lead at Labelium, and Mark Debenham, VP of Growth Marketing and Marketing Operations at Adverity. This discussion unpacks the resurgence of Marketing Mix Modeling (MMM), its challenges, and the best practices marketers should follow to implement it successfully.

Read on for the key takeaways, or watch the full discussion below.

[Embedded video: full fireside chat discussion]

Why MMM is making a comeback

MMM isn’t new. In fact, it’s been around for decades. But with increasing privacy regulations and the decline of third-party cookies, it’s enjoying a major renaissance. "There's a lot of excitement around MMM these days,” says Kristin, “which is kind of funny because we're having these huge conversations about gen AI and robots taking over the world on the one hand, and on the other hand, we're talking about MMM, which has been around since the 80s."

As marketers lose access to granular tracking data, MMM offers a privacy-compliant way to measure ROI across channels. "I think there are really three or four things that are driving this resurgence,” adds Kristin, “The first one is the idea of privacy. The rules and regulations around data are getting stricter, as they should. And with the sort of decrease and eventual demise of the third-party cookie, a collection of the methodologies we've been relying on are no longer tenable in quite the same way."

Four key drivers of MMM's resurgence

The return of MMM isn’t just about privacy concerns. Several industry-wide shifts are contributing to its renewed relevance:

  • A push for more objective measurement. Marketers are growing wary of relying solely on platform-reported metrics, which can feel like "grading your own homework."
  • Increased complexity in marketing channels. It’s no longer enough to evaluate a single channel in isolation—understanding how different channels interact is crucial for success.
  • A need for trustworthy methodologies. While MMM isn’t perfect, its longevity means marketers understand its strengths and limitations, making it a dependable tool in an uncertain landscape.
  • Multi-touch attribution (MTA) is more complicated without cookies. With third-party tracking becoming less reliable, marketers need alternative ways to measure attribution.

 

The hardest part? Data, not modeling

Many marketers assume the hardest part of MMM is the modeling itself. In reality, the biggest challenge is data readiness. "The actual MMM part is kind of the easiest part of the process,” emphasizes Kristin, “It's collecting the data and getting all the foundations right, which is by far and away the hardest part of the process."

A strong MMM requires clearly defined KPIs and internal alignment on those KPIs. If different teams define success differently, adoption becomes a major hurdle. And you shouldn’t simply assume your data is already in good shape, as Kristin explains: "You start digging, and you realize that you have bits and pieces, or someone somewhere has it, but they've been pulling it together manually behind the scenes in an Excel sheet. That is typically where the MMM process can fall apart before it gets going."

 

Collaboration is non-negotiable

MMM isn’t just a data science project—it requires cross-functional buy-in from marketing, finance, and leadership. The best models don’t just deliver a report; they provide actionable insights teams can use. Without alignment, marketers risk investing in an MMM project only to have it sit on a shelf. Kristin shared a cautionary tale:

 

"We worked with a collection of folks who were building an MMM for an auto company, and at the presentation part of the process, the analysts came in and said, 'We did the work. The finding is that this particular amount of spend to support this car model isn't working, so we recommend you just kill all the spend behind this car model.'

The dealerships already had thousands of those cars on the lot. Thank you for recommending we don't put money behind this, but what happens to my business now when I have a hundred cars sitting here?"

 

The biggest mistake? Knee-jerk reactions

MMM outputs should be seen as a voice at the table, not an absolute truth. "You get results from an MMM. Let’s say you've overspent in one area, and it makes sense to optimize. That doesn't mean you have to run around immediately and start to move money from one pile to the other… Take a minute, take a breath. What is this actually saying? What are the circumstances by which this has been created? What is the right step?" says Kristin.

Think of MMM as an ongoing process, not a one-time deliverable. The best teams revisit and refine their models as new data comes in. MMM cannot be treated as a plug-and-play solution—it requires continuous iteration, collaboration, and a strong foundation from the start.

According to Kristin, to get an MMM project off the ground from the very beginning, you need:

  • Analysts who know what they need to make the model work.
  • Internal business leads who work directly with the client and understand the client’s questions and concerns.
  • A point person on the client side who is available to answer questions about data, anomalies, or anything else that comes up.

Kickoff calls and ongoing alignment are essential to success. Without them, MMM models risk becoming siloed projects that never integrate properly into decision-making processes. Ensuring that all stakeholders—from analysts to business leaders—understand how the model fits into broader marketing strategy is key to making it actionable.

The importance of holdback data

One of the simplest yet most effective ways to validate an MMM is through holdback data. As Kristin explains, "You build your MMM, you hold some data back right from when you build your model, and then you run your model on that holdback data and see if it matches reality."

Using holdback data ensures that models are not just fitting historical trends but can actually predict future outcomes. This prevents overfitting and helps marketers trust the model’s recommendations. 

"If you’ve overfit your model—you’ve built your model in such a way that it accounts for every single potential nuance in the data in the last three years or so—because it’s got so granular, it actually doesn’t really know how to predict forward-looking trends,” elaborates Kristin, “If you have overfit your model, you'll notice in that holdback data that your predictions don’t line up with reality in any kind of meaningful way."
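The holdback check Kristin describes can be sketched in a few lines. The example below is purely illustrative — the synthetic weekly spend data, the simple linear model, and the MAPE error metric are our own assumptions, not the actual modeling approach used by Labelium or Adverity. The idea is just to show the mechanics: fit on the earlier weeks, then see whether the model's predictions line up with the weeks you held back.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy weekly dataset: spend on three hypothetical channels driving sales
n_weeks = 156  # roughly three years of weekly data
spend = rng.uniform(0, 100, size=(n_weeks, 3))
true_coefs = np.array([0.8, 0.3, 0.5])
sales = 200 + spend @ true_coefs + rng.normal(0, 10, n_weeks)

# Hold back the final 12 weeks; fit only on the earlier weeks
holdback = 12
X_train, X_hold = spend[:-holdback], spend[-holdback:]
y_train, y_hold = sales[:-holdback], sales[-holdback:]

# Ordinary least squares with an intercept term
A_train = np.column_stack([np.ones(len(X_train)), X_train])
coefs, *_ = np.linalg.lstsq(A_train, y_train, rcond=None)

# Predict the held-back weeks and compare against what actually happened
A_hold = np.column_stack([np.ones(len(X_hold)), X_hold])
pred = A_hold @ coefs
mape = np.mean(np.abs((y_hold - pred) / y_hold)) * 100
print(f"Holdback MAPE: {mape:.1f}%")
```

If the model is sound, the holdback error stays small; an overfit model will show predictions drifting far from the held-back actuals, which is exactly the warning sign Kristin points to.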

Final Thoughts: 7 best practices for MMM success

MMM offers marketers a way to navigate today’s privacy-first landscape with confidence. But success depends on more than just data science—it requires the right mindset, collaboration, and a willingness to learn from the results. From talking with Kristin, here are our 7 best practices for MMM success:

 

  1. Prioritize collaboration. Align analysts, marketing teams, and leadership from the start.
  2. Start with the right KPI. If different teams define success differently, the model won’t be useful.
  3. Be honest about data readiness. Don’t assume the data exists in a clean, structured format.
  4. Don’t be afraid of the results. Avoiding tough conversations or masking data leads to poor decision-making. Acknowledging challenges early allows teams to course-correct effectively.
  5. Treat MMM as a living process. Expect iteration and refinement over time.
  6. Validate models with holdback data. This ensures the model isn’t just fitting historical trends but can actually predict future outcomes.
  7. Differentiate between what's possible and what's practical. If you ask a data team if they can do something, the answer is probably yes. Instead, ask what can be done within a realistic timeframe—like 12 months—to ensure feasibility and actionable insights.
