Why “We Didn’t See It Coming” Is No Longer Acceptable (When Signals Existed)

There’s a conversation happening in boardrooms that CFOs and COOs increasingly dread. It starts with a simple question and escalates quickly:

“Why didn’t we see this coming?”

A decade ago, the answer was straightforward: “It was unpredictable. External factors changed rapidly. No one could have anticipated it.” The board would nod, accept the explanation, and move on to discussing recovery plans.

That dynamic is shifting.

Today, when a quarter misses due to external disruptions, the follow-up questions are harder:
“Were there signals we could have monitored?”
“Could we have quantified the risk earlier?”
“Did we have time to adjust if we’d known sooner?”

And if the answers are yes, yes, and yes—the original explanation no longer holds. What was once attributed to unpredictable external forces becomes a question of internal capability: Were we looking in the right places? Did we have the tools to see this coming?

This isn’t about assigning blame. It’s about recognizing a fundamental shift in expectations. As information becomes more accessible and technology makes external signal monitoring feasible, the standard for what constitutes “unforeseeable” is rising.

The era when “we didn’t see it coming” was a sufficient explanation is ending. In its place: an expectation that executives will monitor beyond their walls and anticipate disruptions before they materialize in quarterly results.

The Old Standard: External Attribution
For most of corporate history, quarterly misses due to external disruptions were treated as unavoidable. Planning tools focused on internal operations. Market intelligence was expensive, slow, and limited to periodic reports. By the time external signals reached executive awareness, they’d often already impacted results.
Under this paradigm, “we didn’t see it coming” was not just acceptable—it was understood. Boards recognized that executives operated with limited visibility into external forces. The implicit contract: you manage what you can control (internal execution), and we’ll accept that some external disruptions will surprise us.
This made sense when:
• Information about suppliers, customers, competitors, and markets was scarce
• Analyzing external signals at scale required prohibitive time and resources
• The lag between “event happens” and “executive learns about it” was measured in weeks or months
• No practical tools existed to connect external signals to specific company impacts
In that world, external attribution was reasonable. Disruptions genuinely came “out of nowhere” because the signals either didn’t exist publicly or weren’t accessible in actionable timeframes.

The New Standard: Anticipation and Accountability
That world no longer exists. Today, supplier issues appear in trade publications, regulatory filings, and competitor disclosures. Customer budget pressures surface in parent company earnings calls. Competitive moves are announced in press releases. Regulatory changes are telegraphed through government communications. Commodity price shifts are visible in futures markets and industry data. The signals exist. They’re public. They unfold in real time.

This creates a new accountability framework. When a quarter misses and the root cause traces to an external disruption, boards and investors now apply a three-question test:
Question 1: Were the Signals Available?
Could someone, with the right monitoring infrastructure, have detected early indicators of this disruption?
If relevant news, filings, data releases, or announcements existed before the impact materialized—the answer is yes.
Question 2: Could We Have Quantified the Impact?
Beyond just detecting the signal, could we have estimated what it meant for OUR numbers?
If the signal could be connected to specific inputs (supplier costs, customer demand, production capacity, regulatory timelines) that drive company KPIs—the answer is yes.
Question 3: Did We Have Time to Adjust?
If we’d detected and quantified the disruption earlier, was there enough lead time to take action? If the gap between signal emergence and quarterly results was measured in weeks—the answer is yes.

When all three answers are “yes,” the explanation shifts. It’s no longer “we couldn’t have known.” It becomes “we didn’t have the visibility we needed” or “we weren’t monitoring the right sources.”
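The three-question test can be sketched as a simple triage function. This is an illustrative sketch of the framework described above, not a reference to any real tool; the parameter names and return labels are our own.

```python
def attribution_test(signals_available: bool,
                     impact_quantifiable: bool,
                     time_to_adjust: bool) -> str:
    """Apply the three-question test to a quarterly miss
    attributed to an external disruption."""
    if signals_available and impact_quantifiable and time_to_adjust:
        # All three answers are yes: the miss was detectable, not unforeseeable.
        return "undetected"
    # At least one genuine gap: the miss was a true surprise.
    return "unforeseeable"

# Example: signals existed, were quantifiable, and lead time was weeks.
verdict = attribution_test(True, True, True)
print(verdict)  # → undetected
```

The point of the framing is the asymmetry: a single honest “no” preserves the old explanation, while three “yes” answers convert an external story into an internal capability question.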

And of course, there will still be surprises where the answer to these questions is “no.” But those instances will become less frequent as more companies adapt to monitoring the context their business operates within. That’s a different kind of accountability.

Real Examples: When All Three Answers Could Have Been “Yes”
Let’s apply this three-question test to actual 2025 cases. (Note: information is drawn from executives’ public comments, analyst meetings, investor presentations, and similar sources.)
Brown-Forman: The Canada De-Listing
The Miss: Q3 FY25 operating income fell 25% ($93 million quarterly hit). Management cited Canada removing U.S. liquor from store shelves and accelerating consumer downtrading.
Question 1 – Were signals available?
Yes. Canadian liquor control board procurement trends were shifting weeks earlier. U.S. retailer point-of-sale data showed pack-size mix changes throughout Q2. These weren’t secret—they were visible in retail data and trade channels.
Question 2 – Could impact be quantified?
Yes. Canada represents ~1% of Brown-Forman sales, and procurement data would have quantified the revenue at risk. U.S. POS data showing consumer shift to smaller bottle sizes could have been translated into margin impact estimates.
Question 3 – Was there time to adjust?
Yes. Signals existed 8-12 weeks before Q3 results. Brown-Forman could have redirected Canadian inventory, accelerated promotions on larger bottle sizes, or pre-announced guidance adjustments.
The implication: This wasn’t unforeseeable. The signals existed, were quantifiable, and emerged with enough lead time to enable response. The gap wasn’t information availability—it was monitoring capability.
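As a back-of-envelope illustration of Question 2, the revenue at risk from a Canada de-listing can be approximated from the ~1% sales exposure cited above. The annual sales figure below is a placeholder assumption for illustration, not a reported Brown-Forman number, and the sketch covers only the de-listing exposure, not the downtrading margin effect.

```python
# Hypothetical quantification sketch for the Canada de-listing risk.
# annual_sales_musd is a placeholder; the ~1% Canada share is from the case above.
annual_sales_musd = 4_000          # assumed annual sales, $M (placeholder)
canada_share = 0.01                # ~1% of sales, per the case study
delisting_severity = 1.0           # assume a full removal from shelves

revenue_at_risk_musd = annual_sales_musd * canada_share * delisting_severity
quarterly_risk_musd = revenue_at_risk_musd / 4

print(f"Annual revenue at risk: ${revenue_at_risk_musd:.0f}M")
print(f"Per-quarter exposure:   ${quarterly_risk_musd:.0f}M")
```

Even this crude arithmetic is the difference between a vague worry and a number a planning team can act on: once exposure is expressed in dollars per quarter, it can be ranked against other risks and trigger a concrete response.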

Acushnet: The Vietnam Manufacturing Transition
The Miss: $18 million in restructuring costs and “logistical challenges” from the footwear manufacturing move to Vietnam, disclosed in Q4 2024 and Q1 2025.
Question 1 – Were signals available?
Yes. Production throughput at the new Vietnam facility, quality metrics, and distributor lead-time trends were building throughout Q2-Q3 2024. These operational signals—both internal metrics that weren’t escalated and external distributor feedback—existed in real time.
Question 2 – Could impact be quantified?
Yes. Production shortfalls, quality issues, and extended lead times translate directly into restructuring costs, lost sales, and margin pressure. The connection between operational friction and financial impact was quantifiable.
Question 3 – Was there time to adjust?
Yes. The challenges were building for months before disclosure. Acushnet could have delayed the China exit, accelerated technical support to Vietnam, or adjusted inventory buffers—preserving an estimated $7-10 million in combined costs and revenue.
The implication: The transition complexity wasn’t the surprise—it was predictable from operational signals. The gap was early-warning visibility into those signals while there was still optionality.

Bruker: The Q2 2025 Margin Collapse
The Miss: Q2 2025 operating margin fell to 9% (from 15.4% in Q2 2024), driven by weak academic/research demand and tariff costs.
Question 1 – Were signals available?
Yes. NIH and NSF grant award data, academic budget trends, and customer quotation pipeline metrics were all accessible. Trade publication coverage of research funding pressures was extensive in Q1 2025.
Question 2 – Could impact be quantified?
Yes. Research funding trends correlate directly with scientific instrument demand. Tariff cost impacts on component sourcing could be calculated from supplier pricing and bill-of-materials data.
Question 3 – Was there time to adjust?
Yes. Signals were building in Q1 2025. Bruker could have accelerated the $120M cost savings program, implemented temporary production curtailments, or pre-adjusted Q2 guidance—preserving an estimated $15-20 million in operating income.
The implication: Academic funding weakness wasn’t sudden. The signals existed months earlier in grant data and customer pipeline metrics. The gap was connecting those external signals to Bruker’s internal forecast.

Why This Shift Is Happening Now
Three forces are converging to raise the accountability standard:
1. Information Democratization
What used to require expensive consulting studies or specialized research teams is now accessible: regulatory filings, earnings transcripts, trade publications, industry data, social media postings and other unstructured data streams, macroeconomic indicators. The raw material for external monitoring exists at unprecedented scale and accessibility.
2. Technology Maturation
Tools now exist to monitor tens of thousands of external signals, identify correlations to specific company inputs, and quantify impacts—at speed and scale that wasn’t feasible even five years ago. The technical barriers to closing the input visibility gap are falling.
3. Stakeholder Expectations
Boards and investors increasingly expect proactive risk management, not reactive explanation. In a world where information exists and technology enables monitoring, “we didn’t see it” sounds less like bad luck and more like insufficient capability.
The combination means that what was once genuinely unforeseeable is now potentially visible—if you have a disciplined, comprehensive process in place.

What This Means for Planning
This shift creates both pressure and opportunity:
The Pressure: Executives can no longer rely on external attribution for all quarterly misses. When signals existed, were quantifiable, and there was time to act—“we didn’t see it coming” invites the harder question: Why weren’t we looking?
The Opportunity: Organizations that build external signal monitoring capability gain a meaningful advantage. Not perfect prediction—but earlier visibility into changes impacting plan assumptions, with quantified impact estimates, and enough lead time to decide whether to adjust or prepare.
The playbook is emerging:
• Monitor a vast array of external signals systematically (not just the “obvious” ones)
• Identify which signals correlate with YOUR specific inputs
• Quantify how changes in those signals impact YOUR plan
• Detect shifts weeks before they appear in internal dashboards
• Create decision windows: adjust actions or prepare communications
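The playbook steps above can be sketched as a minimal monitoring loop: score how far each tracked external signal has drifted from its baseline, and flag those that cross a threshold for a plan-impact review. The signal names, impact weights, and threshold here are illustrative assumptions, not real data or a real product’s API.

```python
# Minimal early-warning sketch: flag external signals that drift
# beyond a threshold relative to their baseline. All values are
# illustrative assumptions.

# (signal name, baseline index, latest index, est. $M plan impact per 1% shift)
signals = [
    ("supplier_input_cost_index",   100.0, 108.0, 0.5),
    ("customer_parent_capex_index", 100.0,  99.0, 0.3),
    ("grant_awards_run_rate",       100.0,  88.0, 0.8),
]

ALERT_THRESHOLD_PCT = 5.0  # flag shifts larger than 5% from baseline

def scan(signals, threshold_pct):
    """Return (name, pct_shift, est_impact_musd) for signals past threshold."""
    alerts = []
    for name, baseline, latest, impact_per_pct in signals:
        shift_pct = (latest - baseline) / baseline * 100
        if abs(shift_pct) > threshold_pct:
            alerts.append((name, shift_pct, abs(shift_pct) * impact_per_pct))
    return alerts

for name, shift, impact in scan(signals, ALERT_THRESHOLD_PCT):
    print(f"{name}: {shift:+.1f}% shift, ~${impact:.1f}M plan impact")
```

The design choice that matters is the last column: each signal is mapped to an estimated dollar impact on the plan, so an alert arrives pre-quantified (Question 2) rather than as raw news, and the threshold creates the decision window (Question 3) before the shift appears in internal dashboards.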
This isn’t about eliminating surprises entirely. External disruptions will always happen. But the distinction between “unforeseeable” and “undetected” matters more now than it used to.

The Choice Ahead
Here’s the uncomfortable question every planning team now faces:
When the next external disruption affects your quarter—and the board asks whether the signals existed, whether impact was quantifiable, and whether there was time to act—what will your answer be?
If the answer is “yes, yes, yes,” the follow-up is inevitable: Why didn’t we see it?
You can continue operating with the tools designed for the old standard—internal dashboards that show results after they’ve happened. Or you can build the capability the new standard demands: external signal monitoring that shows input changes as they emerge, allowing better management decisions.
The gap between what’s expected and what’s feasible is closing. The technology exists. The data exists. The question is whether you’ll prioritize building this capability before the next earnings call where you have to explain why you didn’t.
________________________________________
In our next post, we’ll shift from problem to solution: What can you actually DO with weeks of advance warning on input changes? We’ll walk through the Early Warning Playbook—the specific actions executives take when they see disruptions coming instead of discovering them after the fact.
________________________________________
The disruptions that will affect your next quarter are building right now. The signals exist. The question isn’t whether you COULD see them—it’s whether you WILL.