Making consequential bets when the data does not exist

Mosaic Consulting · 7 min read · January 2026

The standard toolkit of strategic decision-making was built for a world that offers data. Market research. Historical performance curves. Competitive benchmarking. Scenario models that can be calibrated against known parameters. These tools work reasonably well when the question being answered has been answered before, in similar contexts, with measurable results. They become unreliable, and sometimes actively misleading, when the situation is genuinely novel.

What makes a situation genuinely novel is not just the absence of data. It is the absence of a framework that would tell you what data to look for. In stable markets, even uncertain decisions have a structure: you are estimating the probability of outcomes you can enumerate. In genuinely novel situations, you cannot enumerate the outcomes. The space of possibilities is undefined. Standard probability-based reasoning begins to fail precisely when the stakes are highest.

The problem with frameworks

Strategic frameworks are designed to produce structure when you apply them to a situation. That is their purpose, and it is genuinely useful in situations where the structure is actually there and the framework is surfacing it. The problem arises when the situation does not have the structure the framework assumes. Frameworks applied in these conditions do not reveal the shape of the decision; they impose a false shape on it.

This shows up in practice as analysis that is technically rigorous but produces misplaced confidence. The market sizing looks precise. The competitive matrix looks comprehensive. The financial model has narrow confidence intervals. And underneath all of it sits a set of assumptions about how the market works, how competitors will respond, and what customers actually want. Nobody has been able to verify those assumptions, because the market does not yet exist, or exists in a form that has no historical analogue.

Experienced executives recognize this pattern. They can feel the difference between analysis that is genuinely illuminating and analysis that is filling space. What they often struggle with is how to act under genuine uncertainty without either paralysis or unfounded confidence. Neither of those serves the organization.

What good judgment looks like in the absence of data

Across the leaders we have watched navigate novel decisions well, a set of practices emerges that is distinct from standard decision analysis. These are not frameworks so much as disciplines, ways of thinking that reduce the probability of bad outcomes without requiring data that does not exist.

They separate the reversible from the irreversible. The most consequential aspect of a decision under uncertainty is often not the expected value of the outcome but the degree to which a bad outcome can be corrected. Decisions that foreclose options are genuinely different from decisions that open them, even if the expected values appear similar. Experienced decision-makers devote disproportionate attention to reversibility, not because they expect to be wrong, but because they take seriously that they might be.

They reason from mechanisms, not from correlations. When historical data is thin or absent, correlational patterns from prior situations have limited predictive value. What remains useful is causal reasoning: why would this outcome occur? What mechanism produces it? If you can trace the causal chain clearly enough, you can sometimes make confident predictions about novel situations without historical data to anchor them. This requires genuine understanding of the underlying dynamics, not pattern matching from surface features.

They seek disconfirmation actively. The standard approach to building a business case involves marshaling evidence that supports the decision. Under genuine uncertainty, this produces a selection effect: you find the evidence you are looking for and miss what does not fit. The leaders who navigate uncertainty better have a deliberate practice of looking for reasons they might be wrong. Not performatively, but genuinely. They assign someone whose job is to make the strongest possible case against the decision, and they take that case seriously.

They are precise about what they do not know. Uncertainty is not uniform. Within a novel situation, some things are genuinely unknown, and some things can be known with reasonable confidence if the right work is done. The disciplined response to uncertainty involves mapping it carefully: which of the key assumptions are most uncertain, which uncertainties matter most for the outcome, and which can be reduced with targeted investigation. This is different from generic acknowledgment that the situation is uncertain.

The organizational dimension

Individual judgment under uncertainty is only part of the challenge. Decisions of consequence in large organizations are made by groups, and the group dynamics that govern how decisions get made often work systematically against good judgment under uncertainty.

Organizations have strong incentives to produce certainty, even where none exists. Presenting a decision to a board or an executive committee with genuine acknowledgment of irreducible uncertainty is uncomfortable for both sides. The pressure to convert uncertainty into risk, to give everything a probability and a range, produces false precision that serves organizational comfort more than decision quality.

The leaders who navigate this well are explicit about the difference between quantified risk and genuine uncertainty. They resist the demand to produce precise numbers when the underlying situation does not support them. This requires both the credibility to make the argument and the willingness to accept the organizational discomfort it creates.

There is also a sequencing discipline that matters. When the key unknowns have been identified, the question becomes: which of these can be reduced before a commitment is required, and at what cost? The decision to invest in resolving uncertainty before making the primary decision is itself a decision that benefits from analysis. Not all uncertainty can be reduced before you have to act, but some can, and the organizations that act prematurely on avoidable uncertainty pay for it unnecessarily.

Why experience matters here

The practices described above are learnable, but they are not easily acquired from instruction alone. They are refined through repeated exposure to situations where the consequences of getting it wrong are real and visible. Executives who have made consequential decisions under genuine uncertainty, and who have honestly tracked the outcomes over time, develop a calibration that is difficult to acquire any other way.

This is partly why the quality of strategic advice degrades sharply in novel situations. The value of external input is highest when the advisors have actually been through situations with similar structural features, not similar surface features, and have observed what determined the outcomes. Pattern recognition built from lived experience is genuinely different from pattern matching from case studies or industry reports. The former is calibrated against reality. The latter is calibrated against how situations got written up after the fact.

None of this resolves the fundamental challenge of consequential decisions without data. But it changes the nature of the task from trying to manufacture certainty to operating well under conditions of genuine uncertainty. That is a more honest framing, and it tends to produce better decisions.
