Validating Your Product

Why Surveys Rarely Validate Products

Stated vs revealed behavior, survey optimism bias, and incentive distortion. Why behavior-first validation outperforms asking questions.

Surveys measure opinions. Products need behaviors.

The gap between what people say they'll do and what they actually do is one of the most replicated findings in behavioral science. Yet surveys remain the most common validation method for new products. This mismatch produces validation data that is systematically optimistic.

Stated vs revealed behavior

Stated behavior: "I would definitely use a product that does X." Revealed behavior: The same person doesn't download, try, or pay for a product that does X.

The gap is not dishonesty — it's a genuine cognitive limitation. People cannot accurately predict their own future behavior, especially for products they haven't used.

Survey optimism bias

Surveys consistently overestimate:

- Purchase intent (by 40-60% on average)
- Willingness to pay (by 20-40%)
- Frequency of use (by 30-50%)
- Switching likelihood (by 50-70%)

This isn't a design flaw — it's structural. Answering a survey costs nothing, requires no trade-offs, and activates aspirational rather than practical thinking.
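One practical implication: stated intent can be treated as an upper bound and discounted by the overestimation band before it informs any forecast. The sketch below is illustrative only — the function name is hypothetical and the numbers come from the ranges cited above, not from a validated model.

```python
def discount_stated_intent(stated_rate, overestimate_low, overestimate_high):
    """Return a (low, high) range for likely real conversion.

    If surveys overestimate by fraction f, the real rate is roughly
    stated_rate * (1 - f). Applying both ends of the assumed
    overestimation band yields a pessimistic-to-optimistic range.
    """
    return (stated_rate * (1 - overestimate_high),
            stated_rate * (1 - overestimate_low))

# Example: 30% of respondents say they'd buy; purchase intent is
# overestimated by 40-60%, the range quoted above.
low, high = discount_stated_intent(0.30, 0.40, 0.60)
print(f"Adjusted purchase intent: {low:.0%}-{high:.0%}")
# prints "Adjusted purchase intent: 12%-18%"
```

Even this crude discount turns a headline "30% would buy" into a 12-18% planning range — a materially different business case.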

Incentive distortion

Survey respondents are typically incentivized (directly or indirectly) to complete the survey. This creates:

- Completion bias (answering to finish, not to be accurate)
- Positivity bias (positive answers feel more "helpful")
- Social desirability bias (answering in ways that present the respondent favorably)

The behavior-first validation model

Instead of asking, observe:

  1. Search behavior: Are people searching for solutions to this problem?
  2. Current workarounds: Are people cobbling together solutions from existing tools?
  3. Spending behavior: Are people paying for inferior alternatives?
  4. Time investment: Are people spending significant time on manual processes that your product would automate?

These behavioral signals are harder to collect but dramatically more predictive.
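The four signals above can be run as a simple checklist before committing to a build. This is a minimal sketch under assumed conventions — the class and field names are hypothetical, and a real assessment would weight signals rather than count them equally.

```python
from dataclasses import dataclass

@dataclass
class BehavioralSignals:
    # Each field mirrors one signal from the list above.
    search_volume: bool       # people searching for solutions to the problem
    workarounds_found: bool   # cobbled-together solutions observed in the wild
    paying_for_inferior: bool  # spending on worse alternatives
    manual_time_sunk: bool    # significant time spent on manual processes

    def score(self) -> int:
        """Count how many behavioral signals are present (0-4)."""
        return sum([self.search_volume, self.workarounds_found,
                    self.paying_for_inferior, self.manual_time_sunk])

signals = BehavioralSignals(search_volume=True, workarounds_found=True,
                            paying_for_inferior=False, manual_time_sunk=True)
print(f"Behavioral validation score: {signals.score()}/4")
# prints "Behavioral validation score: 3/4"
```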

Interview redesign framework

If you must talk to potential users:

- Ask about past behavior, not future intent ("When did you last...?" not "Would you...?")
- Ask about current pain, not hypothetical solutions
- Look for emotional intensity — frustrated complaints are stronger signals than polite interest
- Ask what they've already tried and what failed
- Never describe your solution before understanding their problem

How this decision shapes execution

Products validated by surveys are optimized for stated preferences rather than revealed behavior. The feature set reflects what people said they wanted, not what they'll actually use. An execution path built on survey data produces products that test well in research and fail in the market — because the market rewards behavior, not opinions.

Related Decision Framework

This article is part of a decision framework.

The Validate or Pretend decision covers the structural question behind this topic. If you are facing this decision now, the full framework is here.

Read the Validate or Pretend framework →

Working through this decision?

Start with a Clarity Sprint →