Worked Example

See It Work.

A bank's data blamed support calls for churn. The causal model revealed the opposite. Walk through all three rungs of Pearl's Ladder.

The Pattern That Fooled a Bank

A bank's analytics team noticed something troubling: customers who called support were three times more likely to close their accounts. The correlation was strong and consistent.

The obvious conclusion: support calls drive churn. Maybe the support experience is bad. Maybe calling is a sign of deeper problems. Either way, perhaps the bank should make it harder to reach support.

But something didn't sit right. The support team pushed back. They knew they were helping customers. So the analysts dug deeper.

What They Found

When they separated satisfied customers from dissatisfied customers, the pattern reversed:

  • Among satisfied customers: those who called churned less (3% vs 5%)
  • Among dissatisfied customers: those who called churned less (50% vs 70%)

In every segment, calling support reduced churn. But in the aggregate data, callers churned more. How is that possible?

The answer: dissatisfied customers are more likely to call support and more likely to churn. Dissatisfaction was the hidden cause driving both behaviors. The aggregate data blamed support calls for something they didn't cause.

This is Simpson's Paradox, and it's far more common than most analysts realize. The causal model revealed it; correlation alone would have led to exactly the wrong decision.
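The arithmetic behind the reversal is easy to verify. A minimal sketch with hypothetical counts chosen to match the segment rates above (the real figures are the bank's own); the reversal appears because dissatisfied customers both call far more often and churn far more often:

```python
# Hypothetical counts matching the segment rates in the text:
# 3% vs 5% among satisfied, 50% vs 70% among dissatisfied.
segments = {
    # segment: {called: (customers, churned)}
    "satisfied":    {True: (100, 3),   False: (900, 45)},
    "dissatisfied": {True: (800, 400), False: (200, 140)},
}

def rate(pairs):
    """Pooled churn rate over a list of (customers, churned) pairs."""
    total = sum(n for n, _ in pairs)
    churned = sum(c for _, c in pairs)
    return churned / total

# Within each segment, callers churn LESS...
for name, seg in segments.items():
    called = seg[True][1] / seg[True][0]
    not_called = seg[False][1] / seg[False][0]
    print(f"{name}: called {called:.0%} vs not called {not_called:.0%}")

# ...but in the aggregate, callers churn MORE: Simpson's Paradox.
callers = rate([seg[True] for seg in segments.values()])
non_callers = rate([seg[False] for seg in segments.values()])
print(f"aggregate: called {callers:.0%} vs not called {non_callers:.0%}")
```

With these counts the aggregate comes out to roughly 45% churn among callers versus 17% among non-callers, close to the "three times more likely" pattern the analysts first saw.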

The correct action: encourage support calls, especially for at-risk customers. What looked like a problem was actually the solution.

Climb the Ladder

Here's how the same data looks at each rung of Pearl's Ladder — from correlation to intervention to counterfactual.

  • Seeing (Association): "What happened?" — Correlation can answer this.
  • Doing (Intervention): "What if I act?" — Only causal models can answer this.
  • Imagining (Counterfactual): "What if I had done differently?" — Only structural causal models can answer this.

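The gap between the first two rungs can be made concrete. A sketch with illustrative probabilities consistent with the example, assuming satisfaction is the only confounder: rung one conditions on calling, while rung two applies the backdoor adjustment P(churn | do(call)) = Σ_s P(churn | call, s) P(s).

```python
# Illustrative probabilities (assumed, not the bank's real model).
p_sat = {"satisfied": 0.5, "dissatisfied": 0.5}             # P(S)
p_call_given_sat = {"satisfied": 0.1, "dissatisfied": 0.8}  # P(call | S)
p_churn = {  # P(churn | S, call)
    ("satisfied", True): 0.03, ("satisfied", False): 0.05,
    ("dissatisfied", True): 0.50, ("dissatisfied", False): 0.70,
}

# Rung 1 -- Seeing: P(churn | call), confounded by satisfaction.
joint_call = sum(p_sat[s] * p_call_given_sat[s] for s in p_sat)
seeing = sum(p_sat[s] * p_call_given_sat[s] * p_churn[(s, True)]
             for s in p_sat) / joint_call

# Rung 2 -- Doing: P(churn | do(call)) via backdoor adjustment over S.
doing = sum(p_sat[s] * p_churn[(s, True)] for s in p_sat)
doing_no = sum(p_sat[s] * p_churn[(s, False)] for s in p_sat)

print(f"P(churn | call)        = {seeing:.1%}")  # inflated by confounding
print(f"P(churn | do(call))    = {doing:.1%}")
print(f"P(churn | do(no call)) = {doing_no:.1%}")
```

Conditioning says callers churn at about 45%; intervening says making a customer call lowers churn from roughly 38% to 27%. Same data, opposite decision.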
Make It Queryable

The technical interface is powerful, but not everyone needs to use it directly. An LLM layer lets anyone on your team ask questions in plain English and get answers grounded in the causal model.

💬 Churn Analysis Assistant
This customer churned without calling support. Would a call have saved them?

Based on this customer's profile, I ran a counterfactual analysis:

  • Without call: 100% churn (what happened)
  • With call: 50% churn probability

Conclusion: Yes — a support call would have given this customer a 50-50 chance of staying.

How many similar customers do we have right now?

Checking your current customer base… 847 customers match this risk profile and haven't contacted support recently. Want me to generate a prioritized outreach list?
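Under the hood, a counterfactual query like the assistant's follows Pearl's abduction, action, prediction recipe: infer the exogenous noise consistent with what actually happened, flip the decision, and replay. A minimal Monte Carlo sketch with made-up parameters (not the assistant's actual model, so its answer differs from the 50% above):

```python
import random

# Toy structural model for one dissatisfied customer (illustrative only):
# churn = 1 exactly when exogenous noise u falls below the churn rate
# for (dissatisfied, called?).
P_CHURN = {False: 0.70, True: 0.50}  # P(churn | dissatisfied, call)

def counterfactual_churn(observed_call, observed_churn, n=200_000, seed=0):
    """Abduction-action-prediction: given what we saw, what if call flipped?"""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        u = rng.random()                       # exogenous noise U_Y
        factual = u < P_CHURN[observed_call]
        if factual == observed_churn:          # abduction: keep consistent u
            # action: flip the call; prediction: replay the same u
            outcomes.append(u < P_CHURN[not observed_call])
    return sum(outcomes) / len(outcomes)

# Customer churned without calling. Would they have churned had they called?
p = counterfactual_churn(observed_call=False, observed_churn=True)
print(f"counterfactual churn probability with a call: {p:.0%}")
```

With these parameters the answer is about 71% (analytically 0.50/0.70): even knowing the customer did churn, a call would have given them a real chance of staying. The product's actual numbers would come from its fitted model, not these toy rates.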

A Week in the Life

What does it actually look like when your team has causal modeling capability? A composite week drawn from real engagements:

Monday — Marketing Attribution

The marketing director wants to know if last month's email campaign actually drove conversions or just correlated with a seasonal uptick. An analyst queries the causal model, separates the campaign effect from seasonality, and reports back by lunch: "The campaign drove a 4.2% lift, controlling for seasonal effects."

Tuesday — Regulatory Explanation

Legal needs to explain to regulators why a loan application was declined. Pull up the causal model, show exactly which factors contributed, by how much, and why they're causally relevant. Twenty minutes instead of two days.

Wednesday — Strategy Scenario

The executive team is evaluating an acquisition. The analyst runs three scenarios through the causal model: optimistic, neutral, and pessimistic. Each shows expected churn rates with uncertainty bands.

Thursday — Debugging a Decision

A process change last quarter was supposed to reduce costs but didn't. The counterfactual analysis reveals the change did reduce costs — but supplier price increases masked the effect. Without the change, costs would have risen 8%.

Friday — Knowledge Transfer

A senior underwriter is retiring next month. She works with an analyst to encode her reasoning into a causal model. Next year's new hires will query that model and get answers reflecting decades of expertise.


A Medical Counterfactual, Step by Step


"Would this patient have avoided hospitalization if we had prescribed statins?" A full walkthrough of the counterfactual pipeline — from global model to patient-specific answer. See the walkthrough →

Ready to see what this looks like for your data?

A short conversation is the fastest way to find out whether causal networks fit your problem.

Book a Call

or email: [email protected]