Work With Me — WhyThat.ai

Work With Me

Training, advisory, and model development for teams building causal AI capability internally.

Who I Am

Marc Vandenplas brings 25 years of international experience, at program and director level, in Causal AI, quantitative financial and operational risk modeling, information security, infrastructure analytics, operations research, business engineering, and change management.

Credentials: Stanford AI · Johns Hopkins Data Science · MBA

Selected clients: UCSF · Zuckerberg General Hospital · NVIDIA · Amazon · PG&E · Merrill Lynch · Chevron

My Approach

Most consultants deliver a model and disappear. Six months later, no one remembers how it works. I do the opposite: every engagement includes hands-on training so your team can own, explain, and evolve the model independently.

Differentiators

  • Your team owns the model: knowledge transfer is built into every engagement.
  • Tool-agnostic guidance: recommendations are based on your requirements, not vendor relationships.
  • Executive and technical fluency: I bridge executive strategy and technical implementation.

You Might Be a Good Fit If…

  • Your decisions need to be explainable. Regulators, boards, or stakeholders ask "why?" and you need better answers than "the model said so."
  • You're tired of analytics that describe but don't prescribe. You have plenty of dashboards; what you need is guidance on what to do.
  • You're worried about losing institutional knowledge. Key experts are approaching retirement, or knowledge is siloed in ways that create risk.
  • You've been burned by black-box AI. You invested in sophisticated models that no one trusts or uses because no one understands them.
  • You need to test scenarios before committing. The cost of being wrong is high, so you want to simulate interventions before you make them.

I work best with mid-market and enterprise organizations (500–10,000 employees) in regulated industries: financial services, healthcare, insurance, and critical infrastructure.

Engagements

Training & Workshops

Half-day to multi-day programs teaching Causal Networks, causal inference, and decision modeling.

On-site or virtual. Hands-on exercises with real tools.

Advisory & Consulting

Strategic guidance on implementing Causal Networks in your organization.

I help you scope projects, select tools, and build internal capability.

Model Development

I build custom Causal Networks for your domain — risk, customer analytics, operations, compliance.

Knowledge transfer to your team included.

Engagement Tiers

Tier | Scope | Best For
Workshop | 1–3 days of focused training | Teams learning Causal Network fundamentals
Project | Scoped engagement to build a specific model | Organizations with a defined use case
Fractional | Ongoing part-time support (4–16 hrs/month) | Organizations building long-term capability

Common Concerns

If you're skeptical, that's reasonable. Most "AI" promises have underdelivered. Here are the questions I hear most often:

"We've invested in AI before and didn't see returns. Why would this be different?"

Most enterprise AI is built to find patterns — which is useful for some things, but not for decisions. Pattern-matching tells you what happened; causal models tell you what to do. Different tool, different purpose.

"Our team isn't technical enough for this."

If your team can draw a flowchart showing how your business works — "this causes that, which affects this other thing" — they can learn to build causal models. The underlying math is handled by software. The hard part is knowing which arrows to draw, and that's domain expertise, not programming.
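To make that concrete, a causal diagram really is just the flowchart written down: nodes and directed arrows. A minimal sketch in plain Python, with entirely hypothetical variable names from an imagined claims process:

```python
# A causal diagram is "which arrows to draw": nodes and directed edges.
# These variable names are hypothetical, for illustration only.
causes = {
    "staffing_level":   ["processing_time"],
    "claim_complexity": ["processing_time", "approval_rate"],
    "processing_time":  ["customer_satisfaction"],
}

def downstream(graph, node, seen=None):
    """Return everything a change to `node` could ripple through."""
    seen = set() if seen is None else seen
    for child in graph.get(node, []):
        if child not in seen:
            seen.add(child)
            downstream(graph, child, seen)
    return seen

print(sorted(downstream(causes, "claim_complexity")))
# → ['approval_rate', 'customer_satisfaction', 'processing_time']
```

Deciding that `claim_complexity` points at `approval_rate` (and not the reverse) is the domain-expertise step; once the arrows are in place, the software does the math.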

"How is this different from the analytics / BI tools we already have?"

Traditional analytics answers "what happened." Predictive models answer "what might happen next." Causal models answer "what happens if we act" and "what would have happened if we'd acted differently." The difference is intervention.
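The distinction shows up clearly in simulation. Below is a toy structural model with made-up numbers (season drives both price and demand) that contrasts conditioning on observed data with forcing an intervention:

```python
import random
import statistics

random.seed(0)

def simulate(n=20_000, do_price=None):
    """Toy structural model: season -> price, season -> demand,
    price -> demand. Setting do_price forces the price, severing
    the season -> price arrow (an intervention, in Pearl's sense).
    All numbers are illustrative, not from any real engagement."""
    rows = []
    for _ in range(n):
        season = random.random()                     # common cause
        price = 5 + 4 * season if do_price is None else do_price
        demand = 100 - 2 * price + 30 * season + random.gauss(0, 1)
        rows.append((price, demand))
    return rows

obs = simulate()
seen_high = statistics.mean(d for p, d in obs if p > 7)   # "what happened"
seen_low  = statistics.mean(d for p, d in obs if p <= 7)

did_high = statistics.mean(d for _, d in simulate(do_price=8))  # "what if we act"
did_low  = statistics.mean(d for _, d in simulate(do_price=6))

# In the observed data, high prices coincide with high demand, because
# peak season raises both. Under intervention, raising the price lowers
# demand — the opposite of what the dashboard suggests.
print(seen_high > seen_low, did_high < did_low)   # True True
```

A dashboard built on the observational data would report that higher prices accompany higher demand; only the interventional question reveals what a price change would actually do.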

"This sounds like it requires a lot of up-front work."

It does — and that's a feature, not a bug. The process of building a causal model forces your team to articulate their assumptions explicitly. Often, this surfaces hidden disagreements. You can start small: a model with 5–10 variables can still deliver real insight.

"What if we get the causal structure wrong?"

You will, at first — and that's okay. Causal models are meant to be revised. The model makes your assumptions visible, so you can see where they break down. A wrong-but-visible assumption is much easier to fix than a wrong-and-hidden one.

"Do we need to hire new people?"

Your current analysts can learn this. The concepts aren't harder than regression or A/B testing — they're just different. A typical team gets productive within a few weeks of training.

Toolkit

These are the tools I use and recommend:

Bayes Server

Enterprise-grade Causal Networks with full causal ladder support

Netica

Accessible, visual network building for research and education

GeNIe / QGeNIe

BayesFusion's graphical modeling environment with qualitative extensions

Agena.ai

Cloud-based Bayesian network modeling with decision support

JASP

Free Bayesian statistics for analysts who want more than p-values

Orange

Visual data science without code

bnlearn

The complete R package for structure learning and inference

DPL

Decision analysis software for strategic planning

TreePlan

Excel add-in for decision trees

Tool Selection

The right tool depends on your use case, team skills, and integration requirements. I help you evaluate options and build capability with whichever platform fits your needs.

Let's Talk

Every engagement starts with an initial conversation to assess fit: a focused discussion about whether Causal Networks address your specific requirements. If they do, we scope a workshop, project, or fractional engagement. If they don't, I will recommend a more suitable approach.

Begin the Conversation

Whether you're evaluating Causal Networks for the first time or ready to develop production models, I welcome your inquiry.

Book a Call

or email: [email protected]