Differentiable Constraint-Based Causal Discovery

Abstract

Causal discovery from observational data is a fundamental task in artificial intelligence, with far-reaching implications for decision-making, prediction, and intervention. Despite significant advances, existing methods remain broadly divided into constraint-based and score-based approaches. Constraint-based methods offer rigorous causal discovery but are often hindered by small sample sizes, while score-based methods provide flexible optimization but typically forgo explicit conditional independence testing. This work explores a third avenue: developing differentiable d-separation scores, obtained through percolation theory using soft logic. This enables a new type of causal discovery method: gradient-based optimization of conditional independence constraints. Empirical evaluations demonstrate the robust performance of our approach in low-sample regimes, surpassing traditional constraint-based and score-based baselines on a real-world dataset. Code and data for the proposed method are publicly available at https://github.com/PurdueMINDS/DAGPA.
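
To make the core idea concrete, the sketch below shows, in PyTorch, how graph connectivity can be relaxed with soft logic (products as AND, noisy-OR as OR) so that a connectivity score becomes differentiable and independence-style constraints can be pushed toward satisfaction by gradient descent. This is an illustrative assumption of how such a score could look, not the implementation in the linked DAGPA repository: it relaxes plain directed reachability rather than full d-separation (no colliders or conditioning sets), does not enforce acyclicity, and the name `soft_reach` and the toy loss are invented for the example.

```python
import torch


def soft_reach(adj, n_iters=None):
    """Differentiable, percolation-style reachability score (illustrative only).

    adj[i, j] in [0, 1] is a soft probability that the edge i -> j is "open".
    Soft AND is the product; soft OR is the noisy-OR (probabilistic sum).
    Reachability is relaxed by repeatedly OR-ing in one-step path extensions.
    NOTE: this relaxes plain reachability, not true d-connection.
    """
    d = adj.shape[0]
    n_iters = n_iters if n_iters is not None else d
    reach = adj.clone()
    for _ in range(n_iters):
        # Probability that i reaches j through some intermediate node k:
        # OR_k ( reach[i, k] AND adj[k, j] ), with a noisy-OR over k.
        two_hop = 1.0 - torch.prod(1.0 - reach.unsqueeze(2) * adj.unsqueeze(0), dim=1)
        # OR the new paths into the current reachability estimate.
        reach = 1.0 - (1.0 - reach) * (1.0 - two_hop)
    return reach


# Toy usage with hypothetical targets: learn edge logits so that node 0 stays
# connected to node 1 but becomes (softly) disconnected from node 2.
logits = torch.zeros(3, 3, requires_grad=True)
optimizer = torch.optim.Adam([logits], lr=0.1)
off_diag = 1.0 - torch.eye(3)  # forbid self-loops

for step in range(200):
    adj = torch.sigmoid(logits) * off_diag
    reach = soft_reach(adj)
    # "Independence" constraint: drive reach[0, 2] toward 0;
    # "dependence" constraint: drive reach[0, 1] toward 1.
    loss = reach[0, 2] + (1.0 - reach[0, 1])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

A full constraint-based procedure would additionally tie such scores to conditional independence tests estimated from data and enforce acyclicity of the learned graph; both are omitted from this sketch.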

Publication
NeurIPS 2025
Jincheng Zhou
Ph.D. Student

My research interests include graph representation learning and its applications to knowledge graphs, causal inference and causal structure discovery, meta-learning, cognitive architectures, and artificial general intelligence.