pgmpy is a Python library for causal and probabilistic modeling using graphical models. It provides a uniform API for building, learning, and analyzing models such as Bayesian Networks, Dynamic Bayesian Networks, Directed Acyclic Graphs (DAGs), and Structural Equation Models (SEMs). By integrating tools from both probabilistic inference and causal inference, pgmpy enables users to seamlessly transition between predictive and interventional analyses.
| Feature | Description |
|---|---|
| Causal Discovery / Structure Learning | Learn the model structure from data, with optional integration of expert knowledge. |
| Causal Validation | Assess how compatible the causal structure is with the data. |
| Parameter Learning | Estimate model parameters (e.g., conditional probability distributions) from observed data. |
| Probabilistic Inference | Compute posterior distributions conditioned on observed evidence. |
| Causal Inference | Compute interventional and counterfactual distributions using do-calculus. |
| Simulations | Generate synthetic data under specified evidence or interventions. |
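As a minimal sketch of the Probabilistic Inference and Causal Inference rows above (written for this overview rather than taken from the pgmpy documentation, and assuming the pgmpy 1.x class names DiscreteBayesianNetwork, VariableElimination, and CausalInference), the snippet below builds a small confounded model and contrasts an observational query with the corresponding interventional one:

from pgmpy.models import DiscreteBayesianNetwork  # called BayesianNetwork in pgmpy < 1.0
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import CausalInference, VariableElimination

# C is a common cause (confounder) of the treatment T and the outcome Y.
model = DiscreteBayesianNetwork([("C", "T"), ("C", "Y"), ("T", "Y")])
model.add_cpds(
    TabularCPD("C", 2, [[0.6], [0.4]]),
    TabularCPD("T", 2, [[0.7, 0.2], [0.3, 0.8]], evidence=["C"], evidence_card=[2]),
    TabularCPD("Y", 2, [[0.9, 0.5, 0.6, 0.1], [0.1, 0.5, 0.4, 0.9]],
               evidence=["C", "T"], evidence_card=[2, 2]),
)
model.check_model()

# Predictive query: P(Y | T=1), conditioning on T as observed evidence.
print(VariableElimination(model).query(["Y"], evidence={"T": 1}))
# Interventional query: P(Y | do(T=1)), forcing T via do-calculus.
print(CausalInference(model).query(["Y"], do={"T": 1}))

Because C confounds T and Y, the two answers differ: the causal query adjusts for C using the graph structure before computing the effect of setting T.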
- Example Notebooks: Examples
- Tutorial Notebooks: Tutorials
- Blog Posts: Medium
- Documentation: Website
- Bug Reports and Feature Requests: GitHub Issues
- Questions: Discord · Stack Overflow
pgmpy is available on both PyPI and conda-forge. To install from PyPI, use:
pip install pgmpy
To install from conda-forge, use:
conda install conda-forge::pgmpy
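Either way, a quick sanity check is to import the package and print the installed version (pgmpy exposes the usual __version__ attribute):

import pgmpy
print(pgmpy.__version__)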
from pgmpy.utils import get_example_model
# Load a Discrete Bayesian Network and simulate data.
discrete_bn = get_example_model("alarm")
alarm_df = discrete_bn.simulate(n_samples=100)
# Learn a network from simulated data.
from pgmpy.estimators import PC
dag = PC(data=alarm_df).estimate(ci_test="chi_square", return_type="dag")
# Learn the parameters from the data.
dag_fitted = dag.fit(alarm_df)
dag_fitted.get_cpds()
# Drop a column and predict using the learned model.
evidence_df = alarm_df.drop(columns=["FIO2"])
pred_FIO2 = dag_fitted.predict(evidence_df)
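# A hedged sketch of the "Simulations" feature from the table above (not part of the
# original example): draw samples from the alarm model under an intervention. The state
# label is read from the CPD instead of being hard-coded, since the exact state names
# come from the underlying BIF file.
fio2_states = discrete_bn.get_cpds("FIO2").state_names["FIO2"]
interv_df = discrete_bn.simulate(n_samples=100, do={"FIO2": fio2_states[0]})
interv_df["FIO2"].value_counts()  # every sample has FIO2 fixed to the chosen state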
# Load an example Gaussian Bayesian Network and simulate data
gaussian_bn = get_example_model("ecoli70")
ecoli_df = gaussian_bn.simulate(n_samples=100)
# Learn the network from simulated data.
from pgmpy.estimators import PC
dag = PC(data=ecoli_df).estimate(ci_test="pearsonr", return_type="dag")
# Learn the parameters from the data.
from pgmpy.models import LinearGaussianBayesianNetwork
gaussian_bn = LinearGaussianBayesianNetwork(dag.edges())
dag_fitted = gaussian_bn.fit(ecoli_df)
dag_fitted.get_cpds()
# Drop a column and predict using the learned model.
evidence_df = ecoli_df.drop(columns=["ftsJ"])
pred_ftsJ = dag_fitted.predict(evidence_df)
import pyro.distributions as dist
from pgmpy.models import FunctionalBayesianNetwork
from pgmpy.factors.hybrid import FunctionalCPD
# Create a Bayesian Network with mixture of discrete and continuous variables.
func_bn = FunctionalBayesianNetwork(
[
("x1", "w"),
("x2", "w"),
("x1", "y"),
("x2", "y"),
("w", "y"),
("y", "z"),
("w", "z"),
("y", "c"),
("w", "c"),
]
)
# Define the Functional CPDs for each node and add them to the model.
cpd_x1 = FunctionalCPD("x1", fn=lambda _: dist.Normal(0.0, 1.0))
cpd_x2 = FunctionalCPD("x2", fn=lambda _: dist.Normal(0.5, 1.2))
# Continuous mediator: w = 0.7*x1 - 0.3*x2 + ε
cpd_w = FunctionalCPD(
"w",
fn=lambda parents: dist.Normal(0.7 * parents["x1"] - 0.3 * parents["x2"], 0.5),
parents=["x1", "x2"],
)
# Bernoulli target with logistic link: y ~ Bernoulli(sigmoid(-0.7 + 1.5*x1 + 0.8*x2 + 1.2*w))
cpd_y = FunctionalCPD(
"y",
fn=lambda parents: dist.Bernoulli(
logits=(-0.7 + 1.5 * parents["x1"] + 0.8 * parents["x2"] + 1.2 * parents["w"])
),
parents=["x1", "x2", "w"],
)
# Downstream Bernoulli influenced by y and w
cpd_z = FunctionalCPD(
"z",
fn=lambda parents: dist.Bernoulli(
logits=(-1.2 + 0.8 * parents["y"] + 0.2 * parents["w"])
),
parents=["y", "w"],
)
# Continuous outcome depending on y and w: c = 0.2 + 0.5*y + 0.3*w + ε
cpd_c = FunctionalCPD(
"c",
fn=lambda parents: dist.Normal(0.2 + 0.5 * parents["y"] + 0.3 * parents["w"], 0.7),
parents=["y", "w"],
)
func_bn.add_cpds(cpd_x1, cpd_x2, cpd_w, cpd_y, cpd_z, cpd_c)
func_bn.check_model()
# Simulate data from the model
df_func = func_bn.simulate(n_samples=1000, seed=123)
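# A quick sanity check (a sketch added here, not part of the original example): w was
# defined above as 0.7*x1 - 0.3*x2 plus Normal(0, 0.5) noise, so the residual below
# should have mean near 0 and standard deviation near 0.5.
resid = df_func["w"] - (0.7 * df_func["x1"] - 0.3 * df_func["x2"])
print(resid.mean(), resid.std())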
# For learning and inference in Functional Bayesian Networks, please refer to the example notebook: https://github.com/pgmpy/pgmpy/blob/dev/examples/Functional_Bayesian_Network_Tutorial.ipynb
We welcome all contributions to pgmpy, not just code. Please refer to our contributing guide for more details. We also offer mentorship for new contributors and maintain a list of potential mentored projects. If you are interested in contributing to pgmpy, please join our Discord server and introduce yourself; we will be happy to help you get started.