
amparore/shap_bpt

ShapBPT: Image Feature Attributions Using Data-Aware Binary Partition Trees

A python library to compute Shapley image explanations with data-aware Binary Partition Trees


ShapBPT is a Python library for generating faithful, data-aware image explanations based on Owen values, a structured variant of Shapley values. Unlike classical SHAP partitioning, where features are grouped using axis-aligned, image-agnostic splits, ShapBPT constructs a Binary Partition Tree (BPT) that mirrors the morphology and structure of the image itself. This leads to explanations aligned with meaningful visual regions rather than arbitrary pixel blocks.

Why ShapBPT? Key Advantages

  • Model-agnostic: ShapBPT only requires a masking function, not access to model internals.
  • Data-aware explanation structure: Explanations follow meaningful image segments derived from the BPT, improving interpretability.
  • Computationally efficient Owen value approximation: Uses hierarchical Owen value recursion to reduce evaluation cost.
  • Drop-in replacement for SHAP Partition Explainer: By selecting method="AA" you can reproduce SHAP’s axis-aligned partitioning within the same interface.

What Is a Binary Partition Tree (BPT)?

A Binary Partition Tree is a hierarchical bottom-up segmentation of an image:

  • start from individual pixels,
  • iteratively merge adjacent regions that minimize a chosen distance metric,
  • continue until a single root region is formed.

This hierarchy provides ShapBPT with a data-aware coalition structure used to compute Shapley feature attributions. Where SHAP’s Partition Explainer uses Axis-Aligned splits, ShapBPT uses morphological clusters guided by the actual image content.

Because regions can be recursively split down to individual pixels, the method gracefully balances efficiency and fidelity.
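As a minimal sketch, the bottom-up merging procedure described above could look as follows. This is illustrative only, not the library's actual construction: it assumes a single-channel image and uses the difference of mean intensities as the distance metric, whereas real BPT implementations typically use richer region models.

```python
import numpy as np

def build_bpt(image):
    """Greedy bottom-up BPT sketch: repeatedly merge the pair of adjacent
    regions whose mean intensities are closest, until one root remains."""
    h, w = image.shape
    # each pixel starts as its own region: (member pixels, mean, children)
    regions = {i: ([i], float(image.flat[i]), None) for i in range(h * w)}
    # 4-connectivity adjacency between region ids
    adj = {i: set() for i in range(h * w)}
    for y in range(h):
        for x in range(w):
            i = y * w + x
            if x + 1 < w: adj[i].add(i + 1); adj[i + 1].add(i)
            if y + 1 < h: adj[i].add(i + w); adj[i + w].add(i)
    next_id = h * w
    while len(regions) > 1:
        # pick the adjacent pair with minimal mean-intensity distance
        a, b = min(((a, b) for a in regions for b in adj[a] if b in regions),
                   key=lambda p: abs(regions[p[0]][1] - regions[p[1]][1]))
        ra, rb = regions.pop(a), regions.pop(b)
        members = ra[0] + rb[0]
        mean = sum(image.flat[i] for i in members) / len(members)
        regions[next_id] = (members, mean, (a, b))
        # rewire adjacency from the merged regions to the new one
        adj[next_id] = (adj[a] | adj[b]) - {a, b}
        for n in adj[next_id]:
            adj[n] -= {a, b}
            adj[n].add(next_id)
        next_id += 1
    return regions  # the single remaining entry is the root
```

The `children` field of each internal node records the two merged sub-regions, which is exactly the structure traversed top-down when computing attributions.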

How it is built and used

A Binary Partition Tree is built bottom-up: starting from the individual pixels, adjacent regions that minimize a distance function are merged iteratively, until all regions are merged into a single cluster. The resulting merge hierarchy is the BPT.

In practice, the BPT hierarchy is used top-down, starting from the root cluster and splitting adaptively. ShapBPT uses this hierarchy to generate feature attributions in the form of Owen approximations of the Shapley coefficients.

The ShapBPT Python library follows the BPT hierarchy to compute Shapley values. The resulting explanation follows the morphological regions pre-identified by the BPT algorithm, and therefore works under the assumption that those regions correspond to relevant image features. If this assumption does not hold for a given region, that region can be split further, down to the individual pixels.

Using ShapBPT

Step 1. Define a masking function

ShapBPT requires a masking function

$$ \nu: \{0,1\}^{N \times H \times W} \rightarrow \mathbb{R}^{N \times M} $$

that takes as input a batch of $N$ binary masks of shape $H \times W$ and returns the model predictions for the $M$ explained classes.
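A minimal sketch of such a masking function is given below. The helper name `make_masking_function`, the `model_predict` callable, and the mean-of-image baseline are all illustrative assumptions, not part of the ShapBPT API; any rule for replacing masked-out pixels (blur, constant color, ...) could be used instead.

```python
import numpy as np

def make_masking_function(model_predict, image, background=None):
    """Build a masking function nu for an HWC image (illustrative sketch).
    `model_predict` maps a batch of HWC images to per-class scores;
    `background` is the replacement value for masked-out pixels
    (defaulting to the per-channel image mean, a common choice)."""
    if background is None:
        background = image.mean(axis=(0, 1), keepdims=True)

    def nu(masks):
        # masks: (N, H, W) binary array; 1 keeps a pixel, 0 replaces it
        masked = np.where(masks[..., None].astype(bool), image, background)
        return model_predict(masked)  # (N, num_classes) scores

    return nu
```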

Step 2. Run the explainer

import shap_bpt

explainer = shap_bpt.Explainer(
    nu,                      # masking function ν
    image_to_explain,        # input image (HWC)
    num_explained_classes=4, # number of output classes to attribute
    verbose=True
)

shap_values = explainer.explain_instance(
    max_evals=eval_budget,   # maximum number of nu evaluations to generate the explanation 
    method="BPT",            # choose "BPT" (data-aware) or "AA" (axis-aligned)
    batch_size=batch_size
)

Step 3. Visualize the attribution map

shap_bpt.plot_owen_values(explainer, shap_values, class_names)

With eval_budget=100 and method='BPT' we obtain the explanation:

With eval_budget=100 and method='AA' we obtain the explanation:

See the provided notebooks for examples of how to set up and run ShapBPT.


How It Works: Owen Value Approximation

ShapBPT evaluates the Owen value over a BPT coalition structure. The library uses the formula

$$ \widehat{\Omega}_i(Q, T) = \begin{cases} \frac{1}{2} \widehat{\Omega}_i(Q,T_1) + \frac{1}{2} \widehat{\Omega}_i(Q\cup T_2, T_1) & \text{if } T^{\downarrow}=(T_1,T_2)\\ \frac{1}{|T|}\big(\nu(Q\cup T)-\nu(Q)\big) & \text{if $T$ is indivisible} \end{cases} $$

assuming, without loss of generality, that $i \in T_{1}$.

This hierarchical structure is what makes ShapBPT efficient, especially compared to full Shapley enumeration.
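The recursion above can be sketched directly in Python. This is an illustrative sketch rather than the library's implementation: coalitions are encoded as frozensets, and the `split` callable (returning the children $(T_1, T_2)$ of a cluster, or `None` for indivisible clusters) is an assumed interface.

```python
def owen_value(i, Q, T, nu, split):
    """Recursive Owen approximation over a binary hierarchy (sketch).
    Q, T are frozensets of players; nu maps a coalition to a payoff;
    split(T) returns (T1, T2) or None if T is indivisible."""
    parts = split(T)
    if parts is None:
        # indivisible cluster: average marginal contribution of T on Q
        return (nu(Q | T) - nu(Q)) / len(T)
    T1, T2 = parts
    if i not in T1:          # ensure i is in T1 (without loss of generality)
        T1, T2 = T2, T1
    return (0.5 * owen_value(i, Q, T1, nu, split)
            + 0.5 * owen_value(i, Q | T2, T1, nu, split))
```

For an additive game the attribution of player $i$ reduces to its own weight, which gives a quick sanity check of the recursion.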


Installation

The easiest way to use ShapBPT is by installing it through PyPI:

pip install shap-bpt

You can also compile and install it from source. Because ShapBPT includes a Cython module, a compilation step is required.

Unix systems

python setup.py build_ext --inplace

Windows systems

On Windows, the package can be compiled using mingw32, with the command:

python setup.py build_ext --inplace --compiler=mingw32

Recommended: to set up a working mingw32 system, install MinGW through conda by running:

conda install numpy libpython m2w64-toolchain cython

Note: make sure the conda environment containing Cython is activated before running the line above.

Alternatively, follow the instructions on this page.

Package installation (all systems)

After compiling, the ShapBPT Python module can be installed using:

python setup.py install

To remove the intermediate build files, use:

python setup.py clean --all

Examples

📖 Citation

If you use this work in your research, please cite:

@inproceedings{rashid2026shapbpt,
  title={{ShapBPT: Image Feature Attributions Using Data-Aware Binary Partition Trees}},
  author={Rashid, Muhammad and Amparore, Elvio G and Ferrari, Enrico and Verda, Damiano},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={40},
  number={30},
  pages={25099--25107},
  year={2026},
  url={https://doi.org/10.1609/aaai.v40i30.39699}
}

Code of Conduct

This project follows a Code of Conduct.
By participating, you agree to uphold these standards.
