  1. API Reference — SHAP latest documentation

    This page contains the API reference for public objects and functions in SHAP. There are also example notebooks available that demonstrate how to use the API of each object/function.

  2. decision plot — SHAP latest documentation

    SHAP decision plots show how complex models arrive at their predictions (i.e., how models make decisions). This notebook illustrates decision plot features and use cases with …
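    A minimal sketch of the decision plot described above (assuming `shap` and `scikit-learn` are installed; the random-forest model and synthetic data here are purely illustrative):

    ```python
    import matplotlib
    matplotlib.use("Agg")  # headless backend; no display window needed

    import numpy as np
    import shap
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    # Illustrative model and data (not from the SHAP docs themselves)
    X, y = make_regression(n_samples=50, n_features=4, random_state=0)
    model = RandomForestRegressor(n_estimators=10, random_state=0).fit(X, y)

    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X)  # one row of contributions per sample

    # Each line in the decision plot starts at the base (expected) value and
    # accumulates feature contributions until it reaches that sample's prediction.
    shap.decision_plot(explainer.expected_value, shap_values, X, show=False)
    ```
    
    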

  3. shap.plots.force — SHAP latest documentation

    For SHAP values, it should be the value of explainer.expected_value. However, it is recommended to pass in a SHAP Explanation object instead (shap_values is not necessary in this case).
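    A short sketch of the recommendation in that snippet: pass a full `shap.Explanation` object to `shap.plots.force` instead of `explainer.expected_value` plus a raw SHAP-values array (assumes `shap` and `scikit-learn` are installed; the model and data are illustrative):

    ```python
    import matplotlib
    matplotlib.use("Agg")  # headless backend; no display window needed

    import numpy as np
    import shap
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    # Illustrative model and data
    X, y = make_regression(n_samples=100, n_features=4, random_state=0)
    model = RandomForestRegressor(n_estimators=10, random_state=0).fit(X, y)

    explainer = shap.TreeExplainer(model)
    explanation = explainer(X)  # Explanation: carries .values, .base_values, .data

    # Recommended style: the Explanation already carries the base value,
    # so a separate shap_values argument is not needed.
    shap.plots.force(explanation[0], matplotlib=True, show=False)
    ```
    
    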

  4. Image examples — SHAP latest documentation

    These examples explain machine learning models applied to image data. They are all generated from Jupyter notebooks available on GitHub. Image classification examples using …

  5. Explaining quantitative measures of fairness — SHAP latest …

    By using SHAP (a popular explainable AI tool) we can decompose measures of fairness and allocate responsibility for any observed disparity among each of the model’s input features.

  6. Be careful when interpreting predictive models in search of causal ...

    SHAP and other interpretability tools can be useful for causal inference, and SHAP is integrated into many causal inference packages, but those use cases are explicitly causal in nature.

  7. Release notes — SHAP latest documentation

    Nov 11, 2025 · This release incorporates many changes that were originally contributed by the SHAP community via @dsgibbons's Community Fork, which has now been merged into the main shap …

  8. scatter plot — SHAP latest documentation

    The y-axis is the SHAP value for that feature (stored in explanation.values), which represents how much knowing that feature’s value changes the output of the model for that sample’s prediction.
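    A minimal sketch of that scatter plot (assuming `shap` and `scikit-learn` are installed; the model and data are illustrative): selecting one feature column of an `Explanation` puts that feature's values on the x-axis and its SHAP values (`explanation.values`) on the y-axis.

    ```python
    import matplotlib
    matplotlib.use("Agg")  # headless backend; no display window needed

    import numpy as np
    import shap
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    # Illustrative model and data
    X, y = make_regression(n_samples=100, n_features=4, random_state=0)
    model = RandomForestRegressor(n_estimators=10, random_state=0).fit(X, y)

    explainer = shap.TreeExplainer(model)
    explanation = explainer(X)

    # Scatter for feature 0: x-axis is the feature's value, y-axis is its
    # SHAP value for each sample.
    shap.plots.scatter(explanation[:, 0], show=False)
    ```
    
    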

  9. shap.plots.waterfall — SHAP latest documentation

    The waterfall plot is designed to visually display how the SHAP values (evidence) of each feature move the model output from our prior expectation under the background data distribution, to the final model …
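    A minimal sketch of that waterfall plot for a single prediction (assuming `shap` and `scikit-learn` are installed; the model and data are illustrative): the bars start at `explanation.base_values` (the prior expectation under the background data) and accumulate to the final model output.

    ```python
    import matplotlib
    matplotlib.use("Agg")  # headless backend; no display window needed

    import numpy as np
    import shap
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    # Illustrative model and data
    X, y = make_regression(n_samples=100, n_features=4, random_state=0)
    model = RandomForestRegressor(n_estimators=10, random_state=0).fit(X, y)

    explainer = shap.TreeExplainer(model)
    explanation = explainer(X)

    # Waterfall plots visualize one sample at a time, hence explanation[0].
    shap.plots.waterfall(explanation[0], show=False)
    ```
    
    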

  10. shap.TreeExplainer — SHAP latest documentation

    Uses Tree SHAP algorithms to explain the output of ensemble tree models. Tree SHAP is a fast and exact method to estimate SHAP values for tree models and ensembles of trees, under several …
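    The exactness claimed in that snippet can be checked numerically (sketch assumes `shap` and `scikit-learn` are installed; the model and data are illustrative): for Tree SHAP, the base value plus a sample's feature contributions reconstructs that sample's prediction.

    ```python
    import numpy as np
    import shap
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    # Illustrative ensemble tree model and data
    X, y = make_regression(n_samples=80, n_features=5, random_state=0)
    model = RandomForestRegressor(n_estimators=20, random_state=0).fit(X, y)

    explainer = shap.TreeExplainer(model)
    sv = explainer.shap_values(X)  # exact per-feature contributions

    # Local accuracy: base value + contributions equals the model output
    # (up to floating-point error) for every sample.
    reconstructed = explainer.expected_value + sv.sum(axis=1)
    ```
    
    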