# SHAP

Learn about the SHAP explainability algorithm, which connects game theory with local explanations.

## SHapley Additive exPlanations

**SHapley Additive exPlanations (SHAP)** is a popular explainability algorithm that connects game theory with local explanations. SHAP aims to explain the prediction for any input (e.g., an image) as a sum of contributions from its feature values (e.g., image pixels).

SHAP assumes that the individual features (e.g., image pixels) in the input (e.g., an image) participate in a cooperative game whose payout is the model prediction. The algorithm uses game theory to distribute the payout among these features fairly. The share of the payout attributed to a feature is known as the Shapley value of that feature.
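The cooperative-game idea can be made concrete with a tiny example. The sketch below computes exact Shapley values for a hypothetical three-player game by enumerating every subset of players; the value function (a made-up payout with a cooperation bonus) is purely illustrative, not part of any real SHAP implementation.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, payout):
    """Exact Shapley values by enumerating all subsets of players.

    `payout` maps a set of players to the game's total payout
    (a hypothetical value function, for illustration only).
    """
    n = len(players)
    values = {}
    for i in players:
        others = [p for p in players if p != i]
        phi = 0.0
        for r in range(n):
            for subset in combinations(others, r):
                S = frozenset(subset)
                # Weight = probability that exactly S precedes player i
                # in a uniformly random ordering of all players.
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                # Marginal contribution of i when joining coalition S.
                phi += w * (payout(S | {i}) - payout(S))
        values[i] = phi
    return values

# Toy game: players "a", "b", "c" contribute 10, 20, 30 on their own,
# plus a bonus of 6 whenever "a" and "b" cooperate (made-up numbers).
base = {"a": 10, "b": 20, "c": 30}

def payout(S):
    bonus = 6 if {"a", "b"} <= S else 0
    return sum(base[p] for p in S) + bonus

phi = shapley_values(["a", "b", "c"], payout)
```

Note the fairness properties at work: player "c" never interacts with anyone, so its Shapley value is exactly its solo contribution of 30, while the cooperation bonus of 6 is split equally between "a" and "b". The values also satisfy *efficiency*: they sum to the full game's payout.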

### What are Shapley values?

Let’s assume that an image $x$ contains $n$ features (pixels), forming the full feature set $F = \{1, \dots, n\}$, and that $v(S)$ denotes the payout (the model prediction) obtained when only the features in a subset $S \subseteq F$ participate in the game.

Now, let $\phi_i$ denote the Shapley value of feature $i$. It is defined as the average marginal contribution of feature $i$ across all subsets $S \subseteq F \setminus \{i\}$ that exclude it:

$$\phi_i = \sum_{S \subseteq F \setminus \{i\}} \frac{|S|!\,(|F| - |S| - 1)!}{|F|!}\left[v(S \cup \{i\}) - v(S)\right]$$

Let’s consider one such subset $S$. The difference $v(S \cup \{i\}) - v(S)$ measures how much the prediction changes when feature $i$ joins the coalition $S$, and the factorial weight is the probability that $S$ is exactly the set of features preceding $i$ in a uniformly random ordering of all $|F|$ features.
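To connect the game back to a model, we can treat each feature as a player and define the value function as the model's prediction when absent features are replaced by a baseline. The sketch below applies the formula above to a hypothetical linear model with one interaction term, using a zero baseline for absent features (one common convention; real SHAP implementations approximate this exponential-cost computation rather than enumerating all subsets).

```python
from itertools import combinations
from math import factorial

# Hypothetical model, chosen only to illustrate the formula:
# two linear terms, one negative term, and an x0*x1 interaction.
def model(z):
    return 2.0 * z[0] + 1.0 * z[1] - 3.0 * z[2] + 0.5 * z[0] * z[1]

baseline = [0.0, 0.0, 0.0]   # reference input standing in for "absent" features
x = [1.0, 2.0, 3.0]          # the instance being explained

def v(S):
    """Value function: features in S take their values from x, the rest
    are filled in from the baseline, and the model is evaluated."""
    z = list(baseline)
    for i in S:
        z[i] = x[i]
    return model(z)

n = len(x)
phi = [0.0] * n
for i in range(n):
    others = [j for j in range(n) if j != i]
    for r in range(n):
        for S in combinations(others, r):
            w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
            phi[i] += w * (v(set(S) | {i}) - v(set(S)))
```

Because the model is linear apart from the interaction, each feature's Shapley value is its linear contribution plus half of the interaction term it shares, and the efficiency property holds: the values sum to `model(x) - model(baseline)`, i.e., the prediction is exactly decomposed into per-feature contributions.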
