Tag: model interpretation


Model Perspective
Oct 31, 2022 · Artificial Intelligence

Understanding SHAP: How Shapley Values Explain Black‑Box Models

This article explains the SHAP (SHapley Additive exPlanations) method: its theoretical foundations in game theory, the computation of Shapley values, algorithmic approximations such as TreeSHAP and DeepSHAP, practical code examples, and the strengths and limitations of using SHAP for model interpretability.
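As a taste of what the article covers, exact Shapley values for a tiny model can be computed directly from the game-theoretic definition. This is a minimal sketch: the toy "model" (an additive value function) and the feature names are invented for illustration.

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value_fn):
    """Exact Shapley values: each feature's weighted average marginal
    contribution over all coalitions of the remaining features."""
    n = len(features)
    phi = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = set(coalition)
                # Classic Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value_fn(s | {f}) - value_fn(s))
        phi[f] = total
    return phi

# Hypothetical additive toy model: prediction = sum of present feature effects.
effects = {"age": 2.0, "income": 5.0, "tenure": -1.0}
value = lambda coalition: sum(effects[f] for f in coalition)

phi = shapley_values(list(effects), value)
# For an additive game, each Shapley value equals that feature's own effect,
# and the values sum to the full model's output (the efficiency property).
```

The exponential cost of enumerating all coalitions is exactly why the article then turns to approximations such as TreeSHAP and DeepSHAP.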

SHAP · Shapley Values · explainable AI
11 min read
Model Perspective
Sep 3, 2022 · Fundamentals

How to Explain Your HiMCM Model Using Virtual Data and Insightful Charts

This article explains how to interpret a HiMCM 2020 Problem A model by creating diverse fictional personas, generating appropriate virtual data for them, analyzing the results, and using effective visualizations such as radar, bar, and table charts to communicate the model's insights clearly.
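The persona-plus-virtual-data workflow described above can be sketched with nothing but the standard library. The personas, attributes, and value ranges below are invented placeholders, not taken from the article:

```python
import random

# Hypothetical personas, each with invented attribute ranges (min, max).
PERSONAS = {
    "commuter": {"trips_per_week": (8, 12), "budget": (20, 40)},
    "student":  {"trips_per_week": (3, 7),  "budget": (5, 15)},
    "tourist":  {"trips_per_week": (1, 3),  "budget": (30, 80)},
}

def generate_virtual_data(n_per_persona, seed=0):
    """Draw uniform samples within each persona's attribute ranges,
    producing a flat list of labeled records ready for charting."""
    rng = random.Random(seed)  # fixed seed keeps the virtual data reproducible
    records = []
    for persona, ranges in PERSONAS.items():
        for _ in range(n_per_persona):
            row = {"persona": persona}
            for attr, (lo, hi) in ranges.items():
                row[attr] = rng.uniform(lo, hi)
            records.append(row)
    return records

data = generate_virtual_data(10)
```

Each record can then be fed to the model and plotted per persona, e.g. one radar-chart axis per attribute.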

HiMCM · chart visualization · model interpretation
6 min read
AntTech
May 22, 2018 · Artificial Intelligence

Unpack Local Model Interpretation for GBDT – Summary and Analysis

This article summarizes the Ant Financial paper presented at DASFAA 2018, which proposes a universal local-explanation method for Gradient Boosting Decision Tree (GBDT) models. It covers the problem definition, the PMML-based algorithm for attributing feature contributions, experimental validation on fraud-detection data, and the practical benefits for model transparency and improvement.
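The core idea of path-based local attribution for tree models can be sketched as follows: walk the decision path for one input and credit each change in node value to the feature split on at that node. This is a minimal illustration in the spirit of such methods, not the paper's exact algorithm; the hand-built tree and its values are invented.

```python
def tree_contributions(root, x):
    """Walk one input through a single tree; return (bias, prediction,
    per-feature contributions), where bias is the root node's value."""
    contrib = {}
    node = root
    while "feature" in node:  # descend until a leaf is reached
        f = node["feature"]
        child = node["left"] if x[f] <= node["threshold"] else node["right"]
        # Credit the change in expected value to the feature split on here.
        contrib[f] = contrib.get(f, 0.0) + child["value"] - node["value"]
        node = child
    return root["value"], node["value"], contrib

# Tiny hand-built tree; in practice values come from a trained GBDT
# (e.g. exported via PMML, as in the paper's setting).
tree = {
    "feature": "amount", "threshold": 100.0, "value": 0.3,
    "left":  {"value": 0.1},
    "right": {"feature": "hour", "threshold": 22, "value": 0.6,
              "left":  {"value": 0.5},
              "right": {"value": 0.9}},
}

bias, pred, contrib = tree_contributions(tree, {"amount": 150.0, "hour": 23})
# By construction: bias + sum(contributions) == prediction.
```

For a full GBDT, the per-feature contributions are summed across all trees in the ensemble, giving a local explanation of a single prediction.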

GBDT · PMML · feature importance
12 min read