Interpretable
Interpretable is an adjective describing something that can be interpreted or understood. In everyday use it often refers to explanations that are clear and comprehensible. In the field of data science and artificial intelligence, interpretability denotes the extent to which a human can understand the cause of a model’s predictions or decisions.
Two broad notions are common. Inherently interpretable models are designed so their behavior is understandable without additional explanation tools; examples include linear regression, decision trees, and rule lists. Post-hoc interpretability, by contrast, applies explanation methods to an already-trained model, often a complex one, to approximate or summarize its behavior.
Key concepts used to assess interpretability include transparency (visibility into the model's structure and parameters), decomposability (the ability to explain each part of the model, such as individual inputs, parameters, and computations), and simulatability (the ability of a human to mentally step through the model's computation in a reasonable amount of time).
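As a minimal sketch of decomposability, consider a linear model: each feature's contribution to a prediction is simply its weight times its value, and the contributions sum (with the intercept) to the prediction. The weights and input values below are hypothetical, chosen only for illustration:

```python
# Hypothetical coefficients of a fitted linear model.
weights = {"age": 0.5, "income": 1.2, "tenure": -0.3}
intercept = 2.0

# Hypothetical input to explain.
example = {"age": 30.0, "income": 4.0, "tenure": 10.0}

# Per-feature contributions: weight * value for each feature.
contributions = {name: weights[name] * value for name, value in example.items()}

# The prediction decomposes exactly into intercept + contributions.
prediction = intercept + sum(contributions.values())

for name, c in contributions.items():
    print(f"{name}: {c:+.2f}")
print(f"prediction: {prediction:.2f}")
```

Because every prediction decomposes this way, a human can attribute the outcome to individual features directly, which is what makes such models inherently interpretable rather than requiring post-hoc explanation.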
In practice, there is often a trade-off between interpretability and predictive performance. Simpler, interpretable models may underperform complex models such as deep neural networks or large ensembles on some tasks, which is one motivation for post-hoc explanation methods that attempt to make complex models understandable without giving up their accuracy.