dprime
dprime, often written d', is a sensitivity measure in signal detection theory that quantifies an observer's ability to distinguish signal from noise. In the standard model, the internal responses to noise and to signal plus noise are assumed to be normally distributed with equal variance. d' represents the separation of these distributions in units of the shared standard deviation and is equivalently computed as the difference between the z-scores of the hit rate and the false alarm rate: d' = z(H) − z(F).
Calculation typically uses data from a yes/no detection task. H is the proportion of signal trials correctly identified as signal (hits), and F is the proportion of noise trials incorrectly identified as signal (false alarms). Each proportion is converted to a z-score with the inverse of the standard normal cumulative distribution function, and d' is the difference z(H) − z(F).
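The calculation can be sketched in a few lines of Python; this is a minimal illustration using the standard library's NormalDist for the inverse normal CDF, with example rates chosen arbitrarily:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Compute d' = z(H) - z(F) under the equal-variance Gaussian model.

    Rates must lie strictly between 0 and 1, since z is undefined at 0 and 1.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Illustrative rates: H = 0.69, F = 0.31 give a d' just under 1.
print(d_prime(0.69, 0.31))
```

When H and F are equal, the observer is at chance and d' is 0; rates symmetric about 0.5, as above, yield d' = 2·z(H).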
Interpretation: d' measures discriminability independent of response bias. A larger d' indicates greater sensitivity in distinguishing signal from noise: d' = 0 corresponds to chance performance, and higher values reflect increasingly reliable discrimination. Response bias itself is captured by separate measures such as the criterion c.
Extensions and notes: If the equal-variance assumption is violated, alternative methods or nonparametric measures (such as A') can be used instead. Because z-scores are undefined for proportions of exactly 0 or 1, extreme hit or false alarm rates are typically adjusted before computing d', for example with the log-linear correction or by replacing 0 and 1 with 1/(2N) and 1 − 1/(2N), where N is the number of trials of that type.
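As a sketch of one such adjustment, the log-linear correction adds 0.5 to each count and 1 to each trial total before forming the rates; the counts below are invented for illustration:

```python
from statistics import NormalDist

def log_linear_rates(hits: int, n_signal: int, fas: int, n_noise: int):
    """Log-linear correction: add 0.5 to each count and 1 to each total,
    so the resulting rates never reach 0 or 1 and z-scores stay finite."""
    return (hits + 0.5) / (n_signal + 1), (fas + 0.5) / (n_noise + 1)

def d_prime(h: float, f: float) -> float:
    z = NormalDist().inv_cdf
    return z(h) - z(f)

# A perfect hit rate (50/50) would make z(H) infinite without correction.
h, f = log_linear_rates(50, 50, 5, 50)
print(d_prime(h, f))
```

The corrected rates (here about 0.99 and 0.11) give a finite, slightly conservative d' estimate.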