(symbol: d′) a measure of an individual’s ability to detect signals; more specifically, a measure of sensitivity or discriminability derived from signal detection theory that is unaffected by response biases. It is the difference (in standard deviation units) between the means of the noise and signal + noise distributions. The assumptions underlying the validity of d′ as a bias-free measure are that the probability distributions upon which decisions are based are Gaussian (normal) and have equal variances. If this is true, then d′ completely describes the receiver operating characteristic (ROC) curve. In practice, d′ has proved to be sufficiently bias-free to be the “best” measure of psychophysical performance. It is essentially a standardized score, computed as the difference between the (Gaussian) standard scores for the hit rate and the false-alarm rate, that is, d′ = z(hit rate) − z(false-alarm rate). A value of d′ = 3 is close to perfect performance; a value of d′ = 0 is chance (“guessing”) performance.