entropy
n.
1. a thermodynamic quantity providing a measure of the unavailability of the energy of a closed system to do work. 2. in statistics, an indication of the degree of disorder, disequilibrium, or change within a system. For example, an entropy measure may be used to indicate the quality of classification in a cluster analysis: clusters whose members are highly similar have low entropy, whereas more dispersed, mixed clusters have higher entropy. 3. in information theory, a measure of the average information content, or uncertainty, of a message source; it sets the limit on how efficiently the source's output can be encoded and transmitted.
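
Senses 2 and 3 both rest on Shannon's formula, H = -sum(p_i * log2(p_i)), where p_i is the proportion of items falling in category i. A minimal Python sketch (the helper name shannon_entropy is illustrative, not a standard library function) shows how a homogeneous cluster scores low and a dispersed one scores high:

    import math
    from collections import Counter

    def shannon_entropy(labels):
        # H = -sum(p * log2(p)) over the label proportions, in bits.
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Sense 2: a cluster dominated by one class has low entropy;
    # an evenly mixed (dispersed) cluster has high entropy.
    pure_cluster = ["a"] * 7 + ["b"]
    mixed_cluster = ["a", "b", "c", "d"] * 2
    print(shannon_entropy(pure_cluster))   # ~0.544 bits (high similarity, low entropy)
    print(shannon_entropy(mixed_cluster))  # 2.0 bits (dispersed, high entropy)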