Constraint (information theory)
Constraint in information theory is the degree of statistical dependence between or among variables; it is quantified by measures such as mutual information and total correlation.
Garner[1] provides a thorough discussion of various forms of constraint (internal constraint, external constraint, total constraint) with application to pattern recognition and psychology.
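As a concrete illustration, the constraint between two discrete variables can be estimated as their mutual information: it is zero when the variables are statistically independent and positive when knowing one variable reduces uncertainty about the other. The sketch below (a minimal estimator from paired samples, not drawn from Garner's text) makes this explicit:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(X;Y) in bits from a list of (x, y) samples.

    I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    """
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # marginal counts of X
    py = Counter(y for _, y in pairs)     # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * n * n / (count_x * count_y) == p(x,y) / (p(x) p(y))
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Independent uniform bits: every (x, y) combination equally likely,
# so there is no constraint between X and Y.
indep = [(x, y) for x in (0, 1) for y in (0, 1)]

# Perfectly dependent bits (y always equals x): one full bit of constraint.
dep = [(0, 0), (1, 1)]

print(mutual_information(indep))  # 0.0
print(mutual_information(dep))    # 1.0
```

The same quantity generalizes to more than two variables as the total correlation, which is one of Garner's measures of total constraint.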
See also
- Mutual information
- Total correlation
- Interaction information
References
- Garner, W. R. (1962). Uncertainty and Structure as Psychological Concepts. New York: John Wiley & Sons.