Jisho

情報量 【じょうほうりょう】
Noun
1. amount of information
2. information content (information theory); Shannon information; surprisal
Wikipedia definition
3. Entropy (information theory)
In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits. Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon.
Read “Entropy (information theory)” on English Wikipedia
Read “情報量” on Japanese Wikipedia
Read “Entropy (information theory)” on DBpedia
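
The definition above reduces to a one-line formula: for a discrete random variable X with outcome probabilities p(x), the Shannon entropy is H(X) = −Σ p(x) log₂ p(x) bits, and the surprisal of a single outcome (sense 2) is −log₂ p(x). As a minimal sketch in Python using only the standard library (the function name shannon_entropy is illustrative, not taken from this entry):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    probs: iterable of outcome probabilities summing to 1.
    Zero-probability outcomes contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits

# Surprisal of a single outcome: rarer outcomes are more surprising.
print(-math.log2(0.1))               # ~3.32 bits
```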
