What does information entropy mean?
information entropy
noun
(information theory) A measure of the uncertainty associated with a random variable; equivalently, the average information content one is missing when one does not know the value of the random variable (usually measured in units such as bits), or the amount of information contained, on average, per character in a stream of characters.
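The "bits per character" sense can be made concrete with a small sketch. The snippet below (an illustrative helper, not part of the definition) computes the Shannon entropy H = −Σ p(x) log₂ p(x) over the character frequencies of a string:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Average information content, in bits, per character of `text`."""
    counts = Counter(text)
    total = len(text)
    # Sum -p * log2(p) over the empirical probability of each character.
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Four equally likely symbols carry log2(4) = 2 bits per character.
print(shannon_entropy("abcd"))  # → 2.0
# A stream with only one symbol carries no information per character.
print(shannon_entropy("aaaa"))  # → 0.0
```

A uniform distribution over n symbols attains the maximum entropy log₂ n; any skew in the frequencies lowers the average bits per character.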