
iEntropy


4.6 (8016 ratings)
Education
Developer: David Arcos Gutierrez
Free

The entropy of an information source is a measure of the uncertainty, or surprise, of the information it transmits.

If our source sends a set of symbols (called the source alphabet), the greatest surprise occurs when all the symbols are independent (uncorrelated) and equally probable (equiprobable). In this case, the normalized entropy of the source equals 1, its maximum value, because the receiver cannot predict which symbol the source will send next.
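The idea above can be sketched with Shannon's formula, H = -Σ p·log₂(p). This is a minimal illustration, not the app's own code: a uniform distribution over N symbols reaches the maximum entropy of log₂(N) bits per symbol, while any skewed (more predictable) distribution falls below it.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equiprobable symbols: entropy is maximal, log2(4) = 2 bits/symbol.
print(shannon_entropy([0.25] * 4))          # 2.0

# A skewed source is more predictable, so its entropy is lower.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))
```

Dividing H by log₂(N) gives the normalized entropy, which is exactly 1 in the equiprobable case.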

Text sources do not have independent or equiprobable symbols. For this reason, we can predict some letters or words in an incomplete text. Moreover, such sources can be compressed using coding and data compression algorithms.
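This redundancy is easy to measure. The sketch below (an illustration under my own assumptions, not part of the app) estimates the per-symbol entropy of a sample English sentence from its observed letter frequencies and compares it with the maximum, log₂ of the alphabet size; the estimate always comes out lower, which is the headroom compression exploits.

```python
import math
from collections import Counter

def empirical_entropy(text):
    """Per-symbol entropy estimate from observed symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

text = "the quick brown fox jumps over the lazy dog and the cat"
h = empirical_entropy(text)
h_max = math.log2(len(set(text)))  # uniform (equiprobable) upper bound
print(f"H = {h:.3f} bits/symbol, maximum = {h_max:.3f} bits/symbol")
```

(This single-letter estimate still ignores correlations between letters; accounting for them would lower the entropy further.)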

Huffman coding (David A. Huffman, 1925-1999) is an entropy encoding algorithm used for lossless data compression. It assigns short binary codewords to high-probability symbols and longer codewords to low-probability ones. With this app you can use different text sources to generate Huffman codes and see the reduction in the size of the text source.
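As a rough sketch of what the app computes (this is my own minimal implementation, not the app's code), Huffman's algorithm repeatedly merges the two least frequent nodes of the probability tree, prefixing 0 and 1 to the codewords on each side:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code: frequent symbols get shorter binary codewords."""
    freq = Counter(text)
    if len(freq) == 1:  # edge case: a one-symbol alphabet still needs 1 bit
        return {next(iter(freq)): "0"}
    # Heap entries: (weight, tiebreaker, {symbol: partial codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)   # two least frequent subtrees
        w2, _, right = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in left.items()}
        merged.update({s: "1" + code for s, code in right.items()})
        tiebreak += 1
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
print(codes)
print(f"{len(encoded)} bits vs {8 * len('abracadabra')} bits as 8-bit ASCII")
```

For "abracadabra" the frequent letter "a" gets a 1-bit codeword, so the encoded text takes 23 bits instead of 88 as plain 8-bit ASCII.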

Enter your own text, press the “Text Source” button and generate your own Huffman probability tree to study this data compression algorithm.