Shannon Entropy Calculator

Created by Julia Żuławińska
Reviewed by Bogna Szyk and Jack Bowater
Last updated: Apr 23, 2024


Welcome to the Shannon entropy calculator! The entropy of an object or a system is a measure of the randomness within the system. In physics, it's determined by the energy unavailable to do work. In this article, we will explain the form of entropy used in statistics - information entropy. Keep reading to find out how to calculate entropy using the Shannon entropy formula.

Have any other statistical problems, or just interested in the topic? Check out our normal distribution calculator!

How is Shannon entropy used in information theory?

Shannon entropy, also known as information entropy or the Shannon entropy index, is a measure of the degree of randomness in a set of data.

It is used to calculate the uncertainty that comes with a certain character appearing next in a string of text. The more distinct characters there are, and the more evenly their frequencies of occurrence are distributed, the harder it is to predict which one will come next - resulting in higher entropy. When the outcome is certain, the entropy is zero.

Apart from information theory, Shannon entropy is used in many other fields - ecology, genetics, and computer science are just some of them.

How to calculate entropy? - entropy formula

Shannon's entropy formula is:

H(x) = -\sum_{i=1}^n [P(x_i) \cdot \log_b P(x_i)] = \sum_{i=1}^n [P(x_i) \cdot \log_b(\frac{1}{P(x_i)})]

where:

  • \sum_{i=1}^n is the summation operator over the probabilities, from i = 1 to n.
  • P(x_i) is the probability of a single event.

In information theory, entropy has several units. It depends on what the base of the logarithm - b - is. Usually, as we're dealing with computers, it's equal to 2 and the unit is known as a bit (also called a shannon). Our Shannon entropy calculator uses this base. When the base equals Euler's number, e, entropy is measured in nats. If it's 10, the unit is a dit, ban, or hartley.
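To see how the choice of base changes the unit, here is a minimal Python sketch (the function name shannon_entropy and the sample string are illustrative, not the calculator's actual implementation) that computes the entropy of a sequence in bits, nats, and hartleys:

```python
import math
from collections import Counter

def shannon_entropy(data, base=2):
    """Shannon entropy H = -sum(p_i * log_b(p_i)) of the symbols in `data`."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log(c / total, base) for c in counts.values())

text = "1035830701"                        # the sequence from the worked example below
print(shannon_entropy(text, base=2))       # bits (shannons) - the base this calculator uses
print(shannon_entropy(text, base=math.e))  # nats
print(shannon_entropy(text, base=10))      # hartleys (dits, bans)
```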

Let's use the Shannon entropy formula in an example:

  1. You have a sequence of numbers: 1 0 3 5 8 3 0 7 0 1.
  2. Each distinct character has a probability associated with its occurrence:
  • p(1) = 2/10.
  • p(0) = 3/10.
  • p(3) = 2/10.
  • p(5) = 1/10.
  • p(8) = 1/10.
  • p(7) = 1/10.
  3. Shannon entropy equals:
    H = p(1) \cdot \log_2(\frac{1}{p(1)}) + p(0) \cdot \log_2(\frac{1}{p(0)}) + p(3) \cdot \log_2(\frac{1}{p(3)}) + p(5) \cdot \log_2(\frac{1}{p(5)}) + p(8) \cdot \log_2(\frac{1}{p(8)}) + p(7) \cdot \log_2(\frac{1}{p(7)})
  • After inserting the values:
    H = 0.2 \cdot \log_2(\frac{1}{0.2}) + 0.3 \cdot \log_2(\frac{1}{0.3}) + 0.2 \cdot \log_2(\frac{1}{0.2}) + 0.1 \cdot \log_2(\frac{1}{0.1}) + 0.1 \cdot \log_2(\frac{1}{0.1}) + 0.1 \cdot \log_2(\frac{1}{0.1})
  • H ≈ 2.44644 bits.
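The same result drops out of a few lines of Python - a quick sketch (the variable names are illustrative) that plugs the probabilities above into the formula:

```python
import math

# Probabilities of each distinct character from the worked example
probs = {"0": 3/10, "1": 2/10, "3": 2/10, "5": 1/10, "7": 1/10, "8": 1/10}

# H = sum over all distinct characters of p * log2(1/p)
H = sum(p * math.log2(1 / p) for p in probs.values())
print(round(H, 5))  # 2.44644
```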

Now you know how to calculate Shannon entropy on your own! Keep reading to find out some facts about entropy!

Fun facts about entropy - entropy symbol, password entropy

  • The term "entropy" was first introduced by Rudolf Clausius in 1865. It comes from the Greek "en-" (inside) and "trope" (transformation). Before that, it was known as "equivalence-value". In physics and chemistry, the entropy symbol is a capital S, said to have been chosen by Clausius in honor of Sadi Carnot (the father of thermodynamics). In information theory, the entropy symbol is usually the capital Greek letter eta - H.

  • You may also come across the phrase 'password entropy'. It's a measurement of how random a password is. It takes into account the number of characters in your password and the pool of unique characters you can choose from (e.g., 26 lowercase characters, 36 alphanumeric characters). The higher the entropy of your password, the harder it is to crack - see the sketch after this list.

  • Ecologists use entropy as a diversity measure. From an ecological point of view, it is best if a habitat contains many different species rather than being dominated by a single one. Higher entropy means greater diversity.

  • Programmers deal with a particular interpretation of entropy called programming complexity: learn more at our cyclomatic complexity calculator.
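As a rough sketch of the password-entropy idea above, entropy is often estimated as length × log2(pool size), assuming every character is picked uniformly and independently from the pool. The helper below is only illustrative - real password strength also depends on predictability, dictionary words, and reuse:

```python
import math

def password_entropy(length, pool_size):
    """Estimated entropy in bits for a password of `length` characters,
    assuming each is drawn uniformly and independently from `pool_size` symbols."""
    return length * math.log2(pool_size)

print(password_entropy(8, 26))   # 8 lowercase letters            -> ~37.6 bits
print(password_entropy(12, 36))  # 12 alphanumeric characters     -> ~62.0 bits
```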

Enjoyed our Shannon entropy calculator? Check out the birthday paradox calculator next!
