How is data measured? │ The History of Mathematics with Luc de Brabandère
What is the minimum possible piece of information?
This was a question that Claude Shannon answered to define the bit, the tiny unit by which we measure data today.
Join Luc de Brabandère, in the latest episode of The History of Maths, to understand Shannon’s Law, one of the pillars of information theory.
Find out more:
Claude Shannon devised a way of measuring the smallest quantity of information and called it a bit. The rest is, well, history.
Subscribe now to our series ‘The History of Maths’ on the YouTube channel ‘What makes it tick?’
Thermodynamics, the science of heat, was developed decades after the first steam engine. So the theory came after the tool. And strangely enough, the same happened with information.
The first calculating machines were designed centuries ago. But one day, after World War II, somebody called Claude Shannon thought: ‘OK, I'd like to develop the science of information.’
Exactly like thermodynamics, but this time it's called information theory. When you develop thermodynamics, you need units to measure heat, like the calorie. That's the way Shannon started, you see: if I want to develop the science of information, information theory, I need a unit.
And he came up with an interesting question: ‘What is the minimum possible quantity of information?’ The answer is the outcome of a very simple experiment with only two possible outcomes.
Like the toss of a coin. You flip a coin, and I tell you the result: heads.
So I've provided you with some information, the smallest piece of information possible: the outcome of an experiment with only two possibilities. And that's how he connected 0 and 1.
Many people, like Boole, whom we saw in previous videos, defined the binary system as a world organised around 0s and 1s. But for Shannon, the bit (short for binary digit) was the minimum quantity of information possible. It's another way to define the bit.
Then he went one step further. If I take a die and roll it, and I see it's a 5, I give you the answer: it's a 5. That's more information than the heads of the coin. And that's how he connected information and probability: the higher the probability of an outcome, the smaller the information it carries.
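Shannon's standard measure of the information carried by a single outcome of probability p is I(p) = -log2(p) bits. A minimal Python sketch (the helper name information_bits is our own) comparing the coin flip with the die roll:

```python
import math

def information_bits(p: float) -> float:
    """Shannon self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

coin = information_bits(1 / 2)   # flipping heads: exactly 1 bit
die = information_bits(1 / 6)    # rolling a 5: about 2.585 bits

print(f"coin flip: {coin:.3f} bits")
print(f"die roll:  {die:.3f} bits")
```

The rarer outcome (the 5, with probability 1/6) carries more bits than the coin flip, matching the rule: the higher the probability, the smaller the information.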
And finally, he said: ‘Wow, I'm going to do the two experiments at once. I hold the coin and the die in my hand, I throw them both, and I get an outcome: heads and 5. In fact, the information is the same whether you make one throw with the two objects or two separate throws.’ And that's how he arrived at the formula you can see on the screen.
The only way to satisfy this formula is to introduce the idea of the logarithm. And that's how Shannon connected information, probability and logarithms.
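The on-screen formula is not reproduced in the transcript, but assuming it is Shannon's self-information I(p) = -log2(p), the logarithm is precisely what makes the combined throw work: the probability of ‘heads and 5’ is a product, and the log turns that product into a sum of information. A quick numerical check:

```python
import math

def information_bits(p: float) -> float:
    """Shannon self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

# One combined throw: heads AND a 5, probability (1/2) * (1/6) = 1/12.
combined = information_bits((1 / 2) * (1 / 6))

# Two separate throws, with their information added up.
separate = information_bits(1 / 2) + information_bits(1 / 6)

# The logarithm turns products of probabilities into sums of information.
print(math.isclose(combined, separate))
```

Only a logarithmic function has this property, which is why it is forced on Shannon once he requires that independent experiments add their information.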
Again, we are connecting concepts, and the further we go into these videos, the more you'll see that things are connected. Because in the end, mathematics is a huge set of connected concepts.
Join us next time to understand how an ant's perspective led to the discovery of fractal geometry.
Subscribe now to follow our series on ‘The History of Maths’ on the YouTube channel ‘What makes it tick?’