Can you measure information? │ The History of Mathematics with Luc de Brabandère
How do we measure information?
And what is the role of probability theory in sending it?
In the final episode of his series The History of Mathematics, Luc and Mérouane Debbah, Director of Huawei’s Mathematical and Algorithmic Sciences lab in Paris, explore the connection between mathematics and information.
Luc de Brabandère:
We have seen how machines are fed with data to become more and more useful to humans. Today I am once again with Mérouane, director of Huawei's Mathematical and Algorithmic Sciences Lab in Paris, and I'd like to understand a bit better this connection between mathematics and information. So, Mérouane, is this effort to measure information a recent development?
Mérouane Debbah:
No, the importance of giving a value to information became quite clear when we started building the first telecommunication networks. More than 100 years ago, in order to send a message, one would go to what was called 'the post and telegraph office' and present one's text to an operator. The operator would convert that text into an electrical current whose pattern was related to the text, and on the other side another operator would convert that electrical current pattern back into the original text.
Quite rapidly, some smart people understood how to exploit the system: they set up booths at the entrance of the telegraph office and asked people arriving with an English text to translate it into another text containing far fewer letters. Typically, if you had a text in English you would translate it into French, and with fewer letters it would cost you less to transmit the same meaning to the receiver on the other side.
The different network operators rapidly understood the objective: finding a way to measure the value of information, and finding a universal basis in which a text could be represented with the minimum number of symbols. Mathematicians worked quite hard on this topic, and it was a mathematician from Bell Labs who formalized the problem into a notion called 'information theory.'
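Though not discussed in the interview, a Huffman-style code is one concrete way to see what a "minimum representation" means: letters that occur often get short bit patterns, rare letters get long ones, so the same message costs fewer bits to send. The Python below is purely an illustrative sketch; the example message is arbitrary and nothing here comes from the episode.

```python
# Sketch of a frequency-based (Huffman) code: frequent symbols get shorter bit strings.
import heapq
from collections import Counter

def huffman_code(text: str) -> dict:
    """Build a prefix-free code from the letter frequencies of `text`."""
    freq = Counter(text)
    # Each heap entry: (frequency, tie-breaker, {symbol: partial code word}).
    heap = [(n, i, {ch: ""}) for i, (ch, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        return {ch: "0" for ch in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        # Prepend one bit to every code word in each of the two merged subtrees.
        merged = {ch: "0" + c for ch, c in c1.items()}
        merged.update({ch: "1" + c for ch, c in c2.items()})
        heapq.heappush(heap, (n1 + n2, tie, merged))
        tie += 1
    return heap[0][2]

message = "the quick brown fox jumps over the lazy dog"
code = huffman_code(message)
encoded_bits = sum(len(code[ch]) for ch in message)
print(f"fixed 8-bit encoding:  {8 * len(message)} bits")
print(f"frequency-based code:  {encoded_bits} bits")
```

The same meaning is transmitted in far fewer bits, which is exactly the saving the telegraph customers were chasing when they translated their messages into shorter texts.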
It turns out that the value of information is highly related to probability theory. Just to give you an example: suppose an event will happen almost surely; then there is no reason to transmit it to your receiver, because he already knows the result. You transmit information only when the receiver has no clue about what you are sending. So there is a strong link between probability theory and information theory, and it was Shannon, in 1948, who made that connection.
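A quick numerical sketch of that idea, in Python purely for illustration (the probabilities below are made-up examples, and the logarithm is taken in base 2 so the answer is in bits): the entropy of a distribution is near zero when one outcome is almost certain and largest when every outcome is equally likely.

```python
# Shannon entropy: H = -sum(p * log2(p)) over outcomes with non-zero probability.
from math import log2

def entropy(probabilities):
    """Average information content, in bits, of one draw from the distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.999, 0.001]))  # ~0.011 bits: the receiver already knows the outcome
print(entropy([0.5, 0.5]))      # 1.0 bit: a maximally uncertain coin flip
print(entropy([0.25] * 4))      # 2.0 bits: four equally likely outcomes
```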
Today, the measure of information through the notion of entropy is omnipresent across the ICT industry. It goes from images to videos, from propagation in networks to cellular networks, and indeed to the whole IT industry that we are building today.
Watch the full series on the YouTube channel 'What makes it tick?'