Claude Shannon, as we explained earlier, is the founding father of information theory. He developed a method to measure the information content of any language: in essence, he showed people samples of ordinary text in that language, cut off at a random point, and asked them to guess which letter came next. From the hit rates, and after rigorous mathematical analysis, Shannon determined that ordinary English text carries about 1.0 to 1.2 bits of information per letter.
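To see why that figure is so small, it helps to compare it with the entropy of English letters taken in isolation. The sketch below uses approximate textbook letter frequencies (illustrative values, not Shannon's experimental data): even with no context at all, English letters carry only about 4.2 bits each, versus the 4.7 bits needed to pick one of 26 letters uniformly at random; the redundancy of spelling and grammar then drives the figure down toward 1 bit.

```python
import math

# Approximate relative frequencies (%) of English letters; illustrative
# textbook values, not Shannon's experimental data.
freq = {
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
    's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
    'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
    'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.15, 'x': 0.15,
    'q': 0.1, 'z': 0.07,
}

# Normalize to probabilities
total = sum(freq.values())
probs = [v / total for v in freq.values()]

# Shannon entropy H = -sum(p * log2(p)), in bits per letter
entropy = -sum(p * math.log2(p) for p in probs)
print(f"{entropy:.2f} bits per letter")  # about 4.2, vs log2(26) ~ 4.70
```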
Tweets with meaning in English
Based on this data from Shannon, Randall Munroe did some calculations in his book What If? about how many meaningful tweets could be written on Twitter:
If a text contains n bits of information, that means, in a sense, that there are 2^n different messages it can express. Getting to a final figure requires some mathematical juggling (taking into account, among other things, the length of the message and something called the "unicity distance"), but the bottom line is that there are on the order of 2^(140×1.1), i.e., about 2 × 10^46, meaningful tweets in English.
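The arithmetic behind that estimate can be checked directly; the 1.1 bits-per-letter figure is the assumption carried over from Shannon's experiments:

```python
BITS_PER_LETTER = 1.1   # Shannon's estimate for ordinary English text
TWEET_LENGTH = 140      # characters in a classic tweet

total_bits = TWEET_LENGTH * BITS_PER_LETTER    # 154 bits of information
distinct_tweets = 2 ** total_bits              # 2^154 possible messages

print(f"{distinct_tweets:.1e} meaningful tweets")  # roughly 2e+46
```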
Now imagine that the world’s population had to read this massive list of 140-character tweets aloud. That would take about 10^47 seconds. The number of tweets is so staggeringly large that it hardly matters whether one person reads them or a billion people do: they could not make any significant progress down the list in the entire lifespan of life on Earth.
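The 10^47-second figure follows from a simple back-of-the-envelope multiplication. In the sketch below, the reading pace of a few seconds per tweet is an assumption chosen to illustrate the scale, not a figure from the book:

```python
tweets = 2e46            # the 2 x 10^46 estimate above
seconds_per_tweet = 5    # assumed reading pace (hypothetical)

total_seconds = tweets * seconds_per_tweet
print(f"{total_seconds:.0e} seconds")  # 1e+47

# For scale: the universe is about 13.8 billion years old,
# i.e. roughly 4.4e17 seconds.
age_of_universe = 4.4e17
print(f"{total_seconds / age_of_universe:.0e} times the age of the universe")
```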
Naturally, we are talking about unique tweets here, not counting repeats.