The information entropy H, in bits, of an L-character word (string), where each character position can take any of N symbols, is:

    H = L log2(N)

Using the N = 26 symbols of the English alphabet, what is the information entropy H, in bits, of:

L = 1 letter words:
L = 2 letter words:
L = 3 letter words:
    etc.
L = 10 letter words:

Doubling the length L of the word has what effect on the entropy H?
Doubling the number of symbols N has what effect on the entropy H?

Solve for N: rearranging the formula gives N = 2^(H/L).

How many symbols (i.e., N) would be needed to produce a 5-letter word with an information entropy H of 23.5?
How many symbols (i.e., N) would be needed to produce a 5-letter word with an information entropy H of 30? Of 50?
How many symbols (i.e., N) would be needed to produce a 10-letter word with an information entropy H of 47?
How many symbols (i.e., N) would be needed to produce a 10-letter word with an information entropy H of 30? Of 50? Of 100?
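For checking answers, here is a minimal Python sketch covering all three parts of the exercise: it evaluates H = L log2(N) for the N = 26 cases above, illustrates the two doubling effects, and inverts the formula as N = 2^(H/L) for the solve-for-N questions. The function names entropy_bits and symbols_needed are our own labels, not part of the worksheet.

```python
import math

def entropy_bits(L: int, N: int) -> float:
    """Information entropy H = L * log2(N), in bits."""
    return L * math.log2(N)

def symbols_needed(L: int, H: float) -> float:
    """Invert H = L * log2(N) to get N = 2**(H / L)."""
    return 2 ** (H / L)

# Entropy of L-letter words over the N = 26 English letters.
for L in range(1, 11):
    print(f"L = {L:2d}: H = {entropy_bits(L, 26):.2f} bits")

# Doubling L doubles H; doubling N adds L bits, since
# log2(2N) = log2(N) + 1 and that extra bit appears once per position.
print(entropy_bits(10, 26), 2 * entropy_bits(5, 26))  # equal values
print(entropy_bits(5, 52) - entropy_bits(5, 26))      # 5.0 bits

# Solve for N given word length L and a target entropy H.
cases = [(5, 23.5), (5, 30), (5, 50), (10, 47), (10, 30), (10, 50), (10, 100)]
for L, H in cases:
    print(f"L = {L:2d}, H = {H:5.1f}: N = {symbols_needed(L, H):.1f}")
```

As a sanity check, the last loop should show, for example, that N = 2^(23.5/5) ≈ 26 (roughly the English alphabet) and N = 2^(30/5) = 64, while the 10-letter cases give N = 2^(47/10) ≈ 26, N = 8, N = 32, and N = 1024 respectively.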