Information and the logarithm

Let's write I(s, n) for the information of a string of length n made up of an alphabet of size s. Since we want the function I to be additive, we need

  I(s, n) = k_s × n,

where k_s = I(s, 1). Our task is to find the constants k_s, so that we can write down I(s, n) explicitly for all n.

Let's fix some other positive whole number s_1 and declare that I(s_1, 1) = k_1 for some constant k_1. This just means that we have fixed the value of the information in a single symbol from an alphabet of size s_1. From Hartley's second rule we know that the function I must satisfy: if

  s_1^(n_1) = s^n,    (1)

for some numbers n_1 and n, then

  I(s_1, n_1) = k_1 × n_1 = k_s × n = I(s, n).    (2)

Therefore,

  k_s = k_1 n_1 / n.    (3)

Now from (1) we see that n_1 = log_{s_1}(s^n). Substituting this into (3) gives

  k_s = k_1 log_{s_1}(s^n) / n.

Therefore

  I(s, n) = k_s × n = k_1 log_{s_1}(s^n).

The constant k_1 was just some value we fixed above. We can actually make it disappear from the formula by absorbing it into the base of the logarithm. Suppose we want that base to be b. Let's choose k_1 = log_b(s_1). Then, by the rule for changing the base of logarithms, we get

  I(s, n) = log_b(s_1) × log_{s_1}(s^n) = log_b(s_1) × [log_b(s^n) / log_b(s_1)] = log_b(s^n),

as required.
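The derivation can be checked numerically. The short Python sketch below (the function name `information` is our own choice, not from the article) computes I(s, n) = log_b(s^n) = n log_b(s) and verifies both the additivity property and the change-of-base step from the derivation:

```python
import math

def information(s: int, n: int, base: float = 2.0) -> float:
    """Information in a string of length n over an alphabet of size s.

    Units are set by the logarithm base (bits when base = 2).
    Uses n * log_base(s), which equals log_base(s**n).
    """
    return n * math.log(s, base)

# Additivity: a string of length 5 + 3 carries the sum of the parts' information.
assert math.isclose(information(26, 5) + information(26, 3), information(26, 8))

# Change-of-base step: fixing k_1 = log_b(s_1) and using
# I(s, n) = k_1 * log_{s_1}(s^n) gives the same value as log_b(s^n).
s1, s, n, b = 10, 26, 4, 2.0
k1 = math.log(s1, b)
assert math.isclose(k1 * math.log(s**n, s1), math.log(s**n, b))
```

For example, `information(2, 10)` gives 10.0: a string of ten binary symbols carries ten bits.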