Information Measure

1. Information Content

If the probability of transmitting the jth message is $P_j$, the information content delivered by a digital source when that message is transmitted is

$$I_j = \log_2\left(\frac{1}{P_j}\right) = -\log_2 P_j \quad \text{bits}$$

From this definition we have the following results:
1) Messages that are less likely to occur provide more information.
2) The information content depends only on the likelihood of sending the message; it does not depend on any possible interpretation of the content, such as whether or not it makes sense.

2. Units of Information

1) Bits, nats, and hartleys:

$$I_{j,\text{bits}} = -\log_2 P_j \ \text{bits}, \qquad I_{j,\text{nats}} = -\ln P_j \ \text{nats}, \qquad I_{j,\text{hartleys}} = -\log_{10} P_j \ \text{hartleys}$$

2) Relationship between the bit, the nat, and the hartley:

$$I_{j,\text{bits}} = \frac{I_{j,\text{nats}}}{\ln 2} = \frac{I_{j,\text{hartleys}}}{\log_{10} 2}$$

3. Entropy of a Digital Source

1) Definition: If there are m possible different messages in a digital source (m is finite), and $P_j$ is the probability of sending the jth message, the entropy (also called the average information) is

$$H = \sum_{j=1}^{m} P_j I_j = \sum_{j=1}^{m} P_j \log_2\left(\frac{1}{P_j}\right) \quad \text{bits}$$

4. Example 1-1: In a string of 12 symbols, where each symbol consists of one of four levels, there are $4^{12}$ different combinations (words). Since each level is equally likely, all of the different words are equally likely. Thus

$$P_j = \left(\frac{1}{4}\right)^{12}, \qquad I_j = \log_2\left(\frac{1}{(1/4)^{12}}\right) = 12\log_2 4 = 24 \ \text{bits}$$

5. Source Rate

The source rate is defined as

$$R = \frac{H}{T} \quad \text{bits/s}$$

where T is the time required to send a message.

Example 1-2: A telephone touch-tone keypad has the digits 0 to 9, each sent with a given probability, plus the "*" and "#" keys, each sent with a given probability. If the keys are pressed at a rate of 2 keys/s, compute the data rate for this source.

Solution: First evaluate the entropy of the source,

$$H = \sum_{j=1}^{m} P_j \log_2\left(\frac{1}{P_j}\right) \quad \text{bits}$$
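A minimal Python sketch of how this computation proceeds, assuming illustrative key probabilities (0.099 for each digit and 0.005 for each of "*" and "#", chosen so they sum to 1; the actual values are not given above), with function names of my own choosing:

```python
import math

def entropy_bits(probs):
    """Entropy H = sum_j P_j * log2(1/P_j), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs)

# Assumed, illustrative probabilities (the actual values are not given):
# ten digits at 0.099 each, "*" and "#" at 0.005 each, summing to 1.
key_probs = [0.099] * 10 + [0.005] * 2
assert math.isclose(sum(key_probs), 1.0)

H = entropy_bits(key_probs)   # entropy per keypress, in bits
T = 1.0 / 2.0                 # 2 keys/s, so each message takes 0.5 s
R = H / T                     # source rate R = H/T, in bits/s

print(f"H = {H:.2f} bits per key, R = {R:.2f} bits/s")
```

With these assumed probabilities the sketch prints H ≈ 3.38 bits per key and R ≈ 6.76 bits/s; a different probability assignment changes the numbers but not the procedure.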
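For the earlier formulas, a similar sketch (again with hypothetical function names) verifies the 24-bit result of Example 1-1 and the stated relationship between bits, nats, and hartleys:

```python
import math

def info_content(p, base=2):
    """Information content I = log_base(1/p) of a message with probability p."""
    return math.log(1.0 / p, base)

# Example 1-1: each of the 4**12 equally likely words has probability (1/4)**12.
p_word = (1.0 / 4.0) ** 12
I_bits = info_content(p_word, base=2)        # 24 bits
I_nats = info_content(p_word, base=math.e)   # the same quantity in nats
I_hartleys = info_content(p_word, base=10)   # the same quantity in hartleys

# Unit conversion: I_bits = I_nats / ln(2) = I_hartleys / log10(2).
assert math.isclose(I_bits, 24.0)
assert math.isclose(I_bits, I_nats / math.log(2))
assert math.isclose(I_bits, I_hartleys / math.log10(2))
```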