The properties of the book and methods of learning it
… in
b. grasping the important concepts
c. understanding the basic communication system by deriving some key formulas
d. simulation using MATLAB

Information Measure

1. Information Content
If the probability of transmitting the jth message is $P_j$, the information content sent from a digital source when that message is transmitted is given by

$$I_j = \log_2\!\left(\frac{1}{P_j}\right) = -\log_2 P_j \quad \text{bits}$$

From the definition, we have the following results:
1) Messages that are less likely to occur provide more information.
2) The information content depends only on the likelihood of sending the message; it does not depend on any possible interpretation of the content, such as whether or not it makes sense.

2. Units of Information
1) Bits, nats, and hartleys:

$$I_{j,\text{bits}} = -\log_2 P_j \ \text{bits}, \qquad I_{j,\text{nats}} = -\ln P_j \ \text{nats}, \qquad I_{j,\text{hartleys}} = -\log_{10} P_j \ \text{hartleys}$$

2) Relationship between the bit, the nat, and the hartley:

$$I_{j,\text{bits}} = \frac{I_{j,\text{nats}}}{\ln 2} = \frac{I_{j,\text{hartleys}}}{\log_{10} 2}$$

3. Entropy of a Digital Source
1) Definition: if there are m possible different source messages in a digital source (m finite), and $P_j$ is the probability of sending the jth message, the entropy (also called the average information) is

$$H = \sum_{j=1}^{m} P_j I_j = \sum_{j=1}^{m} P_j \log_2\!\left(\frac{1}{P_j}\right) \quad \text{bits}$$

4. Example 11: In a string of 12 symbols, where each symbol consists of one of four levels, there are $4^{12}$ different combinations (words). Since each level is equally likely, all of the different words are equally likely. Thus $P_j = (1/4)^{12}$, and

$$I_j = \log_2\!\left(\frac{1}{(1/4)^{12}}\right) = 12\log_2 4 = 24 \ \text{bits}$$

5. Source Rate
The source rate is defined as

$$R = \frac{H}{T} \quad \text{bits/s}$$

where H is the entropy and T is the time required to send a message.

Example A12: A telephone touch-tone keypad has the digits 0 to 9, each sent with probability …, plus the "*" and "#" keys, each sent with probability …. If the keys are pressed at a rate …
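Since the book recommends simulation as a way of learning, the formulas above can be checked numerically. The following is a minimal sketch in Python (standing in for the MATLAB the book uses); the function names `information_bits`, `to_nats`, `to_hartleys`, and `entropy_bits` are illustrative choices, not from the text. It reproduces the 24-bit result of Example 11 and the unit conversions of Section 2.

```python
import math

def information_bits(p):
    """Information content I_j = -log2(P_j) of a message with probability p, in bits."""
    return -math.log2(p)

def to_nats(i_bits):
    """Convert from bits to nats using I_bits = I_nats / ln(2)."""
    return i_bits * math.log(2)

def to_hartleys(i_bits):
    """Convert from bits to hartleys using I_bits = I_hartleys / log10(2)."""
    return i_bits * math.log10(2)

def entropy_bits(probs):
    """Entropy H = sum_j P_j * log2(1/P_j) of a finite source, in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Example 11: 12 symbols, each one of 4 equally likely levels.
p_word = (1.0 / 4) ** 12          # probability of any particular 12-symbol word
print(information_bits(p_word))   # 24.0 bits, matching the hand calculation

# The same quantity in the other two units.
print(to_nats(24.0))              # about 16.64 nats
print(to_hartleys(24.0))          # about 7.22 hartleys

# Entropy of the 4-level symbol alphabet itself.
print(entropy_bits([0.25] * 4))   # 2.0 bits per symbol
```

Note that the per-symbol entropy (2 bits) times the 12 symbols in a word gives the same 24 bits as computing the information content of a whole word directly, which is exactly what equal likelihood guarantees.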
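The source-rate definition $R = H/T$ can be sketched the same way. Because the keypad probabilities and the key-press rate of Example A12 are missing from this excerpt, the numbers below are purely hypothetical assumptions (12 equally likely keys, pressed at 2 keys per second), not values from the text.

```python
import math

def entropy_bits(probs):
    """Entropy H = sum_j P_j * log2(1/P_j), in bits per message."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def source_rate(probs, seconds_per_message):
    """Source rate R = H / T in bits/s, where T is the time to send one message."""
    return entropy_bits(probs) / seconds_per_message

# Hypothetical keypad model: 12 keys (digits 0-9 plus '*' and '#'),
# all equally likely, pressed at 2 keys per second (T = 0.5 s).
probs = [1.0 / 12] * 12
print(entropy_bits(probs))        # log2(12) = about 3.585 bits per keypress
print(source_rate(probs, 0.5))    # about 7.17 bits/s
```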