- Information source model
- Measure of the information: entropy of a discrete random variable
- Entropy of an information source
- Conditional Entropy
- Source Coding Theorem (Shannon's First Theorem)
- Huffman Coding
- Lempel-Ziv Coding

**Reference: Proakis, Salehi (2nd ed.), Chapter 4**

**Examples of information sources:**

- Audio broadcasting system: speech signal
- Video broadcasting system: video signal
- Fax transmission system: monochromatic image
- Communication system between computers: ASCII symbols

In each case, the source output can be represented as a sequence of binary symbols.

How much memory can we save, at most?

**Shannon's First Theorem gives us the answer**

The minimum average number of binary symbols needed to represent each source symbol "**without distortion**" (that is, losslessly, or noiselessly) is given by the average amount of information carried by a source symbol, i.e., the entropy of the source.
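As a small numerical sketch of this bound (assuming a hypothetical memoryless source with the probabilities chosen below; the symbol names and values are for illustration only), we can compute the entropy H(X) and compare it with the average codeword length of a Huffman code built for the same source:

```python
import heapq
from math import log2

# Hypothetical memoryless source: four symbols with dyadic probabilities
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Entropy H(X) = -sum p*log2(p): the minimum average number of bits/symbol
entropy = -sum(p * log2(p) for p in probs.values())

# Build a Huffman code with the standard greedy merge of the two
# least-probable nodes; the counter breaks ties in the heap.
heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
codes = {s: "" for s in probs}
counter = len(heap)
while len(heap) > 1:
    p1, _, syms1 = heapq.heappop(heap)
    p2, _, syms2 = heapq.heappop(heap)
    for s in syms1:           # prepend a bit to every symbol in each subtree
        codes[s] = "0" + codes[s]
    for s in syms2:
        codes[s] = "1" + codes[s]
    heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
    counter += 1

avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(f"H(X) = {entropy} bits/symbol")          # 1.75
print(f"Huffman average length = {avg_len}")    # 1.75
```

Because the probabilities here are powers of 1/2 (a dyadic source), the Huffman code meets the entropy bound exactly; for general probabilities the average length satisfies H(X) <= L < H(X) + 1.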

*5*. Digital Transmission over AWGN channel

*6*. Evaluation of P(e) for optimum RX in AWGN

*7*. Error probability for M-PSK

*8*. Noncoherent ML Demodulation of FSK signals in AWGN

*9*. Transmission through bandlimited AWGN channels

*13*. Cyclic Codes


"Campus Virtuale" project of the Università degli Studi di Napoli Federico II, realized with the co-financing of the European Union. Axis V - Information Society - Operational Objective 5.1 e-Government and e-Inclusion