Information measures and their properties: entropy, relative entropy and mutual information. Information source models. Lossless data compression: the Kraft inequality, Shannon-Fano and Huffman codes. Typical sequences, asymptotic equipartition property, lossy source coding. Discrete memoryless channels: capacity, channel coding theorem. The additive Gaussian channel. Source coding under a fidelity constraint: rate distortion function and rate distortion theorem.
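For reference, a brief sketch of the central quantities named above, written in standard notation (the symbols p, q, X, Y, the codeword lengths ℓ_i and the distortion measure d are illustrative assumptions, not fixed by the syllabus; the Kraft inequality is stated for binary codes):

\begin{align*}
  H(X)      &= -\sum_{x} p(x)\log p(x)                         && \text{entropy}\\
  D(p\|q)   &= \sum_{x} p(x)\log\frac{p(x)}{q(x)}              && \text{relative entropy}\\
  I(X;Y)    &= H(X) - H(X\mid Y)                               && \text{mutual information}\\
  \sum_{i} 2^{-\ell_i} &\le 1                                  && \text{Kraft inequality for codeword lengths } \ell_i\\
  C         &= \max_{p(x)} I(X;Y)                              && \text{capacity of a discrete memoryless channel}\\
  C_{\text{Gauss}} &= \tfrac{1}{2}\log\!\Bigl(1+\tfrac{P}{N}\Bigr) && \text{additive Gaussian channel, signal power } P,\ \text{noise variance } N\\
  R(D)      &= \min_{p(\hat x\mid x)\,:\,\mathbb{E}\, d(X,\hat X)\le D} I(X;\hat X) && \text{rate distortion function}
\end{align*}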