Information Theory and Its Applications in the Theory of Computation

This note covers the following topics: entropy,
Kraft's inequality, the source coding theorem, conditional entropy, mutual
information, KL divergence and its connections, KL divergence and Chernoff bounds,
the data processing and Fano inequalities, the asymptotic equipartition property (AEP),
universal source coding: the Lempel-Ziv algorithm and a proof of its optimality,
source coding via typical sets and universality, joint typicality and the joint AEP,
discrete channels and channel capacity, a proof of the noisy channel coding theorem,
constructing capacity-achieving codes via concatenation, polarization, Arikan's
recursive construction of a polarizing invertible transformation, polar code
construction, Bregman's theorem, Shearer's lemma and applications, source coding
and graph entropy, monotone formula lower bounds via graph entropy, an optimal set
disjointness lower bound and its applications, compression of arbitrary
communication protocols, and parallel repetition of two-prover one-round games.
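To give a flavor of the first two topics, here is a minimal sketch (not from the notes themselves) computing the Shannon entropy of a toy source and checking Kraft's inequality for a matching prefix-free code; the distribution and code lengths are illustrative choices.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A toy four-symbol source with dyadic probabilities (an assumption for illustration).
p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)

# Prefix-free code lengths l_i = log2(1/p_i) for this source.
lengths = [1, 2, 3, 3]

# Kraft's inequality: sum_i 2^{-l_i} <= 1 holds for any prefix-free code.
kraft_sum = sum(2 ** -l for l in lengths)

# Expected code length; for dyadic probabilities it meets the entropy bound exactly.
avg_len = sum(pi * l for pi, l in zip(p, lengths))

print(H, kraft_sum, avg_len)  # 1.75 1.0 1.75
```

Because the probabilities are powers of two, the expected code length achieves the source coding theorem's lower bound H(X) exactly; for general distributions, the optimal expected length lies in [H(X), H(X) + 1).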

**Author(s):** Venkatesan
Guruswami and Mahdi Cheraghchi
