
Information Theory and its applications in theory of computation

This note covers the following topics: Entropy, Kraft's inequality, Source coding theorem, conditional entropy, mutual information, KL-divergence and connections, KL-divergence and Chernoff bounds, Data processing and Fano's inequalities, Asymptotic Equipartition Property, Universal source coding: Lempel-Ziv algorithm and proof of its optimality, Source coding via typical sets and universality, joint typicality and joint AEP, discrete channels and channel capacity, Proof of Noisy channel coding theorem, Constructing capacity-achieving codes via concatenation, Polarization, Arikan's recursive construction of a polarizing invertible transformation, Polar codes construction, Bregman's theorem, Shearer's Lemma and applications, Source coding and Graph entropy, Monotone formula lower bounds via graph entropy, Optimal set Disjointness lower bound and applications, Compression of arbitrary communication protocols, Parallel repetition of 2-prover 1-round games.

Author(s):

NA Pages
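As a quick illustration of the first few topics listed for this note (entropy, KL-divergence, and Kraft's inequality), here is a minimal Python sketch, written for this listing rather than taken from the note itself; the function names and example distributions are chosen purely for illustration.

    import math

    def entropy(p):
        # Shannon entropy H(p) in bits.
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def kl_divergence(p, q):
        # KL-divergence D(p || q) in bits; assumes q_i > 0 wherever p_i > 0.
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    def kraft_sum(lengths, alphabet_size=2):
        # Kraft sum of codeword lengths; a prefix code with these lengths
        # exists if and only if the sum is at most 1.
        return sum(alphabet_size ** (-l) for l in lengths)

    p = [0.5, 0.25, 0.125, 0.125]
    q = [0.25, 0.25, 0.25, 0.25]
    print(entropy(p))               # 1.75 bits
    print(kl_divergence(p, q))      # 0.25 bits
    print(kraft_sum([1, 2, 3, 3]))  # 1.0: lengths of an optimal prefix code for p

For this example the expected codeword length, 1*0.5 + 2*0.25 + 3*0.25 = 1.75 bits, equals the entropy exactly, which is the boundary case of the source coding theorem covered in the note.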
Similar Books
Information Theory in Computer Science

This note explains the following topics: Shearer's Lemma, Entropy, Relative Entropy, Hypothesis testing, total variation distance and Pinsker's lemma, Stability in Shearer's Lemma, Communication Complexity, Set Disjointness, Direct Sum in Communication Complexity and Internal Information Complexity, Data Structure Lower Bounds via Communication Complexity, Algorithmic Lovasz Local Lemma, Parallel Repetition Theorem, Graph Entropy and Sorting.

NA Pages
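As a small illustration of two topics listed for this note (total variation distance and Pinsker's lemma), the following minimal Python sketch, written for this listing and not taken from the note, checks Pinsker's inequality TV(p, q) <= sqrt(D(p || q) / 2), with D measured in nats, on a pair of small distributions.

    import math

    def kl_nats(p, q):
        # Relative entropy D(p || q) in nats; assumes q_i > 0 wherever p_i > 0.
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    def total_variation(p, q):
        # Total variation distance: half the L1 distance between p and q.
        return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

    p = [0.6, 0.4]
    q = [0.5, 0.5]
    tv = total_variation(p, q)            # 0.1
    bound = math.sqrt(kl_nats(p, q) / 2)  # about 0.1003
    print(tv, bound, tv <= bound)         # Pinsker's inequality holds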
A Short Course in Information Theory (D. MacKay)

This section currently contains no detailed description for this title; it will be updated soon.

NA Pages
