This note explains the following topics: Shearer's Lemma, entropy, relative entropy, hypothesis testing, total variation distance and Pinsker's lemma, stability in Shearer's Lemma, communication complexity, set disjointness, direct sums in communication complexity and internal information complexity, data structure lower bounds via communication complexity, the algorithmic Lovász Local Lemma, the Parallel Repetition Theorem, and graph entropy and sorting.
This book explains the basics of thermodynamics, including thermodynamic potentials, the microcanonical and canonical distributions, evolution in phase space, the inevitability of irreversibility, the basics of information theory, applications of information theory, the new second law of thermodynamics, and quantum information.
This PDF covers the following topics related to Information Theory: Introduction; Entropy, Relative Entropy, and Mutual Information; the Asymptotic Equipartition Property; Communication and Channel Capacity; the Method of Types; Conditional and Joint Typicality; Lossy Compression and Rate Distortion Theory; and Joint Source Channel Coding.
This PDF covers the following topics related to Information Theory: information measures, lossless data compression, binary hypothesis testing, channel coding, lossy data compression, and advanced topics.
This is a graduate-level introduction to the mathematics of information theory. The note covers both classical and modern topics, including information entropy, lossless data compression, binary hypothesis testing, channel coding, and lossy data compression.