Information Theory and its applications in theory of computation
This note covers the following topics: entropy, Kraft's inequality, the source
coding theorem, conditional entropy, mutual information, KL-divergence and
connections, KL-divergence and Chernoff bounds, the data processing and Fano
inequalities, the asymptotic equipartition property, universal source coding
(the Lempel-Ziv algorithm and a proof of its optimality), source coding via
typical sets and universality, joint typicality and the joint AEP, discrete
channels and channel capacity, a proof of the noisy channel coding theorem,
constructing capacity-achieving codes via concatenation, polarization, Arikan's
recursive construction of a polarizing invertible transformation, polar code
construction, Bregman's theorem, Shearer's lemma and applications, source coding
and graph entropy, monotone formula lower bounds via graph entropy, an optimal
set disjointness lower bound and applications, compression of arbitrary
communication protocols, and parallel repetition of 2-prover 1-round games.
Author(s): Venkatesan
Guruswami and Mahdi Cheraghchi
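As a taste of the first two topics in these notes, here is a minimal Python
sketch (my own illustration, not code from the notes) that computes the Shannon
entropy of a small distribution and checks Kraft's inequality for a set of
prefix-code lengths:

    from math import log2

    def entropy(p):
        # Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
        return -sum(x * log2(x) for x in p if x > 0)

    # Codeword lengths of a prefix code over a 4-symbol alphabet.
    lengths = [1, 2, 3, 3]
    kraft_sum = sum(2 ** -l for l in lengths)

    print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
    print(kraft_sum)  # 1.0 <= 1, so a prefix code with these lengths exists

For this particular distribution the lengths [1, 2, 3, 3] match the symbol
surprisals exactly, so the average codeword length equals the entropy.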
This note serves as a comprehensive guide to fundamental concepts in
information theory and coding, covering discrete probability theory,
information theory, and coding principles. Beginning with Shannon's measure of
information, it then turns to the efficient coding of information and
introduces the methodology of typical sequences, emphasizing the distinction
between lossy and lossless source encoding. The text also discusses coding for
noisy digital channels, block coding principles, and tree and trellis coding
principles.
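The typical-sequence methodology mentioned above can be observed numerically:
for an i.i.d. Bernoulli(p) source, the per-symbol information
-(1/n) log2 P(x^n) of a random sequence concentrates around the entropy H(p).
A minimal sketch (my own, under these assumptions):

    import random
    from math import log2

    def h2(p):
        # Binary entropy in bits.
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def empirical_rate(n, p):
        x = [1 if random.random() < p else 0 for _ in range(n)]
        logprob = sum(log2(p) if b else log2(1 - p) for b in x)
        return -logprob / n

    p, n = 0.1, 100_000
    print(h2(p))                 # about 0.469 bits per symbol
    print(empirical_rate(n, p))  # close to h2(p): the drawn sequence is typical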
This lecture note moves through information theory, statistics, and measure
theory. It covers fundamental concepts such as definitions, chain rules, data
processing inequalities, and divergences, and extends to optimal procedures,
Le Cam's and Fano's inequalities, and operational results such as entropy and
source coding. It also focuses on exponential families and statistical
modeling, fitting procedures and lower bounds on testing parameters,
sub-Gaussian and sub-exponential random variables, martingale methods, and
uniformity, covering topics such as Kullback-Leibler divergence, PAC-Bayes
bounds, interactive data analysis, and error bounds.
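Two of the objects driving the lower-bound machinery described above are the
Kullback-Leibler divergence and its relation to total variation via Pinsker's
inequality. A minimal Python sketch (my own illustration):

    from math import log, log2, sqrt

    def kl(p, q):
        # D(p || q) = sum_i p_i * log2(p_i / q_i), in bits.
        return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    def tv(p, q):
        # Total variation distance.
        return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

    p, q = [0.5, 0.5], [0.8, 0.2]
    # Pinsker's inequality (KL in bits): tv(p, q) <= sqrt(ln(2)/2 * D(p || q))
    print(kl(p, q))                     # ~0.322 bits
    print(tv(p, q))                     # 0.3
    print(sqrt(log(2) / 2 * kl(p, q)))  # ~0.334 >= 0.3, as Pinsker promises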
This book covers the following topics: Information Theory
for Discrete Variables, Information Theory for Continuous Variables, Channel
Coding, Typical Sequences and Sets, Lossy Source Coding, Distributed Source
Coding, and Multiaccess Channels.
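For the channel-coding chapter, the simplest closed-form example is the binary
symmetric channel, whose capacity is C = 1 - H2(p) for crossover probability p.
A minimal sketch (mine, not the book's code):

    from math import log2

    def h2(p):
        # Binary entropy function, in bits.
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_capacity(p):
        # Capacity of the binary symmetric channel: C = 1 - h2(p) bits per use.
        return 1 - h2(p)

    print(bsc_capacity(0.0))   # 1.0: a noiseless bit pipe
    print(bsc_capacity(0.11))  # ~0.5: 11% bit flips halve the usable rate
    print(bsc_capacity(0.5))   # 0.0: the output is independent of the input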
This book explains the basics of thermodynamics, including thermodynamic
potentials, microcanonical and canonical distributions, and evolution in phase
space; the inevitability of irreversibility; the basics of information theory;
applications of information theory; the new second law of thermodynamics; and
quantum information.
The PDF covers the following topics related to Information
Theory: Information Theory for Data Communications and Processing, On the
Information Bottleneck Problems: Models, Connections, Applications and
Information Theoretic Views, Variational Information Bottleneck for Unsupervised
Clustering: Deep Gaussian Mixture Embedding, Asymptotic Rate-Distortion Analysis
of Symmetric Remote Gaussian Source Coding: Centralized Encoding vs. Distributed
Encoding, Non-Orthogonal eMBB-URLLC Radio Access for Cloud Radio Access Networks
with Analog Fronthauling, Robust Baseband Compression Against Congestion in
Packet-Based Fronthaul Networks Using Multiple Description Coding, Amplitude
Constrained MIMO Channels: Properties of Optimal Input Distributions and Bounds
on the Capacity, Quasi-Concavity for Gaussian Multicast Relay Channels, Gaussian
Multiple Access Channels with One-Bit Quantizer at the Receiver, Efficient
Algorithms for Coded Multicasting in Heterogeneous Caching Networks,
Cross-Entropy Method for Content Placement and User Association in Cache-Enabled
Coordinated Ultra-Dense Networks, Symmetry, Outer Bounds, and Code
Constructions: A Computer-Aided Investigation on the Fundamental Limits of
Caching.
The PDF covers the following topics
related to Information Theory: Foundations: probability, uncertainty,
information, Entropies defined, and why they are measures of information, Source
coding theorem; prefix, variable-, and fixed-length codes, Channel types,
properties, noise, and channel capacity, Continuous information, density, noisy
channel coding theorem, Fourier series, convergence, orthogonal representation,
Useful Fourier theorems, transform pairs, Sampling, aliasing, Discrete Fourier
transform, Fast Fourier Transform Algorithms, The quantised degrees-of-freedom
in a continuous signal, Gabor-Heisenberg-Weyl uncertainty relation, Kolmogorov
complexity.
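As an illustration of the Fourier material in this course, the following
sketch (my own, assuming NumPy is available) computes a discrete Fourier
transform with the FFT, checks Parseval's theorem under NumPy's normalization,
and recovers the two frequencies present in a test signal:

    import numpy as np

    n = 256
    t = np.arange(n)
    x = np.sin(2 * np.pi * 5 * t / n) + 0.5 * np.cos(2 * np.pi * 12 * t / n)

    X = np.fft.fft(x)

    # Parseval's theorem (NumPy convention): sum |x|^2 == (1/n) * sum |X|^2
    print(np.allclose(np.sum(np.abs(x) ** 2), np.sum(np.abs(X) ** 2) / n))

    # The two largest spectral magnitudes sit at the signal's frequencies.
    peaks = np.argsort(np.abs(X[: n // 2]))[-2:]
    print(sorted(peaks.tolist()))  # [5, 12]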