Basics of information theory

This book explains the basics of thermodynamics, including thermodynamic potentials, microcanonical and canonical distributions, and evolution in phase space; the inevitability of irreversibility; the basics of information theory; applications of information theory; the new second law of thermodynamics; and quantum information.
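
As a minimal illustration of the bridge the book draws between thermodynamics and information, the sketch below computes Landauer's bound, the minimum heat dissipated when one bit of information is erased. The bound itself is standard physics; its use here as an example is an assumption for illustration, not an excerpt from the book.

```python
# Illustrative sketch only (not from the book): Landauer's bound, a standard
# quantitative link between information and thermodynamics. Erasing one bit
# of information dissipates at least k_B * T * ln(2) joules of heat.
from math import log

BOLTZMANN_K = 1.380649e-23  # Boltzmann constant, in J/K


def landauer_limit(temperature_kelvin: float, bits: float = 1.0) -> float:
    """Minimum heat (J) dissipated when erasing `bits` bits at temperature T."""
    return bits * BOLTZMANN_K * temperature_kelvin * log(2)


# At room temperature (300 K), erasing a single bit costs at least ~2.9e-21 J.
print(f"Landauer limit at 300 K: {landauer_limit(300):.3e} J per bit")
```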

Author(s):

165 Pages
Similar Books
Lecture Notes on Information Theory by Prof. Dr. rer. nat. Rudolf Mathar

These lecture notes cover an introduction, the fundamentals of information theory, source coding, and information channels.
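
As a self-contained example of the information channels material named above (the specific channel model is an assumption, not something quoted from the notes), the sketch below computes the capacity of a binary symmetric channel, C = 1 - H_b(eps) bits per channel use.

```python
# Minimal sketch (assumed example, not taken from these lecture notes):
# capacity of a binary symmetric channel with crossover probability eps,
# C = 1 - H_b(eps) bits per channel use, where H_b is the binary entropy.
from math import log2


def binary_entropy(eps: float) -> float:
    """Binary entropy H_b(eps) in bits; defined as 0 at eps = 0 or 1."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * log2(eps) - (1 - eps) * log2(1 - eps)


def bsc_capacity(eps: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability eps."""
    return 1.0 - binary_entropy(eps)


for eps in (0.0, 0.05, 0.11, 0.5):
    print(f"eps = {eps:4.2f}  ->  C = {bsc_capacity(eps):.3f} bits/channel use")
# A noiseless channel (eps = 0) has capacity 1 bit per use; a completely
# random channel (eps = 0.5) has capacity 0.
```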

59 Pages
Information Theory and Coding (Cambridge)

The PDF covers the following topics related to information theory: foundations (probability, uncertainty, information); entropies defined, and why they are measures of information; the source coding theorem; prefix, variable-length, and fixed-length codes; channel types, properties, noise, and channel capacity; continuous information, density, and the noisy channel coding theorem; Fourier series, convergence, and orthogonal representation; useful Fourier theorems and transform pairs; sampling and aliasing; the discrete Fourier transform and Fast Fourier Transform algorithms; the quantised degrees of freedom in a continuous signal; the Gabor-Heisenberg-Weyl uncertainty relation; and Kolmogorov complexity.
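
The sketch below is a small, assumed illustration (the four-symbol source and the prefix code are hypothetical, not drawn from the PDF) of two of the listed topics: entropy as a measure of information, and the source coding theorem's guarantee that the expected length of a uniquely decodable code cannot fall below the entropy.

```python
# Minimal sketch: Shannon entropy of a source versus the expected length of a
# prefix code for it. The distribution and code below are hypothetical examples.
from math import log2

# Hypothetical source with four symbols and their probabilities.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# A hypothetical prefix code (no codeword is a prefix of another).
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

# Shannon entropy H(X) = -sum p(x) log2 p(x), in bits per symbol.
entropy = -sum(p * log2(p) for p in probs.values())

# Expected codeword length L = sum p(x) * len(codeword(x)).
avg_length = sum(probs[s] * len(code[s]) for s in probs)

print(f"H(X) = {entropy:.3f} bits/symbol")
print(f"L    = {avg_length:.3f} bits/symbol")
# The source coding theorem guarantees L >= H(X); here the code meets the
# bound with equality because every probability is a power of 1/2.
assert avg_length >= entropy - 1e-9
```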

75 Pages
Information Theory in Computer Science

This note explains the following topics: Shearer's Lemma; Entropy; Relative Entropy; Hypothesis Testing, Total Variation Distance, and Pinsker's Lemma; Stability in Shearer's Lemma; Communication Complexity; Set Disjointness; Direct Sum in Communication Complexity and Internal Information Complexity; Data Structure Lower Bounds via Communication Complexity; the Algorithmic Lovász Local Lemma; the Parallel Repetition Theorem; and Graph Entropy and Sorting.
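
As a brief illustration of the relative entropy, total variation distance, and Pinsker's lemma topics listed above, the sketch below checks Pinsker's inequality, TV(P, Q) <= sqrt(D(P||Q)/2) with D measured in nats, on two hypothetical distributions; the numbers are assumptions for demonstration, not taken from the note.

```python
# Minimal sketch (hypothetical distributions): relative entropy (KL divergence),
# total variation distance, and a numerical check of Pinsker's inequality.
from math import log, sqrt

# Two hypothetical distributions on the same three-element alphabet.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Relative entropy D(P || Q) = sum p(x) * ln(p(x) / q(x)), in nats.
kl = sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Total variation distance: half the L1 distance between the distributions.
tv = 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

print(f"KL(P||Q) = {kl:.4f} nats")
print(f"TV(P,Q)  = {tv:.4f}")
print(f"Pinsker bound sqrt(KL/2) = {sqrt(kl / 2):.4f}")
assert tv <= sqrt(kl / 2) + 1e-12  # Pinsker's inequality holds
```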

NA Pages
Information Theory, Inference, and Learning Algorithms (David J.C. MacKay)

This section currently contains no detailed description of the page; it will be updated soon.

NA Pages
A Short Course in Information Theory (D. MacKay)

This section currently contains no detailed description of the page; it will be updated soon.

NA Pages
