# Free Information Theory Books


### This section contains free e-books and guides on Information Theory. Some of the resources can be viewed online, and some can be downloaded.

- **Information Theory Lecture Notes** (Prof. Yury Polyanskiy) | Online | NA Pages | English
  A graduate-level introduction to the mathematics of information theory. Covers both classical and modern topics, including information entropy, lossless data compression, binary hypothesis testing, channel coding, and lossy data compression.
- **Lecture Notes on Information Theory and Coding** (University of Siena) | Online | NA Pages | English
  Topics covered: measuring information; joint entropy; relative entropy and mutual information; sources with memory; the asymptotic equipartition property and source coding; channel capacity and coding; continuous sources and the Gaussian channel; rate-distortion theory.
- **Information Theory** (Himanshu Tyagi) | Online | NA Pages | English
  Topics covered: introduction to information theory; a simple data compression problem; transmission of two messages over a noisy channel; measures of information and their properties; source and channel coding; data compression; transmission over noisy channels; differential entropy; rate-distortion theory.
- **Lecture Notes on Information Theory** (Yihong Wu) | PDF | 342 Pages | English
  A graduate-level introduction to the mathematics of information theory. Topics include: information measures (entropy, divergence, and mutual information); sufficient statistics; extremization of mutual information; lossless data compression; channel coding; linear codes; lossy data compression; applications to statistical decision theory; the multiple-access channel; the entropy method in combinatorics and geometry.
- **Information Theory** (Yao Xie) | Online | NA Pages | English
  Explores the basic concepts of information theory; highly recommended for students planning to work in communications, data compression, or statistical signal processing. Topics include: entropy and mutual information; chain rules and inequalities; data processing; Fano's inequality; the asymptotic equipartition property; entropy rate; source coding and the Kraft inequality; optimal code length and roof code; Huffman codes; Shannon-Fano-Elias and arithmetic codes; maximum entropy; channel capacity; the channel coding theorem; differential entropy; the Gaussian channel; parallel Gaussian channels and water-filling; quantization and rate-distortion.
- **Information Theory, Inference, and Learning Algorithms** (David J.C. MacKay)
- **A Discipline Independent Definition of Information** (Robert M. Losee)
- **An Introduction to Information Theory and Entropy** (Tom Carter)
- **A Short Course in Information Theory** (D. MacKay)
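Most of the notes listed above open with Shannon entropy and the Kraft inequality for prefix codes. As a quick taste of both ideas, here is a minimal sketch (not taken from any of the listed texts; the function names are our own):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft inequality: codeword lengths l_i admit a binary
    prefix code iff sum(2**-l_i) <= 1."""
    return sum(2.0 ** -l for l in lengths)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))      # -> 1.0

# Lengths 1, 2, 3, 3 sum to exactly 1: a complete prefix code
# (e.g. 0, 10, 110, 111) exists.
print(kraft_sum([1, 2, 3, 3]))  # -> 1.0

# Lengths 1, 1, 2 sum to 1.25 > 1: no prefix code is possible.
print(kraft_sum([1, 1, 2]))     # -> 1.25
```

The source-coding material in these notes shows that the entropy of the source lower-bounds the expected codeword length of any such prefix code.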