PART A INFORMATION THEORY AND SOURCE CODING
1. Probability, Random Processes, and Noise
1.2 Fundamentals of Probability
1.2.3 Elementary Theorems on Probability
1.3 Random Variables and Their Characteristics
1.3.1 Discrete Random Variable and Probability Mass Function
1.3.2 Cumulative Distribution Function
1.3.3 Distribution Function for Discrete Random Variable
1.3.4 Continuous Random Variable and Probability Density Function
1.5 Frequently Used Probability Distributions
1.7.5 Flicker Noise or 1/f Noise
2.5.1 Discrete Memoryless Channel
2.6 Joint Entropy and Conditional Entropy
2.10.2 Additive White Gaussian Noise Channel
3.6.1 Image Formats, Containers, and Compression Standards
3.13 MPEG Audio and Video Coding Standards
3.14 Psychoacoustic Model of Human Hearing
3.14.3 Perceptual Coding in MPEG Audio
3.16 Linear Predictive Coding Model
4.4.1 Throughput Efficiency of ARQ
4.5.5 Arithmetic of the Binary Field
5.4.1 Undetectable Error Pattern
5.7 Error-detecting Capability
5.8 Error-correcting Capability
5.9 Standard Array and Syndrome Decoding
5.10 Probability of Undetected Errors Over a BSC
6.2.1 Generator and Parity-check Matrices
6.2.2 Realization of Cyclic Code
6.3 Syndrome Computation and Error Detection
7.6 Implementation of Galois Field
7.7 Implementation of Error Correction
7.7.2 Computation of Error Location Polynomial
8.7 Implementation and Modification
8.10 Interleaving Techniques: Block and Convolutional
8.11 Coding and Interleaving Applied to CD Digital Audio System
8.11.1 CIRC Encoding and Decoding
8.11.2 Interpolation and Muting
9.2 Plain Text, Cipher Text, and Key
9.3 Substitution and Transposition
9.5 Symmetric-key Cryptography
9.5.1 Stream Ciphers and Block Ciphers
9.6.4 Inverse Initial Permutation
9.8 Asymmetric-key Cryptography
9.10 Symmetric versus Asymmetric-key Cryptography
9.11 Diffie–Hellman Key Exchange