Math 567: Introduction to Coding Theory: Syllabus

Text (required): S. Roman, An Introduction to Coding and Information Theory, Springer-Verlag, New York, 1992.

This syllabus is for Winter Term 2014.


Schedule: lecture number and date, topics covered (with sections in Roman, Coding Theory), and HW due dates.


Lecture 1 (Thur. 1/9)
1.0 History: Shannon, Bell Labs; 1.1 Overview of Shannon theory; channels: source coding and channel coding

Lecture 2 (Tues. 1/14)
2.1 General coding problem: source and channel coding; 2.2 Motivating example: the Hamming [7,4,3] linear code; 2.3 Error probabilities for the Hamming code; 2.4 Information sources
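As an illustration of the motivating example in 2.2, here is a minimal Python sketch of the [7,4,3] Hamming code; the generator and parity check matrices below are one standard systematic choice, not necessarily the ones used in Roman.

```python
import numpy as np

# Generator and parity-check matrices for the [7,4,3] Hamming code,
# in systematic form G = [I | P], H = [P^T | I].
G = np.array([[1,0,0,0,0,1,1],
              [0,1,0,0,1,0,1],
              [0,0,1,0,1,1,0],
              [0,0,0,1,1,1,1]])
H = np.array([[0,1,1,1,1,0,0],
              [1,0,1,1,0,1,0],
              [1,1,0,1,0,0,1]])

msg = np.array([1,0,1,1])
codeword = msg @ G % 2            # encode: 4 message bits -> 7 code bits

received = codeword.copy()
received[2] ^= 1                  # flip one bit to simulate channel noise

syndrome = H @ received % 2       # nonzero syndrome flags an error
# The syndrome equals the column of H at the error position.
err_pos = next(i for i in range(7) if (H[:, i] == syndrome).all())
received[err_pos] ^= 1            # correct the single-bit error
assert (received == codeword).all()
```

Because every nonzero syndrome matches a unique column of H, any single bit error can be located and flipped back; this is why the minimum distance is 3.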

Lecture 3 (Thur. 1/16)
3.1 Discrete probability distributions, bivariate distributions; 3.2 Joint, marginal, and conditional probabilities; 3.3 Weatherman example

Lecture 4 (Tues. 1/21)
4.0 Entropy of a source [Sect. 1.1]; 4.1 Basic entropy inequalities [Sect. 1.2]; 4.2 Binomial coefficients and entropy [Sect. 1.2]
HW#1 due Tues. 1/21
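A small numerical sketch of the relation between binomial coefficients and entropy from Sect. 1.2, in one standard form of the estimate; the values in the comments are approximate.

```python
from math import comb, log2

def H(p):
    """Binary entropy in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0, 1):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# One standard form of the estimate:
# 2^{n H(k/n)} / (n + 1)  <=  C(n, k)  <=  2^{n H(k/n)}.
n, k = 100, 30
print(log2(comb(n, k)))            # ~84.6
print(n * H(k / n))                # ~88.1, exponent of the upper bound
print(n * H(k / n) - log2(n + 1)) # ~81.5, exponent of the lower bound
```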


Lecture 5 (Thur. 1/23)
5.0 Entropy properties (cont'd) [Sect. 1.2]; 5.1 Binomial coefficients and entropy (cont'd); 5.2 Shape of the binomial distribution; 5.3 Entropy as an expected value [Sect. 1.3]


[5b] Tues. 1/28: Class cancelled (weather)

Lecture 6 (Thur. 1/30)
6.0 Downward convexity of the entropy function [Sect. 1.3]; 6.1 Source coding: compression [Chap. 2]; 6.2 Variable-length codes, uniquely decipherable codes [Sect. 2.1]; 6.3 Kraft inequality theorem (necessity) [Sect. 2.1]
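A quick numerical check of the Kraft inequality from Sect. 2.1; the codeword lengths below are illustrative.

```python
def kraft_sum(lengths, r=2):
    """Kraft sum for codeword lengths over an r-ary alphabet.
    A prefix (instantaneous) code with these lengths exists iff the sum <= 1."""
    return sum(r ** (-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))   # 1.0  -> achievable, e.g. 0, 10, 110, 111
print(kraft_sum([1, 1, 2]))      # 1.25 -> no prefix code has these lengths
```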

Lecture 7 (Tues. 2/4)
7.0 Instantaneous codes [Sect. 2.1]; 7.1 Prefix tree for a prefix code; 7.2 Kraft's theorem (sufficiency); 7.3 Entropy lower bound for average length [Sect. 2.3]; 7.4 Jensen's convexity inequality (started)
HW#2 due Tues. 2/4


Lecture 8 (Thur. 2/6)
8.0 Jensen's convexity inequality (handout); 8.0b Noiseless codes and trees; 8.1 Huffman coding (N=2) example [Sect. 2.2]; 8.2 Huffman coding algorithm (Thm. 2.2.2), proof by trees
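A minimal binary (N=2) Huffman coder sketching the merge-two-least-probable step of Sect. 2.2; the source distribution is an invented example.

```python
import heapq
from itertools import count

def huffman(probs):
    """Binary Huffman code: repeatedly merge the two least probable
    symbols. Returns {symbol: codeword}."""
    tiebreak = count()
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

code = huffman({"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1})
print(code)   # {'a': '0', 'b': '10', 'c': '111', 'd': '110'}
# Average length 1.9 bits; source entropy is ~1.846 bits, consistent
# with the entropy lower bound of Sect. 2.3.
```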

Lecture 9 (Tues. 2/11)
9.0 Huffman code, base 3 example [Sect. 2.2]; 9.1 Noiseless coding theorem [Sect. 2.3]; 9.2 Extensions of the source theorem [Sect. 2.3]; 9.3 Lempel-Ziv encoding [with handout]
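Since the Lempel-Ziv handout is not reproduced here, the sketch below is a generic LZ78-style parse, one common variant of Lempel-Ziv; details may differ from the handout.

```python
def lz78_parse(s):
    """LZ78-style parsing: split s into phrases, each equal to a
    previously seen phrase plus one new symbol. Emits
    (dictionary index, new symbol) pairs."""
    phrases = {"": 0}
    out, cur = [], ""
    for ch in s:
        if cur + ch in phrases:
            cur += ch                       # extend the current match
        else:
            out.append((phrases[cur], ch))  # emit (prefix index, symbol)
            phrases[cur + ch] = len(phrases)
            cur = ""
    if cur:                                 # flush a trailing partial phrase
        out.append((phrases[cur[:-1]], cur[-1]))
    return out

print(lz78_parse("ababababab"))
# [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (2, 'a'), (0, 'b')]
```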

Lecture 10 (Thur. 2/13)
10.0 Source coding [noiseless coding]: review of Chap. 2; 10.1 Discrete memoryless channel, channel matrix [Sect. 3.1]; 10.2 Channel distribution: conditional entropy; 10.3 Mutual information I(X; Y); channel capacity [Sect. 3.2]; 10.4 Types of channels [Sect. 3.1]
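A sketch of mutual information computed from an input distribution and a channel matrix, as in Sects. 3.1-3.2; the binary symmetric channel numbers are illustrative.

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits, for input distribution p_x and channel matrix
    channel[i][j] = P(Y = j | X = i)."""
    p_xy = p_x[:, None] * channel          # joint distribution P(X, Y)
    p_y = p_xy.sum(axis=0)                 # output marginal P(Y)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_xy * np.log2(p_xy / (p_x[:, None] * p_y))
    return np.nansum(terms)                # 0 log 0 terms contribute 0

# Binary symmetric channel, crossover 0.1, uniform input:
bsc = np.array([[0.9, 0.1],
                [0.1, 0.9]])
print(mutual_information(np.array([0.5, 0.5]), bsc))   # ~0.531 bits
```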

Lecture 11 (Tues. 2/18)
11.0 Channels and three probability matrices; 11.0b Conditional entropy (review); 11.1 Types of channels [Sect. 3.1]: row symmetric, column symmetric, symmetric; lossless; noiseless; deterministic; 11.2 Channel capacity [Sect. 3.2]: lossless capacity, deterministic capacity; 11.3 Symmetric channel entropy theorem (Thm. 3.2.3); capacity of the binary symmetric channel
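The binary symmetric channel capacity from 11.3, C = 1 - H(p), evaluated directly; at uniform input it matches the mutual information computed in the sketch after Lecture 10.

```python
from math import log2

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover
    probability p: C = 1 - H(p), achieved by a uniform input."""
    if p in (0, 1):
        return 1.0
    return 1 + p * log2(p) + (1 - p) * log2(1 - p)

print(bsc_capacity(0.0))   # 1.0   (noiseless)
print(bsc_capacity(0.1))   # ~0.531
print(bsc_capacity(0.5))   # 0.0   (pure noise: nothing gets through)
```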

Lecture 12 (Thur. 2/20)
12.0 Channel capacity: review of Sect. 3.2; 12.0b Lossless, noiseless channels; 12.1 Block codes and their rates [Sect. 3.3]; 12.2 Channel transmission and decoding; 12.3 Decoder performance; 12.4 Ideal observer decoder [Thms. 3.3.1, 3.3.2]; 12.5 Noisy coding theorem statement

Lecture 13 (Tues. 2/25)
13.0 Row symmetric channels (review) [Sect. 3.1]; 13.1 Block codes [Sect. 3.3]; 13.2 Decision schemes, error rate; 13.3 Maximum likelihood decoding
HW#3 due Tues. 2/25

Lecture 14 (Thur. 2/27)
14.0 Midterm Exam (in class) [modified take-home exam, distributed Tues. 2/25]


3/1-3/8: Winter Recess


Lecture 15 (Tues. 3/11)
15.1 Noisy coding theorem statement [Sect. 3.3]; 15.2 Fano's inequality [Sect. 3.3]; 15.3 Weak converse of the noisy coding theorem [Sect. 3.4]

Lecture 16 (Thur. 3/13)
16.1 Noisy coding theorem proof for the BSC [Sect. 3.4]; 16.2 Hamming balls; 16.3 Reduction of maximum error to average error; 16.4 Proof of the BSC noisy coding theorem: average error case

Lecture 17 (Tues. 3/18)
17.0a Fano inequality handout [Sect. 3.3]; 17.0b Noisy coding theorem review [Sect. 3.4]; 17.1 Minimum distance decoding [Sect. 4.2]; 17.2 Error detection and correction; 17.3 Packing and covering radii; 17.4 Nonlinear code parameters
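A brute-force illustration of minimum distance decoding and the packing radius t = floor((d-1)/2) from Sect. 4.2; the repetition code is a stock example, not necessarily the one used in lecture.

```python
from itertools import combinations

def min_distance(code):
    """Minimum Hamming distance of a (possibly nonlinear) block code."""
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in combinations(code, 2))

def nearest(code, received):
    """Minimum distance decoding: pick the closest codeword."""
    return min(code, key=lambda c: sum(a != b for a, b in zip(c, received)))

# Binary repetition code of length 5: d = 5, so it corrects
# t = (d - 1) // 2 = 2 errors (packing radius 2).
code = ["00000", "11111"]
d = min_distance(code)
print(d, (d - 1) // 2)          # 5 2
print(nearest(code, "01010"))   # '00000' (two bit flips corrected)
```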

Lecture 18 (Thur. 3/20)
18.0 General existence problems for codes [Sect. 4.2]; 18.1 Maximal codes, maximal distance decoding [Sect. 4.2]; 18.2 Perfect codes; 18.3 Finite fields versus finite rings; 18.4 Equivalence of codes


Lecture 19 (Tues. 3/25)
19.0 Scalar multiple equivalence of codes [Sect. 4.3]; 19.1 Linear codes; 19.2 Cyclic codes; 19.3 New codes from old: A. extending a code, B. puncturing a code, C. expunging a code, D. augmenting a code, E. shortening a code, F. direct sum construction

Lecture 20 (Thur. 3/27)
20.1 New codes from old (cont'd): the (u, u+v) construction [Sect. 4.3]; 20.2 Linear codes: generator matrices [Sect. 5.1]; 20.3 Dual of a linear code: dual generator matrix

Lecture 21 (Tues. 4/1)
21.0 Linear codes, dual codes [Sect. 5.1]; 21.1 Parity check matrix and minimum distance; 21.2 Gilbert-Varshamov lower bound; 21.3 Syndrome decoding of a linear code
HW#4 due Tues. 4/1
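A sketch of syndrome decoding via coset leaders (21.3); the [5,2,3] code below is an invented example, not one from the text.

```python
import numpy as np
from itertools import product

# A binary [5,2,3] linear code in systematic form: G = [I | P], H = [P^T | I].
G = np.array([[1,0,1,1,0],
              [0,1,1,0,1]])
H = np.array([[1,1,1,0,0],
              [1,0,0,1,0],
              [0,1,0,0,1]])

# Syndrome table: each syndrome maps to a coset leader, i.e. a
# minimum-weight error pattern producing that syndrome.
leaders = {}
for e in sorted(product([0, 1], repeat=5), key=sum):
    s = tuple(H @ np.array(e) % 2)
    leaders.setdefault(s, np.array(e))

def decode(r):
    """Syndrome decoding: look up the coset leader and subtract it."""
    s = tuple(H @ r % 2)
    return (r + leaders[s]) % 2

c = np.array([1, 0]) @ G % 2   # encode message (1,0) -> 10110
r = c.copy(); r[4] ^= 1        # one channel error
print(decode(r), c)            # both [1 0 1 1 0]
```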

Lecture 22 (Thur. 4/3)
22.1 Hamming codes [Sect. 6.1]; 22.2 Simplex codes [Sect. 6.1]; 22.3 Majority logic decoding [Sect. 5.1]; 22.4 Burst errors [Sect. 5.1]

Lecture 23 (Tues. 4/8)
23.0 Burst errors [Sect. 5.1]; 23.1 Binary Golay code G_{24} [Sect. 6.1]; 23.2 Upper bounds for nonlinear codes [Sect. 4.5]; 23.3 Singleton bound; 23.4 Sphere-packing bound; 23.5 Plotkin bound; 23.6 Example: [n, d] = [8, 5] for q = 2
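The three upper bounds of Sect. 4.5 evaluated on the lecture's example [n, d] = [8, 5], q = 2; the Plotkin bound is used in its odd-d form.

```python
from math import comb

# Upper bounds on A(n, d), the largest size of a binary code with
# length n and minimum distance d, for n = 8, d = 5.
n, d, q = 8, 5, 2

singleton = q ** (n - d + 1)                 # 16
t = (d - 1) // 2                             # corrects t = 2 errors
sphere_packing = q**n // sum(comb(n, i) for i in range(t + 1))  # 256//37 = 6
# Plotkin bound for binary codes, odd d with 2d > n:
# A(n, d) <= 2 * floor((d + 1) / (2d + 1 - n)).
plotkin = 2 * ((d + 1) // (2 * d + 1 - n))   # 2 * (6 // 3) = 4

print(singleton, sphere_packing, plotkin)    # 16 6 4
# The Plotkin bound is tight here: A(8, 5) = 4.
```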

Lecture 24 (Thur. 4/10)
24.0 Homework 4 review; 24.1 Asymptotic bounds for codes [Sect. 4.5]; 24.2 Weight enumerators [Sect. 5.2]; 24.3 MacWilliams identities (stated)
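A symbolic check of the binary MacWilliams identity stated in 24.3, using the standard weight enumerator of the [7,4] Hamming code; sympy is assumed available.

```python
from sympy import symbols, expand, Rational

x, y = symbols("x y")

# Weight enumerator of the [7,4] Hamming code (a standard fact):
# W_C(x, y) = x^7 + 7 x^4 y^3 + 7 x^3 y^4 + y^7.
W = x**7 + 7*x**4*y**3 + 7*x**3*y**4 + y**7

# Binary MacWilliams identity: W_{C^perp}(x, y) = (1/|C|) W_C(x+y, x-y).
W_dual = expand(Rational(1, 16) * W.subs({x: x + y, y: x - y},
                                         simultaneous=True))
print(W_dual)   # x**7 + 7*x**3*y**4: the [7,3] simplex code, all
                # nonzero words of weight 4
```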

Lecture 25 (Tues. 4/15) (Harm Derksen)
25.1 MacWilliams identity I [Sect. 5.2]; 25.2 Krawtchouk polynomials; 25.3 MacWilliams identity II; 25.4 Characters on F_q and V(n, F_q)


Lecture 26 (Thur. 4/17)
26.1 Weight enumerators of perfect codes; binary Golay code G_{23}; 26.2 Characters and linear codes; 26.3 Proof of MacWilliams identity I
HW#5 due Thur. 4/17

Lecture 27 (Tues. 4/22)
27.0 Information theory versus coding theory: entropy of the source, of the channel, of coding and decoding; 27.1 MacWilliams identities: applications; 27.2 Singleton bound [Sect. 5.3]; 27.3 Maximum distance separable (MDS) codes

Final Exam: Fri. 4/25, 4:00pm-6:00pm