I rate this module as inspiring and difficult. It was the first time Dr. Aggarwal taught this module, so both the class and the lecturer needed time to get familiar with each other. He adjusted his pace to our performance throughout the semester, slowing down towards the end. He is very helpful.
On the other hand, the lecturer only uploaded his handwritten notes to IVLE (though there is a downloadable book to follow), which is not very reader-friendly: the scanned notes were not clear and the orientation was wrong. He does not use slides, and his writing on the whiteboard is smaller than most lecturers', so I suggest you sit in front. If Dr. Aggarwal sees this post, I would suggest he learn from Prof. Gan Wee Teck and Prof. Dilip from our maths department. Both of them write excellent notes on the board.
The content is about information theory (of course…). We covered the proofs of Shannon's source coding and channel coding theorems, the proof of correctness of Reed–Solomon (RS) codes, and some other related topics such as information-theoretic cryptography and randomness extractors. The textbook, MacKay's "Information Theory, Inference, and Learning Algorithms", is freely downloadable; you may refer to that.
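For readers unfamiliar with the source coding theorem mentioned above: informally, it says a memoryless source with distribution p can be compressed to about H(X) bits per symbol on average, and no further. Here is a minimal sketch (the 4-symbol distribution is just a made-up example, not from the course):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical biased 4-symbol source: by the source coding theorem, its
# average optimal code length is about H(X) bits/symbol.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))  # 1.75 bits/symbol, versus 2 bits for a uniform 4-symbol source
```

The distribution above is chosen so the entropy is exact: a Huffman code with lengths 1, 2, 3, 3 achieves 1.75 bits/symbol, matching the bound.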
The course requires a good understanding of basic algorithm analysis and probability, including joint distributions of multiple random variables, tail bounds for extreme values (Markov, Chernoff…), and the union bound. It is DANGEROUS to take this module without such knowledge.
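To give a feel for the tail bounds listed above, here is a quick sketch comparing Markov's inequality and a standard Chernoff bound on a binomial tail (the numbers n = 100, a = 75 are my own example, not from the module):

```python
import math

# X = number of heads in n fair coin flips, i.e. Binomial(n, 1/2).
# We bound P(X >= a) for a threshold a above the mean.
n = 100
mean = n * 0.5        # E[X] = 50
a = 75

# Markov's inequality: P(X >= a) <= E[X] / a.
markov = mean / a

# A common Chernoff bound: P(X >= (1+d)*mu) <= exp(-d^2 * mu / 3), 0 < d <= 1.
d = a / mean - 1      # here d = 0.5
chernoff = math.exp(-d ** 2 * mean / 3)

# Exact tail probability from the binomial distribution, for comparison.
exact = sum(math.comb(n, k) for k in range(a, n + 1)) / 2 ** n

print(markov, chernoff, exact)  # Markov ~0.67, Chernoff ~0.016, exact far smaller
```

The point the module drives home: Markov only uses the mean and gives a weak bound, while Chernoff exploits independence and decays exponentially in the deviation.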
Expected grade: A
Actual grade: A