This module covered simple linear regression and multiple linear regression under different model assumptions and developed some theory about their estimators. Dr Wang's lecture notes and some of the tutorial questions were lifted directly from the reference textbook, corresponding to the contents of Chapters 2, 3, 4, 5, 6, 7, 8, 12, 13 and 14. Mathematical proofs of the various results were often given in much less detail than in the textbook; for example, the Full-Reduced model test was presented without much context. Dr Wang is a pretty nice lecturer and tries to engage the class, but she speaks with a pretty strong Chinese accent, and sitting through 2 hours of lecture consisting of her chanting out the notes wasn't a good use of my time; I could probably have read the textbook in less time and in greater detail on my own.
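For anyone else who finds the Full-Reduced model test dropped on them without context, here is the gist in the standard textbook formulation (my own summary, not Dr Wang's notation): you fit a full model with p parameters and a reduced model with q < p parameters, and compare their error sums of squares with

\[
F = \frac{\left(\mathrm{SSE}_{\text{reduced}} - \mathrm{SSE}_{\text{full}}\right)/(p - q)}{\mathrm{SSE}_{\text{full}}/(n - p)} \;\sim\; F_{\,p-q,\; n-p} \quad \text{under } H_0,
\]

where n is the number of observations and H_0 says the extra parameters in the full model are all zero. A large F means the full model explains significantly more variation than the reduced one.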
One needs to be comfortable with linear algebra to do well in this module; the proofs and tutorial questions are not difficult, but they require some mathematical maturity to produce. This proved difficult for most stats majors, since they haven't done linear algebra or proof-intensive modules for the past 3 years. I intentionally took this module to leverage my math background, and because it complemented MA4230 Matrix Computation pretty well.