🎓 Intended learning outcomes
At the end of this lesson, students are expected to:
- Be aware of any gaps in their linear algebra background that need revision in order to follow the course’s core material.
- Have familiarized themselves with the basics of manipulating vectors and matrices using Python’s NumPy library, and be able to produce basic scatter plots with Matplotlib.
- Have refreshed their background in probability theory: be able to recall and explain its basic concepts, and to model simple probabilistic experiments by defining a sample space, an event space, and an appropriate probability measure or probability density.
- Be familiar with the notation used in this course for probability measures, probability densities, expectation, variance, etc.
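To check the NumPy/Matplotlib outcome above, here is a minimal sketch of the kind of vector and matrix manipulation, and basic scatter plotting, you should be comfortable with. The array values and file name are illustrative, not part of the course material.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt

# Basic vector and matrix manipulation with NumPy
x = np.array([1.0, 2.0, 3.0])            # a vector in R^3
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])          # a 3x3 matrix

y = A @ x                                # matrix-vector product Ax
inner = x @ y                            # inner product <x, y>
norm_x = np.linalg.norm(x)               # Euclidean norm ||x||

# A basic scatter plot with Matplotlib
rng = np.random.default_rng(0)
points = rng.standard_normal((100, 2))   # 100 random points in R^2
plt.scatter(points[:, 0], points[:, 1])
plt.xlabel("x1")
plt.ylabel("x2")
plt.savefig("scatter.png")               # illustrative output file name
```

If any line here is unfamiliar, the NumPy quickstart and Matplotlib pyplot tutorial cover everything used above.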
1. Linear Algebra Revision
For a concise revision of linear algebra, we suggest you revisit the course notes of your previous course on the topic, or chapter 2 of the freely available book Mathematics for Machine Learning by Marc Deisenroth et al. for a summary review: https://mml-book.github.io/book/mml-book.pdf
You should be familiar with the basic notions of linear algebra. If any of the following do not sound familiar, please revise:
- The notion of a linear map between vector spaces: $L:V\to W$, i.e. satisfying $L(\lambda x + \mu y) = \lambda L(x)+\mu L(y)$ for $\lambda, \mu\in \mathbb{R}$ and $x, y\in V$.
- Vector spaces and vectors. We will work with real D-dimensional vector spaces in this course, where $V=\mathbb{R}^D$, with a vector in $V$ denoted by $x\in V$.
- The notion of a basis $\{v_1, \ldots, v_D\} \subset V$ of a D-dimensional vector space $V$.
- The notion of an inner product $\langle x, y\rangle \in \mathbb{R}$ for $x, y\in V$ on a D-dimensional vector space $V$.
- The notion of an orthonormal basis with respect to an inner product.
- The notion of a matrix $M$ representing a linear map $L:V\to W$ with respect to a choice of basis for $V$ and $W$.
- The notion of eigenspaces and eigenvectors of a matrix; diagonalization of a symmetric matrix.
- The determinant of a matrix, $\mathrm{det}(M)$.
- The trace of a matrix, $\mathrm{tr}(M)$.
- The kernel (also called the null space) and image of a linear map, and how to compute them. Recall that these are also vector spaces.
- The Rank-Nullity theorem of linear algebra: https://en.wikipedia.org/wiki/Rank–nullity_theorem
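Several of the notions above can be checked numerically with NumPy, which is also a useful way to test your understanding. The sketch below, with illustrative matrices of our choosing, verifies the diagonalization of a symmetric matrix, the relations of determinant and trace to the eigenvalues, and the Rank-Nullity theorem.

```python
import numpy as np

# A symmetric matrix has real eigenvalues and an orthonormal eigenbasis
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(S)   # eigh is specialized to symmetric matrices

# Diagonalization: S = Q diag(eigvals) Q^T, with Q orthogonal
S_rebuilt = Q @ np.diag(eigvals) @ Q.T

# det(S) is the product of the eigenvalues, tr(S) their sum
det_S = np.linalg.det(S)
tr_S = np.trace(S)

# Rank-Nullity: for a linear map M: R^n -> R^m,
#   rank(M) + dim(ker M) = n
M = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # a rank-1 map from R^3 to R^2
rank = np.linalg.matrix_rank(M)
nullity = M.shape[1] - rank      # dimension of the kernel
```

Here the second row of $M$ is twice the first, so the image is one-dimensional and the kernel must be two-dimensional, in agreement with the Rank-Nullity theorem.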