Intended learning outcomes
At the end of this lesson, students are expected to:
- Understand and be able to formally define classification and regression problems
- Motivate logistic regression from a probabilistic perspective
- Formulate binary cross entropy loss from first principles
- Understand and implement gradient descent (GD)
- Be able to apply gradient descent to solve a classification problem
- Identify the critical settings under which gradient descent succeeds or fails
- Motivate, understand, and derive second-order gradient methods
- Motivate, derive, and critically understand stochastic gradient descent (SGD)
- Explain and be able to implement momentum in iterative optimizers
🗺️ Motivation: Living in a Binary World
In the last lecture we fitted a line to two points, and then to a set of points. In each case we were attempting to predict a continuous value: temperature. Problems in which we predict continuous values are called regression problems, a subtype of supervised learning. Now let us consider another fundamental problem in machine learning: classification.
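To make the distinction precise, both problems can be stated in the same supervised-learning setup. The following is a minimal formalization introduced for illustration; the symbols $\mathcal{D}$, $\mathbf{x}_i$, $y_i$, and $d$ are notational assumptions, not fixed by the lecture. Given training pairs $\mathcal{D} = \{(\mathbf{x}_i, y_i)\}_{i=1}^{N}$ with inputs $\mathbf{x}_i \in \mathbb{R}^d$, the two problems differ only in the target space we predict into:

$$
\text{Regression:} \quad f_\theta : \mathbb{R}^d \to \mathbb{R}, \qquad y_i \in \mathbb{R}
$$

$$
\text{Binary classification:} \quad f_\theta : \mathbb{R}^d \to \{0, 1\}, \qquad y_i \in \{0, 1\}
$$

Predicting temperature is regression; predicting a yes/no label, as we do next, is binary classification.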