🎓 Intended learning outcomes
At the end of this lesson, students are expected to:
- Motivate classification from a geometric perspective
- Derive the perceptron from first principles and apply it
- Understand and motivate margin classifiers
- Formulate a constrained optimization problem for margin classifiers
- Define equality constraints, inequality constraints, and the feasible region
- Describe the need for slack variables and their implementation
- Derive a soft-margin linear SVM formulation from two perspectives
- Motivate and identify regularization terms in the average loss
- Understand subgradients and implement subgradient methods, including the Pegasos algorithm (see the sketch after this list)
- Define and explain the square loss, logistic loss, hinge loss, and zero-one loss (summarized below)
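
As a preview of the last outcome, here are standard definitions of the four losses, written for a label $y \in \{-1, +1\}$ and a score $f(x)$ with margin $m = y\,f(x)$; the lesson's own notation may differ:

$$
\begin{aligned}
\ell_{0/1}(m) &= \mathbb{1}[m \le 0] && \text{(zero-one loss)} \\
\ell_{\text{hinge}}(m) &= \max(0,\, 1 - m) && \text{(hinge loss)} \\
\ell_{\text{log}}(m) &= \log\!\left(1 + e^{-m}\right) && \text{(logistic loss)} \\
\ell_{\text{sq}}\big(y, f(x)\big) &= \big(y - f(x)\big)^2 && \text{(square loss)}
\end{aligned}
$$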
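And here is a minimal NumPy sketch of the Pegasos stochastic subgradient update for the $\ell_2$-regularized hinge loss, assuming labels in $\{-1, +1\}$; the function name `pegasos` and the parameter defaults are illustrative, not taken from the lesson:

```python
import numpy as np

def pegasos(X, y, lam=0.1, n_iters=1000, seed=0):
    """Pegasos: stochastic subgradient descent on the regularized hinge loss.

    Minimizes (lam/2) * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>).
    X: (n, d) feature matrix; y: labels in {-1, +1}. Names are illustrative.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(n)            # sample one training example
        eta = 1.0 / (lam * t)          # decaying step size 1/(lam * t)
        margin = y[i] * (X[i] @ w)
        if margin < 1:                 # hinge loss active: subgradient has a data term
            w = (1 - eta * lam) * w + eta * y[i] * X[i]
        else:                          # hinge loss zero: only the regularizer contributes
            w = (1 - eta * lam) * w
    return w

# Tiny usage example on linearly separable toy data
if __name__ == "__main__":
    X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
    y = np.array([1, 1, -1, -1])
    w = pegasos(X, y, lam=0.01, n_iters=2000)
    print("learned w:", w, "predictions:", np.sign(X @ w))
```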
📐 A geometric perspective ★☆☆