Machine Learning with Discriminative Methods
Spring 2015
T/Th 12:30-1:45 in Sitterson 014
Syllabus
(h/t Maxim Raginsky for the web design and excellent notes.)
- Thu. Jan. 8 [ Slides ]: Introduction
  - Hastie, Tibshirani & Friedman, Elements of Statistical Learning (the course textbook), Chapters 1 and 2. [BOOK]
  - Poggio & Smale, "The mathematics of learning: dealing with data", Notices of the American Mathematical Society, vol. 50, no. 5, pp. 537-544, 2003. [PDF]
  - Maxim Raginsky's introduction notes for statistical machine learning. [PDF]
  - Homework: Homework 1, due Thursday, Jan. 15.
- Thu. Jan. 15 [ Slides ]: Of Machine Learning and Loss
  - Kearns & Vazirani, Introduction to Computational Learning Theory, pages 1-16 (see readings 1).
- Tue. Jan. 20 [ Slides ]: PAC Learning and Tail Bounds Intro
  - Wikipedia page for the Chernoff bound. [Wikipedia]
  - Read at least the first part of Raginsky's introductory notes on tail bounds (pages 1-5); the basic bound is stated below. [PDF]
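  - For reference, the workhorse result in those pages is Hoeffding's inequality (a Chernoff-type tail bound). A standard statement, for i.i.d. random variables $X_1, \dots, X_n$ taking values in $[0,1]$ with mean $\mu$:

        \Pr\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \mu \right| \ge t \right) \le 2 e^{-2 n t^2} \qquad \text{for all } t > 0.

    Setting the right-hand side to $\delta$ and solving for $t$ gives the usual sample-complexity reading: with probability at least $1 - \delta$, the empirical mean is within $\sqrt{\log(2/\delta)/(2n)}$ of $\mu$.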
- Thu. Jan. 22 [ On Board ]: Review of learning and tail bounds
  - See the readings from the last lecture.
- Tue. Jan. 27 [ On Board ]: Empirical Risk Minimization
  - Raginsky's introductory notes on agnostic (model-free) learning. [PDF]
  - Homework: Write up ERM in one page with equations (the core objective is stated below). Bring a first draft to class on Thursday, Jan. 29.
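  - For the writeup, the central object is the empirical risk minimizer. In standard notation, given a hypothesis class $\mathcal{F}$, a loss $\ell$, and a training sample $(x_1, y_1), \dots, (x_n, y_n)$:

        \hat{f}_n = \arg\min_{f \in \mathcal{F}} \; \frac{1}{n} \sum_{i=1}^{n} \ell(f(x_i), y_i),

    which serves as a proxy for minimizing the true risk $R(f) = \mathbb{E}[\ell(f(X), Y)]$, since the distribution of $(X, Y)$ is unknown.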
- Thu. Jan. 29 [ Slides ]: Doit 1
  - Raginsky's introductory notes on agnostic (model-free) learning. [PDF]
  - Homework: Edit your draft of the ERM writeup. Take some (of your) data and plot training/validation/test error as a function of training set size for 1- through 10-NN and linear regression (a starting-point sketch appears below). Describe the results and include them with your ERM writeup. Due Tue. Feb. 3.
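  - A minimal sketch of one way to produce those plots, assuming scikit-learn and matplotlib are available; the synthetic regression dataset below is just a placeholder for your own data (the classification variant is analogous):

        # Sketch: train/validation/test error vs. training set size for
        # k-NN (k = 1..10) and linear regression. Synthetic data below is
        # a placeholder for your own dataset.
        import numpy as np
        import matplotlib.pyplot as plt
        from sklearn.linear_model import LinearRegression
        from sklearn.neighbors import KNeighborsRegressor
        from sklearn.metrics import mean_squared_error

        rng = np.random.RandomState(0)
        X = rng.uniform(-3, 3, size=(600, 1))
        y = np.sin(X).ravel() + 0.3 * rng.randn(600)

        # Fixed validation and test sets; training sets of increasing size.
        X_val, y_val = X[400:500], y[400:500]
        X_test, y_test = X[500:], y[500:]
        sizes = [25, 50, 100, 200, 400]

        models = {'linear regression': LinearRegression()}
        for k in range(1, 11):
            models['%d-NN' % k] = KNeighborsRegressor(n_neighbors=k)

        for name, model in sorted(models.items()):
            val_err = []
            for n in sizes:
                model.fit(X[:n], y[:n])
                # The same pattern gives the training and test error curves.
                val_err.append(mean_squared_error(y_val, model.predict(X_val)))
            plt.plot(sizes, val_err, label=name)

        plt.xlabel('training set size')
        plt.ylabel('validation MSE')
        plt.legend(fontsize='small', ncol=2)
        plt.show()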
- Tue. Feb. 3 [ Slides ]: Linear Models 1
  - Hastie et al. textbook: skim Chapter 3 and look over the exercises for Chapters 2 and 3.
- Thu. Feb. 5 [ Slides ]: Linear Models 2
  - Hastie et al. textbook: read Chapters 3 and 4.
- Tue. Feb. 10 [ On Board ]: Reading review
  - Think more about feature selection, and possibly implement something to test those thoughts (one starting point is sketched below).
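  - One concrete way to "implement something" here, as a sketch: fit an L1-penalized (lasso) linear model and see which coefficients survive. This assumes scikit-learn; the synthetic data and the alpha value are placeholders:

        # Sketch: L1 (lasso) regularization as a feature selector.
        # Irrelevant features should get coefficients driven to zero.
        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.RandomState(0)
        n, d = 200, 20
        X = rng.randn(n, d)
        # Only the first 3 features actually matter.
        true_w = np.zeros(d)
        true_w[:3] = [2.0, -1.5, 1.0]
        y = X @ true_w + 0.1 * rng.randn(n)

        model = Lasso(alpha=0.1).fit(X, y)
        selected = np.flatnonzero(np.abs(model.coef_) > 1e-6)
        print('selected features:', selected)  # expect roughly [0, 1, 2]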
- Thu. Feb. 12 [ On Board ]: Feature selection review
- Tue. Feb. 17 [ In Snow ]: It's full of snow (class canceled).
- Thu. Feb. 19 [ On Board ]: Perceptron and SVMs
  - Re-read Section 4.5 in the text.
  - Do Exercise 4.6: prove that the perceptron converges in a finite number of steps (the update rule being analyzed is sketched below).
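  - For intuition while working on the proof, a minimal perceptron sketch with labels in {-1, +1} and the bias folded into the weights via an appended constant feature. The convergence argument bounds how many updates this loop can make on linearly separable data:

        # Sketch: the perceptron mistake-driven update rule.
        # On linearly separable data with margin gamma and radius R,
        # the classic bound says at most (R / gamma)^2 updates occur.
        import numpy as np

        def perceptron(X, y, max_epochs=100):
            """X: (n, d) array; y: array of +/-1 labels."""
            Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # fold in bias
            w = np.zeros(Xb.shape[1])
            for _ in range(max_epochs):
                mistakes = 0
                for xi, yi in zip(Xb, y):
                    if yi * np.dot(w, xi) <= 0:  # misclassified (or on boundary)
                        w += yi * xi             # mistake-driven update
                        mistakes += 1
                if mistakes == 0:                # converged: data separated
                    break
            return w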
- Tue. Feb. 24 [ On Board ]: SVM intro, projects
- Thu. Feb. 26 [ In Snow ]: It's full of snow (class canceled).
- Tue. Mar. 3 [ Slides ]: Non-linear Classifiers, midterm announcement
  - Re-read Chapter 5. Pay attention to the RKHS subsection for optional extra fun.
- Tue. Mar. 17: Applying machine learning, midterm handed out
  - Take-home midterm due before class Thursday; submit on Sakai as a single PDF or MS Word file.
- Thu. Mar. 19: In-class midterm
  - Prepare your project description and hand it in on Sakai before class on Tuesday. Schedule a time to meet with Alex next week to discuss your project.
- Tue. Mar. 24 [ On Board ]: Optimization 1
- Thu. Mar. 26 [ On Board ]: Optimization 2
- Tue. Mar. 31 [ On Board ]: Optimization 3
  - Read about structured prediction: Structured Learning... Nowozin & Lampert, 2011, Chapter 6. Be prepared to answer questions on Sections 6.1 and 6.2.
  - Other questions to consider: What is the difference between the perceptron and the structured perceptron? Is structured prediction convex? Is training a model for structured prediction convex?
  - Go through this example of using CVX for a linear SVM (a Python analogue is sketched below).
  - Optional reference for convex optimization: Convex Optimization by Boyd & Vandenberghe, especially Chapter 9, Section 2.
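  - The linked example uses CVX (a MATLAB toolbox). A rough Python analogue with CVXPY, for the soft-margin linear SVM in its standard hinge-loss formulation (the value of C and the toy data are placeholders, not taken from the course example):

        # Sketch: soft-margin linear SVM as a convex program, via CVXPY.
        # minimize (1/2)||w||^2 + C * sum of hinge losses
        import cvxpy as cp
        import numpy as np

        rng = np.random.RandomState(0)
        n, d = 100, 2
        X = rng.randn(n, d)
        y = np.sign(X[:, 0] + X[:, 1] + 0.1 * rng.randn(n))

        C = 1.0
        w = cp.Variable(d)
        b = cp.Variable()
        hinge = cp.pos(1 - cp.multiply(y, X @ w + b))  # elementwise hinge loss
        objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(hinge))
        prob = cp.Problem(objective)
        prob.solve()
        print('w =', w.value, 'b =', b.value)

    The objective is convex (a quadratic plus a sum of hinge terms), which is what lets CVX/CVXPY handle it directly.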
- Thu. Apr. 2 [ Slides ]: Structured Prediction
- Tue. Apr. 7 [ Slides ]: Deep learning 1
- Thu. Apr. 9 [ On Board ]: Deep learning 2
  - Undergrad tutorial for deep networks; it includes notes about monitoring convergence (a minimal version of that pattern is sketched below).
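  - A minimal sketch of the monitoring pattern those notes describe: track training and validation loss every epoch and stop when the validation loss stalls. Plain logistic regression on synthetic data stands in for a deep network here; the hyperparameters are placeholders:

        # Sketch: per-epoch loss monitoring with early stopping when the
        # validation loss stops improving. Logistic regression stands in
        # for the network; data and hyperparameters are placeholders.
        import numpy as np

        rng = np.random.RandomState(0)
        X = rng.randn(500, 5)
        y = (X[:, 0] - X[:, 1] + 0.5 * rng.randn(500) > 0).astype(float)
        X_tr, y_tr = X[:400], y[:400]
        X_va, y_va = X[400:], y[400:]

        def nll(w, X, y):
            """Average logistic (cross-entropy) loss."""
            p = np.clip(1.0 / (1.0 + np.exp(-X @ w)), 1e-12, 1 - 1e-12)
            return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

        w = np.zeros(X.shape[1])
        history = {'train': [], 'val': []}
        best_val, stale, patience, lr = np.inf, 0, 10, 0.5
        for epoch in range(500):
            p = 1.0 / (1.0 + np.exp(-X_tr @ w))
            w -= lr * X_tr.T @ (p - y_tr) / len(y_tr)  # full-batch gradient step
            history['train'].append(nll(w, X_tr, y_tr))
            history['val'].append(nll(w, X_va, y_va))
            if history['val'][-1] < best_val:
                best_val, stale = history['val'][-1], 0
            else:
                stale += 1            # validation loss did not improve
            if stale >= patience:     # treat a long stall as convergence
                print('stopped at epoch', epoch, 'val loss %.4f' % best_val)
                break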
- Tue. Apr. 14 [ Slides ]: Deep learning 3
- Thu. Apr. 16 [ Slides ]: Presentations 1
- Tue. Apr. 21 [ Slides ]: Presentations 2