Prerequisite: background in algorithms, linear algebra, calculus, probability, and statistics (CS/CNS/EE/NB 154 or CS/CNS/EE 156a or instructor's permission)
This course will cover popular methods in machine learning and data mining, with an emphasis on developing a working understanding of how to apply these methods in practice. It will also cover the core foundational concepts that underpin and motivate modern machine learning and data mining approaches, along with some recent research developments.
Assignments are due at 9pm on Friday via Moodle. Students may use up to 48 late hours over the term. Late hours must be used in whole-hour units; specify the number of hours used when turning in the assignment. Late hours cannot be used on the final exam. Note that there is no TA support over the weekends.
Homeworks: (adapted from CS 1) It is common for students to discuss ideas for the homework assignments. When you help another student with their homework, you are acting as an unofficial teaching assistant, and thus must behave like one. Do not simply give the answer or dictate code to others. If you give them your solution or code, you are violating the Honor Code. To clarify how you can help and/or discuss ideas with other students (especially when it comes to coding and proofs), we want you to obey the "50 foot rule": while helping another student, your own solution must remain at least 50 feet away. If you cannot help a student without consulting your solution, don't help them, and refer them instead to a teaching assistant.
Miniprojects: Students are allowed to collaborate fully within their miniproject teams, but no collaboration is allowed between teams.
Final Exam: No collaboration of any kind is allowed.
Yisong Yue yyue@caltech.edu
Ellen Feldman  efeldman@caltech.edu 
Nishanth Bhaskara  nbhaskar@caltech.edu 
Rohan Choudhury  rchoudhury@caltech.edu 
Julia Deacon  jcdeacon@caltech.edu 
Katherine Guo  kguo@caltech.edu 
Michael Hashe  mhashe@caltech.edu 
Joey Hong  jhhong@caltech.edu 
Andrew Kang  akang@caltech.edu 
Catherine Ma  cmma@caltech.edu 
Ruoqi Shen  rshen@caltech.edu 
Richard Zhu  lzhu@caltech.edu 
Vincent Zhuang  vzhuang@caltech.edu 
Note: schedule is subject to change.
Further Reading:  
1/04/2017  Lecture:  Administrivia, Basics, Bias/Variance, Overfitting  [slides]  
1/04/2017  Recitation:  Introduction to Python for Machine Learning  [slides][iPython]  
1/09/2017  Lecture:  Perceptron, Gradient Descent  [slides]  Daume Chapter 3, Mistake Bounds for Perceptron [link], AdaGrad [link], Stochastic Gradient Descent Tricks [link], Bubeck Chapter 3
1/11/2017  Lecture:  SVMs, Logistic Regression, Neural Nets, Loss Functions, Evaluation Metrics  [slides]  Bounds on Error Expectation for SVMs [link]  
1/11/2017  Recitation:  Linear Algebra  [slides]  The Matrix Cookbook [link]  
1/16/2017  Lecture:  Regularization, Lasso  [slides]  Murphy 13.3  
1/18/2017  Lecture:  Decision Trees, Bagging, Random Forests  [slides]  Overview of Decision Trees [pdf], Overview of Bagging [pdf], Overview of Random Forests [pdf]
1/23/2017  Lecture:  Boosting, Ensemble Selection  [slides]  Schapire's Overview of Boosting [pdf]  
1/25/2017  Lecture:  Deep Learning  [slides]  Deep Learning Book [html], A Brief Overview of Deep Learning [link]
1/25/2017  Recitation:  Keras Tutorial  [slides]  [link]  
1/30/2017  Lecture:  Deep Learning Part 2  [slides]  
2/1/2017  Lecture:  Recent Applications: Edge Detection & Speech Animation  [slides]  
2/6/2017  Lecture:  Unsupervised Learning, Clustering, Dimensionality Reduction  [slides]  
2/8/2017  Lecture:  Latent Factor Models, Non-Negative Matrix Factorization  [slides]  Original Netflix Paper [link]  
2/13/2017  Lecture:  Embeddings  [slides]  Locally Linear Embedding [link], Playlist Embedding [link], word2vec [link]
2/15/2017  Lecture:  Recent Applications: Representation Learning  [slides]  [paper 1] [paper 2] [paper 3]
2/20/2017  Lecture:  Probabilistic Models, Naive Bayes  [slides]  Murphy 3.5  
2/20/2017  Recitation:  *TUESDAY* Probability & Sampling (ANB 121)  [slides]  
2/22/2017  Lecture:  Hidden Markov Models  [slides] [notes]  Murphy 17.3-17.5  
2/27/2017  Lecture:  Hidden Markov Models Part 2  
2/27/2017  Recitation:  *TUESDAY* Dynamic Programming  [slides]  
3/1/2017  Lecture:  Recent Applications: Deep Generative Models  [slides]  
3/6/2017  Lecture:  Survey of Advanced Topics  [slides]  
3/8/2017  Lecture:  Review & Q/A 