CS181: Machine Learning

Finale Doshi-Velez, Harvard University

Time: Mon/Wed 9:00-10:30am

Location: Northwest Building B103

Announcements (check Piazza too)

  • Math Section on Tuesday January 29th 3-4pm, room: MD G125.
  • All the links on this front page are now up to date. You can use them to navigate to the Piazza and Canvas sites, as well as the repo with the first homework (below). If you need access to Canvas, please ask via a private Piazza post (Piazza shouldn't require special access). The schedule page is still being updated.

Course Info

Section Times
Regular section times:
  • Monday 1:30-2:30pm in MD 123, 4:30-5:30pm in Science Center 109
  • Tuesday 10:30am-11:30am in Science Center B10, 3:00pm-4:00pm in Science Center 222
  • Wednesday 4:30pm-5:30pm in Science Center 222
Office Hours
  • Finale's OH: Wednesday 3-4:30pm in Science Center 316.04
  • Tue 8-10pm: Quincy Dining Hall
  • Wed 6-8pm: MD 119
  • Wed 8-10pm: Eliot Dining Hall
  • Thu 7-9pm: Mather Dining Hall
  • Thu 8-10pm: Currier Dining Hall (Quad)
  • Thu 9-11pm: Leverett Dining Hall
  • Fri 10-Noon: MD 1st floor lobby (one floor up from ground)
Syllabus and Collaboration Policy
  • (UPDATED 1/29/2019!) See the course syllabus
Links
References
Grading
  • 5 Theory Homeworks (30%)
  • 4 Practicals (30%)
  • 2 Midterms (40%)
Other Courses
  • When contacting staff, Piazza and in-person communication are preferred. Please use email sparingly.
  • Instructor: Finale Doshi-Velez
  • Teaching Fellows
  • ethiCS:

    Date | Topic | Subtopic | Section | Demos | Readings | Assignment
    0. Math Review
    Jan. 28 (M) Overview Intro T1 Regression
    Jan. 30 (W) Regression Linear Regression 1 Regression Bishop § 3.1, Sklearn § 3.1, 18.10
    Feb. 1 (F)
    1. Linear Reg (sol)
    Feb. 4 (M) Prob. Reg. (Finale|slides) Gaussian Basis Regression Bishop § 2.3, 3.1 P1 Regression
    Feb. 6 (W) Model Selection (Finale|slides)
    Feb. 8 (F) T1 Regression (submit) due
    2. Model Selection/Bayes (sol) | Cross Validation Demo Sklearn
    Feb. 11 (M) Bayesian LR / Model Selection (Finale|slides) Bishop § 3.3
    Feb. 13 (W) Classification Linear Classification (Finale|Slides) Perceptron Bishop § 4.1, Sklearn § 15.9
    Feb. 15 (F) P1 Regression due
    3. Bayes LR, G. Descent, Lin. Class. (Sol) | G. Descent, One-hot Demo
    Feb. 18 (M) Presidents' Day Probabilistic Classification T2 Classification
    Feb. 20 (W) Prob. Classification (Finale|Slides) Probabilistic Classification Bishop § 4.2, 4.3, Sklearn § 18.1, 29.24
    Feb. 22 (F)
    4. Prob. Class / NN (sol)
    Feb. 25 (M) Neural Net 1 (Finale|Slides) Neural Networks 1 TF Playground Bishop § 5.1-5.2 P2 Classification released
    Feb. 27 (W) Neural Net 2 (Finale|Slides) ConvNet JS Bishop § 5.3
    Mar. 1 (F) T2 Classification (submit) due
    Midterm 1 Review (topics|problems|sol)
    Mar. 4 (M) Margin Max-Margin (Finale|Slides) Bishop § 7.1
    Mar. 6 (W) SVMs (Finale|Slides) Bishop § 7.1
    Mar. 8 (F) P2 Classification due
    5. Margin-based / SVMs (sol)
    Mar. 11 (M) Ethics in ML (Slides)
    Mar. 13 (W) Midterm 1
    Mar. 15 (F)
    Mar. 18 (M) Spring Break No class T3 SVMs released (but only needs 1 wk)
    Mar. 20 (W) No class
    Mar. 22 (F)
    7. K-means / HAC (sol)
    Mar. 25 (M) Unsupervised Learning Clustering (Finale|Slides) Bishop § 9.1
    Mar. 27 (W) Mixture/EM (Finale|Slides) Bishop § 9.2,9.3
    Mar. 29 (F) T3 SVMs due
    8. Mixture/EM (sol)
    Apr. 1 (M) PCA T4 Clustering, P3 Semi-supervised released
    Apr. 3 (W) PGMs Topic Models Topic Modeling Bishop § 9.3.3; Introduction to Probabilistic Topic Models (optional)
    Apr. 5 (F)
    9. Dim Reduction / Bayes Nets (sol)
    Apr. 8 (M) Graphical Models Bishop § 8.1, 8.2 T4 (submit) due
    Apr. 10 (W) Linear graphical models T5 Prob Modeling / Bayes Nets released
    Apr. 12 (F) P3 due
    10. Inference, HMMs, Kalman filters (sol)
    Apr. 15 (M) Inference Inference for Bayes Nets Bishop § 8.4 (stop at end of 8.4.1, don't read about undirected and factor graphs)
    Apr. 17 (W) MDPs, Value/Policy Iteration Sutton and Barto (Use Ch 1-4 as a reference)
    Apr. 19 (F) T5 Due
    11. MDPs, RL (sol)
    Apr. 22 (M) Reinforcement Learning RL Sutton and Barto (Use Ch 1-6 as a reference) P4 RL released
    Apr. 24 (W) Deep RL
    Apr. 26 (F)
    Midterm 2 Review
    Apr. 29 (M) Interpretability or Learning Theory
    May 1 (W) Midterm 2
    May 3 (F)
    May 6 (M) P4 Due
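
    The first weeks above pair linear regression with model selection via cross-validation (see the "Cross Validation Demo Sklearn" entry). As a rough taste of what those demos cover, here is a minimal sketch of choosing a polynomial degree by k-fold cross-validation; the synthetic data and degree range are illustrative, not the course's actual demo code.

    ```python
    # Illustrative sketch: least-squares polynomial regression plus k-fold
    # cross-validation for model (degree) selection. Synthetic data only.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 60)
    y = np.sin(3 * x) + 0.1 * rng.standard_normal(60)

    def fit_poly(x, y, d):
        # Least-squares fit of a degree-d polynomial (Vandermonde basis).
        X = np.vander(x, d + 1)
        w, *_ = np.linalg.lstsq(X, y, rcond=None)
        return w

    def cv_mse(x, y, d, k=5):
        # Average held-out mean squared error over k folds.
        idx = np.arange(len(x))
        errs = []
        for fold in np.array_split(idx, k):
            train = np.setdiff1d(idx, fold)
            w = fit_poly(x[train], y[train], d)
            pred = np.vander(x[fold], d + 1) @ w
            errs.append(np.mean((pred - y[fold]) ** 2))
        return float(np.mean(errs))

    scores = {d: cv_mse(x, y, d) for d in range(1, 8)}
    best = min(scores, key=scores.get)
    print("CV MSE by degree:", scores)
    print("selected degree:", best)
    ```

    A straight line (degree 1) underfits the sinusoid, while very high degrees overfit the noise, so the cross-validated error picks an intermediate degree.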


    Camelot.ai is an online platform for machine learning problem-solving and skills development built by three former CS181 students. We focus on helping you learn real, practical skills and using our data to match your talent with top quantitative/technical companies. CS181 uses Camelot.ai for the practicals.

    Machine learning and data science are only getting bigger, and it's not enough to just be able to code up a simple BFS algorithm in your next tech interview. Our problems have varying difficulties and are designed to mimic real, fun projects, with multiple steps building on one another.

    Enter the Arena to register for one of our weekly tournaments — done right in the browser — with prizes for everyone who finishes (like American Apparel T-shirts!), and additional rewards for top winners. Students who consistently do well will be referred to our sponsoring firms. Itching to get started? Check out the Vault to try your hand at some past tournament problems!

    How are you different from Kaggle?
    First, our problems are designed to be more engaging and interactive than simply generating predictions on a giant data set; we tie problems to specific techniques or technologies. Second, Kaggle is not very accessible: very few people outside the top 1000 data scientists in the world (those with lots of time and resources) have a realistic shot at winning a Kaggle competition. Our aim is for users to solve fun, challenging problems and learn at the same time.

    Can we give feedback on the product?
    Yes definitely! Shoot a message to hello.camelot@gmail.com or talk to us directly through the chat box in the portal.