EE787: Machine Learning

Announcements

  • No class on 12/10 and 12/12. Make-up sessions will be held on 12/20, 12/24, 12/26, and 12/27, 10:30am-12:00pm, in the Electronics and Information Bldg., Rm. 539.

  • No class on 10/29.

  • Homework #2 was posted.

  • Homework #1 was posted.

  • Welcome to EE787: Machine learning.

Course Info.

Course description

  • Fundamental concepts and theories in machine learning: supervised and unsupervised learning, regression and classification, loss function selection and its effect on learning, regularization and robustness to outliers, and numerical experiments on data from a wide variety of engineering and other disciplines.

Lectures

  • Tue/Thu 13:30-14:45 (Multimedia Bldg., Rm. 210)

Office hours

  • Tue/Thu 15:00-16:00 (Rm. 516), or by appointment if you cannot make these times.

Prerequisites

  • Previous exposure to linear algebra, probability, and programming.

  • A working knowledge of optimization will be a plus.

Reference textbooks

Grading policy

  • No exams.

  • Students will be evaluated based on their homework assignments.

Lecture notes

The course material is reproduced from EE104: Introduction to Machine Learning by Sanjay Lall and Stephen Boyd at Stanford University, with their kind permission.

  1. Course overview

  2. Supervised learning via empirical risk minimization

  3. Least squares linear regression

  4. Validation

  5. Features

  6. Regularization

  7. House prices example

  8. Non-quadratic losses

  9. Non-quadratic regularizers

  10. Optimization

  11. Prox-gradient method

  12. Boolean classification

  13. Multi-class classification

  14. Neural networks

  15. Unsupervised learning

Assignments

Several homework sets will be assigned occasionally. You are encouraged to work in groups; however, everyone should turn in their own work. Some of these assignments are from Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares.

  1. Homework #1 (due 10/22)

  2. Homework #2 (due 11/7)

Julia

Julia language

  • We will be using Julia, which excels at high-performance technical computing, for the homework assignments.

  • You are not expected to have a strong background in programming (in Julia or otherwise), because the programs you will write use only a tiny subset of Julia's (many and powerful) features; a small example is sketched below.
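To give a rough feel for the level of programming involved, here is a minimal, hypothetical sketch (not one of the course files) that fits a straight line to synthetic data by least squares in Julia:

    # Minimal sketch: least squares straight-line fit to synthetic data.
    using Random
    Random.seed!(0)

    n = 50
    x = collect(range(0, 1, length=n))         # inputs in [0, 1]
    y = 2.0 .* x .+ 1.0 .+ 0.1 .* randn(n)     # noisy samples of y ≈ 2x + 1

    A = [ones(n) x]                            # design matrix: intercept column and x
    theta = A \ y                              # least squares solution
    println("intercept ≈ ", theta[1], ", slope ≈ ", theta[2])

Nothing here goes beyond basic arrays and Julia's backslash solve.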

Reference webpages

Files

These are data files and code, in .ipynb notebook form, that we use for lectures and homework assignments.

  1. Straight-line fit in Julia (.ipynb)

  2. Straight-line fit in Python (.ipynb)

  3. Diabetes example (.ipynb)

  4. Splitting dataset (.ipynb)

  5. Polynomial fit (.ipynb)

  6. Reading json files from webpages (.ipynb)

  7. House price example (.ipynb, train.csv, data_description.txt)

  8. Tomography (.ipynb, line_pixel_length.jl, tomodata_fullysampled.json, tomodata_undersampled.json)

  9. Piecewise-linear fit (.ipynb)

  10. Convex.jl tutorial (.ipynb)

  11. Binary classifiers (.ipynb)

  12. Iris classification (iris.csv, .ipynb)