CS 4900/5900: Machine Learning
Time and Location: Tue, Thu 10:30 – 11:50am, ARC 315
Instructor: Razvan Bunescu
Office: Stocker 341
Office Hours: Tue, Thu 12:00 – 12:30pm, or by email appointment
Email: bunescu@ohio.edu
Grader: Hui Shen (hs138609@ohio.edu)
Office: Stocker 371A
Office Hours: Mon, Wed 1:30 – 2:30pm, or by email appointment
There is no required textbook for this class. Slides and supplementary materials will be made available on the course website. The following textbooks are recommended for supplementary reading:
- Reinforcement Learning: An Introduction by Richard S. Sutton and Andrew G. Barto. MIT Press, 2018.
- Machine Learning: The Art and Science of Algorithms that Make Sense of Data by Peter Flach. Cambridge University Press, 2012.
- A Course in Machine Learning by Hal Daume III.
- Machine Learning by Tom Mitchell. McGraw Hill, 1997.
- Pattern Recognition and Machine Learning by Christopher Bishop. Springer, 2007.
- Pattern Classification by Richard O. Duda, Peter E. Hart, and David G. Stork. Wiley-Interscience, 2001.
- The Elements of Statistical Learning: Data Mining, Inference, and Prediction by T. Hastie, R. Tibshirani, and J. H. Friedman. Springer, 2009.
This course will give an overview of the main concepts, techniques, and algorithms underlying the theory and practice of machine learning. The course will cover the fundamental topics of classification, regression and clustering, and a number of corresponding learning models such as perceptrons, logistic regression, linear regression, Naive Bayes, nearest neighbors, and Support Vector Machines. The description of the formal properties of the algorithms will be supplemented with motivating applications in a wide range of areas including natural language processing, computer vision, bioinformatics, and music analysis.
Students are expected to be comfortable with programming and familiar with basic concepts in linear algebra and statistics. Relevant background material in linear algebra, probability theory, and information theory will be made available during the course.
- Syllabus & Introduction
- Linear Regression and L2 Regularization
- Linear algebra and optimization in Python
- Gradient Descent Algorithms
- Logistic Regression, Maximum Likelihood, and Maximum Entropy
- Fisher Linear Discriminant
- Perceptrons and Kernels
- Support Vector Machines
- An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, Nello Cristianini and John Shawe-Taylor [Available online through library.ohiou.edu]
- Support Vector Machines [Trends and Controversies], Marti Hearst, Susan Dumais, Edgar Osuna, John Platt, Bernhard Schölkopf, IEEE Intelligent Systems, 13(4), 1998
- A Tutorial on Support Vector Machines for Pattern Recognition, Christopher J. C. Burges, Data Mining and Knowledge Discovery, 1998
- Nearest Neighbor Methods
- Naive Bayes
- Reinforcement Learning
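As a small taste of the material, the nearest-neighbor topic above can be illustrated with a minimal k-NN classifier in plain Python (an illustrative sketch only, not part of the course materials; the dataset and function names are made up for this example):

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest
    training points under Euclidean distance."""
    # Sort training examples by distance to the query point and keep k.
    neighbors = sorted(train, key=lambda xy: math.dist(xy[0], query))[:k]
    # Majority vote over the labels of the k closest neighbors.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two well-separated clusters with labels "A" and "B".
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.2, 0.1), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B"), ((1.1, 0.9), "B")]

print(knn_predict(train, (0.15, 0.1)))  # a query point near cluster A
```

The same voting idea extends directly to weighted variants (e.g., weighting each neighbor's vote by inverse distance), which the nearest-neighbor lectures typically cover.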
The spam classification portion of assignment 6 is partly based on a homework assignment developed by Andrew Ng.
Machine learning software:
- scikit-learn Machine Learning in Python
- Weka Data Mining Software in Java
- SVMlight Implementation of SVMs in C
- LIBSVM Implementation of SVMs in C++ and Java
- MALLET Java implementations of logistic regression, HMMs, linear chain CRFs, and other ML models.
- LibSVM applet demonstrating SVMs.
- k-Nearest Neighbor short animated video, by Antal van den Bosch