COSC-575: Machine Learning

Spring 2012

Class Time: MW 5:00–6:15 PM
Classroom: WGR 206
Instructor: Mark Maloof
Office: 325 St. Mary's
Mailbox: St. Mary's
Office Hours: In-person (325 STM): TR 11:00 AM–12:00 PM; online: M 10:30–11:30 AM and W 3:00–4:00 PM; or by appointment. Send me an email to get the Zoom link for online office hours.

Course Description

This graduate lecture course surveys the major research areas of machine learning. Through traditional lectures, programming projects, paper presentations, and research projects, students learn (1) to understand the foundations of machine learning, (2) to comprehend, analyze, and critique papers from the primary literature, (3) to replicate studies described in the primary literature, and (4) to design, conduct, and present their own studies. The course compares and contrasts machine learning with related endeavors, such as statistical learning, pattern classification, data mining, and information retrieval. Topics include Bayesian decision theory, instance-based approaches, Bayesian methods, decision trees, rule induction, density estimation, linear classifiers, neural networks, support vector machines, ensemble methods, learning theory, evaluation, and applications. Time permitting, additional topics include genetic algorithms, unsupervised learning, semi-supervised learning, outlier detection, sequence learning, and reinforcement learning.

Primary Text:

Topics

  1. Introduction: Definitions, Areas, History, Paradigms
  2. Bayesian Decision Theory
  3. Instance-based learning: k-NN, kd-trees
  4. Bayesian learning: Bayes' Theorem, naive Bayes
  5. Evaluation: Train/Test Methodologies
  6. Density estimation: Parametric, Non-parametric, Bayesian
  7. Decision Trees: ID3, C4.5
  8. Decision Trees: Stumps, VFDT; Midterm Exam
  9. Rule Learning: Ripper, OneR
  10. Ensemble Methods: Bagging, Boosting
  11. Ensemble Methods: Random Forests, Voting, Weighting
  12. Neural Networks: Linear classifiers, Perceptron
  13. Neural Networks: Multilayer networks, Back-propagation
  14. Support Vector Machines: Perceptron, Dual representation
  15. Support Vector Machines: Margins, Kernels, Training, SMO
  16. Outlier Detection: Parametric, Non-parametric, Density-based

Course Policies

Assignments and Grading



Copyright © 2019 Mark Maloof. All Rights Reserved. This material may not be published, broadcast, rewritten, or redistributed.