COSC-575: Machine Learning

Project 5
Spring 2017

Due: Mon, May 1 @ 11:59 P.M.
10 points

  1. Modify your implementation of DT and related classes from p3 so DT learns from weighted examples.

  2. Implement Flach's Algorithm 11.1 as the executable Bagging. Bag DT. Don't post-prune. By default, Bagging should produce 10 classifiers. Predict by averaging.

  3. Implement Flach's Algorithm 11.3 as the executable Boosting. Boost DT. Don't post-prune. By default, Boosting should produce 10 classifiers.
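For item 1, the main change to DT is in the split criterion: class counts become sums of example weights. A minimal sketch of weighted entropy, in Python for illustration only (the function name is hypothetical; your DT classes from p3 will have their own structure):

```python
import math
from collections import defaultdict

def weighted_entropy(labels, weights):
    """Entropy where each example contributes its weight instead of a count of 1."""
    totals = defaultdict(float)
    for y, w in zip(labels, weights):
        totals[y] += w
    total = sum(totals.values())
    ent = 0.0
    for w in totals.values():
        p = w / total
        if p > 0:
            ent -= p * math.log2(p)
    return ent
```

With unit weights this reduces to ordinary entropy, so an unweighted training set behaves exactly as in p3.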
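For item 2, the core of bootstrap aggregating is: draw T bootstrap samples (each of size n, sampled with replacement from the n training examples), train one tree per sample, and average the members' predictions. A sketch of that loop, with hypothetical names (your Bagging executable will wrap DT rather than the generic learner shown here):

```python
import random

def bagging_fit(examples, learn, t=10, rng=None):
    """Train t models, each on a bootstrap sample of the training set."""
    rng = rng or random.Random(0)
    n = len(examples)
    models = []
    for _ in range(t):
        # sample n examples WITH replacement
        sample = [examples[rng.randrange(n)] for _ in range(n)]
        models.append(learn(sample))
    return models

def bagging_predict(models, x):
    """Average the numeric predictions of the ensemble members."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)
```

The default t=10 matches the assignment's default of 10 classifiers; averaging class-probability estimates rather than taking a majority vote matches "predict by averaging."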
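For item 3, the boosting loop is the standard AdaBoost pattern: train on the current weights, compute the weighted error eps, set alpha = (1/2) ln((1 - eps)/eps), and reweight so that misclassified examples gain weight. A sketch under those assumptions, with hypothetical names and labels coded as -1/+1 (your Boosting executable will call the weighted DT from item 1):

```python
import math

def boost(examples, labels, learn, t=10):
    """AdaBoost-style loop: reweight examples, combine weak models by alpha."""
    n = len(examples)
    w = [1.0 / n] * n
    ensemble = []  # (alpha, model) pairs
    for _ in range(t):
        model = learn(examples, labels, w)
        eps = sum(wi for wi, x, y in zip(w, examples, labels) if model(x) != y)
        if eps == 0:
            ensemble.append((1.0, model))  # perfect model: keep it and stop
            break
        if eps >= 0.5:
            break  # no better than chance: stop early
        alpha = 0.5 * math.log((1 - eps) / eps)
        ensemble.append((alpha, model))
        # increase weight on mistakes, decrease it on correct predictions
        w = [wi * math.exp(alpha if model(x) != y else -alpha)
             for wi, x, y in zip(w, examples, labels)]
        z = sum(w)
        w = [wi / z for wi in w]  # renormalize to a distribution
    return ensemble

def boost_predict(ensemble, x):
    """Alpha-weighted vote over the members' -1/+1 predictions."""
    s = sum(alpha * model(x) for alpha, model in ensemble)
    return 1 if s >= 0 else -1
```

Check the sketch against Flach's Algorithm 11.3 before relying on it; the stopping conditions and the exact form of the update should follow the book.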

The implementations must follow sound principles of object-oriented design. Implement each learner as a single executable. No windows. No menus. No prompts. Just do it.

The command-line logic should be the same as in the previous projects. If the user runs a learner and specifies only a training set, then the program should evaluate using 10-fold cross-validation and output the results; the user can use the -x switch to change the default number of folds. If the user provides the -p switch and a proportion, then the program conducts an evaluation using the hold-out method. Otherwise, if the user specifies both a training and a testing set, then the program should build a model from the training set, evaluate it on the testing set, and output the results.

Instructions for Submission

In a file named HONOR, please include the statement:
In accordance with the class policies and Georgetown's Honor Code,
I certify that, with the exceptions of the class resources and those
items noted below, I have neither given nor received any assistance
on this project.
Include this file in your zip file submit.zip.

Submit p5 exactly like you submitted p4.

Plan B

If Autolab is down, upload your zip file to Blackboard.

Copyright © 2019 Mark Maloof. All Rights Reserved. This material may not be published, broadcast, rewritten, or redistributed.