MachineLearning_Part1

Objective and Tasks

  1. Decision Tree (30 pts): Implement functions for impurity (Gini or entropy), splitting logic, tree-building up to a specified maximum depth, and prediction functionality.
  2. Interpretation (10 pts): Identify and discuss the top three predictors for high sales. Provide your discussion in 3–5 sentences.
  3. Pruning (20 pts): Implement either pre-pruning (e.g., minimum samples per split, maximum depth) or post-pruning (e.g., reduced-error pruning). Compare the performance of the tree before and after pruning.
  4. Random Forest (30 pts): Implement a random forest by bagging decision trees. Evaluate the model using accuracy, precision, recall, and F1-score. Discuss how the random forest performs relative to a single decision tree.
  5. Comparison (10 pts): Compare the three models (the baseline decision tree, the pruned tree, and the random forest) in terms of performance metrics and interpretability.
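For task 1, the required pieces (impurity, splitting logic, depth-limited tree building, and prediction) can be sketched as below. This is a minimal illustration using Gini impurity and a nested-dict tree representation; the function names (`gini`, `best_split`, `build_tree`, `predict`) are illustrative, not prescribed by the assignment.

```python
# Minimal decision-tree sketch: Gini impurity, exhaustive split search,
# depth-limited recursive building, and prediction. Assumes X is a list of
# numeric feature rows and y a parallel list of class labels.
from collections import Counter


def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())


def best_split(X, y):
    """Search every (feature, threshold) pair; return the one minimizing
    the weighted child impurity, or None if no valid split exists."""
    best = None  # (weighted_impurity, feature_index, threshold)
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue  # degenerate split: all samples on one side
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best


def build_tree(X, y, depth=0, max_depth=3):
    """Grow the tree recursively up to max_depth; a leaf is a plain label."""
    if depth >= max_depth or gini(y) == 0.0:
        return Counter(y).most_common(1)[0][0]
    split = best_split(X, y)
    if split is None:
        return Counter(y).most_common(1)[0][0]
    _, f, t = split
    li = [i for i, row in enumerate(X) if row[f] <= t]
    ri = [i for i, row in enumerate(X) if row[f] > t]
    return {
        "feature": f,
        "threshold": t,
        "left": build_tree([X[i] for i in li], [y[i] for i in li], depth + 1, max_depth),
        "right": build_tree([X[i] for i in ri], [y[i] for i in ri], depth + 1, max_depth),
    }


def predict(node, row):
    """Walk internal dict nodes until a leaf (a plain class label) is reached."""
    while isinstance(node, dict):
        node = node["left"] if row[node["feature"]] <= node["threshold"] else node["right"]
    return node
```

Swapping `gini` for an entropy function changes only the impurity measure; the split search and tree building stay the same. The `max_depth` and an optional minimum-samples check double as the pre-pruning controls mentioned in task 3.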
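The bagging logic in task 4 can be sketched like this. For brevity the sketch bags scikit-learn's `DecisionTreeClassifier` (in the assignment you would bag your own tree from task 1); `max_features="sqrt"` adds the per-split feature subsampling that distinguishes a random forest from plain bagging. The class name `SimpleRandomForest` is illustrative.

```python
# Random forest via bagging: each tree is trained on a bootstrap sample
# (drawn with replacement), and predictions are combined by majority vote.
import numpy as np
from collections import Counter
from sklearn.tree import DecisionTreeClassifier


class SimpleRandomForest:
    def __init__(self, n_trees=25, random_state=0):
        self.n_trees = n_trees
        self.rng = np.random.default_rng(random_state)
        self.trees = []

    def fit(self, X, y):
        n = len(X)
        for i in range(self.n_trees):
            idx = self.rng.integers(0, n, size=n)  # bootstrap sample
            tree = DecisionTreeClassifier(max_features="sqrt", random_state=i)
            tree.fit(X[idx], y[idx])
            self.trees.append(tree)
        return self

    def predict(self, X):
        votes = np.stack([t.predict(X) for t in self.trees])  # (n_trees, n_samples)
        return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```

Because each tree sees a different bootstrap sample (and a random feature subset at each split), the individual trees are decorrelated, and the majority vote typically reduces variance relative to a single tree, which is the effect task 4 asks you to discuss.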
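The metrics required in tasks 4 and 5 can be computed with a small helper like the one below, treating class 1 (e.g. "high sales") as the positive class. The function name and return format are assumptions for illustration.

```python
# Binary classification metrics: accuracy, precision, recall, and F1-score,
# computed from true/false positive and false negative counts.
def classification_metrics(y_true, y_pred, positive=1):
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}
```

Running this helper on the predictions from each of the three models gives directly comparable numbers for the table or discussion in task 5.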