CMU 10601 ML Learning Hub
Comprehensive guide to machine learning algorithms from Carnegie Mellon's Introduction to Machine Learning course. Learn the theory, see real-world applications, then experiment with interactive visualizations.
Decision Tree
Supervised Learning
What is it?
A tree-like model that makes decisions based on feature thresholds. Each internal node represents a test on an attribute, each branch represents the outcome of the test, and each leaf node represents a class label.
Grocery Store Example
A grocery store uses decision trees to determine optimal product placement. If customer_age > 45 AND cart_value > $50, place organic products at eye level.
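A rule like this is just a path from the root of a decision tree to a leaf. A minimal sketch of the placement rule above as a two-test decision stump (the field names `customer_age` and `cart_value_usd` are hypothetical, not from any real system):

```python
def placement(customer_age, cart_value_usd):
    """Illustrative decision stump for the grocery-store rule.

    Each `if` corresponds to one internal node's threshold test;
    the returned strings are the leaf labels.
    """
    if customer_age > 45 and cart_value_usd > 50:
        return "eye level"
    return "standard shelf"
```

Tracing an input through the function mirrors how a fitted tree classifies a sample: one threshold comparison per internal node along a single root-to-leaf path.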
Strengths
- Highly interpretable
- Handles both numerical and categorical data
- Requires little data preparation
- Can model non-linear relationships
Weaknesses
- Prone to overfitting
- Unstable (small changes in the data can produce very different trees)
- Biased toward features with more levels
Common Use Cases
Mathematical Theory
Decision trees use recursive binary splitting to partition the feature space. The goal is to create homogeneous subsets using measures like Gini impurity, entropy, or information gain. The tree is built top-down using a greedy approach.
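The impurity measures named above can be sketched directly from their definitions: Gini impurity is 1 − Σ p_k², entropy is −Σ p_k log₂ p_k, and information gain is the drop in entropy from parent to weighted children.

```python
import math
from collections import Counter

def gini(labels):
    # Gini impurity: 1 - sum of squared class proportions.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    # Shannon entropy in bits: -sum(p_k * log2(p_k)).
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, left, right):
    # Entropy of the parent minus the size-weighted entropy of the children.
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted
```

For a perfectly balanced binary set such as `["a", "a", "b", "b"]`, Gini impurity is 0.5 and entropy is 1 bit; a split that separates the classes cleanly yields an information gain of 1 bit.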
Time complexity: O(m · n log n) training (n samples, m features), O(log n) per prediction for a balanced tree
Space complexity: O(n)
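The greedy, top-down construction described above reduces to repeatedly finding the single best (feature, threshold) split; scanning every candidate threshold of every feature is what drives the roughly O(m · n log n) per-node training cost. A minimal sketch of that inner search using weighted Gini impurity (not a library API, just an illustration):

```python
def best_split(X, y):
    """Greedy search for the best single (feature, threshold) split.

    X is a list of n feature rows, y the matching class labels.
    Sorting each of the m feature columns and scanning midpoints between
    consecutive values is the O(m * n log n) step of tree training.
    Returns (feature_index, threshold, weighted_gini).
    """
    def gini(labels):
        n = len(labels)
        counts = {}
        for lab in labels:
            counts[lab] = counts.get(lab, 0) + 1
        return 1.0 - sum((c / n) ** 2 for c in counts.values())

    n, m = len(X), len(X[0])
    best = (None, None, float("inf"))
    for j in range(m):
        values = sorted(set(row[j] for row in X))   # O(n log n) per feature
        for lo, hi in zip(values, values[1:]):
            t = (lo + hi) / 2                       # candidate threshold
            left = [y[i] for i in range(n) if X[i][j] <= t]
            right = [y[i] for i in range(n) if X[i][j] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (j, t, score)
    return best
```

Building a full tree applies this search recursively to each resulting subset until a leaf is pure or a stopping rule (e.g. maximum depth) fires, which is exactly the greedy top-down procedure described above.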