Decision Trees are a fundamental supervised learning method used for both classification and regression. They recursively partition data based on feature values, producing a tree-like structure of decisions: each internal node represents a test on an attribute, each branch represents an outcome of that test, and each leaf node represents a class label (for classification) or a continuous value (for regression). In the landscape of predictive modeling, they serve as a foundational technique, often forming the basis for more complex ensemble methods like Random Forests.
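The classification workflow described above can be sketched as follows, a minimal example assuming scikit-learn is available; the dataset choice and `max_depth` value are illustrative, not prescriptive.

```python
# Minimal decision tree classification sketch (assumes scikit-learn is installed).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Load a small example dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each internal node tests one feature against a threshold;
# max_depth limits how deep the recursive partitioning goes.
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)

accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Limiting `max_depth` is a common way to control overfitting, since an unconstrained tree can grow until every leaf is pure and memorizes the training data.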
| Alternative | Difference | Papers (with Decision Trees) | Avg viability |
|---|---|---|---|
| Random Forests | — | 1 | — |
| Semantic Embeddings | — | 1 | — |