Decision tree. A decision tree is pruned in the hope of obtaining a tree that generalizes better to independent test data. (The pruned tree may perform worse on the training data, but generalization is the goal.)
See Information gain and Overfitting for an example; sometimes simplifying a decision tree gives better results. Classification and regression trees (CART) is one of the most well-established machine learning techniques. Pruning can be done in two ways: pruning (post-pruning), where a complete tree is grown and branches are then removed, and early stopping (pre-pruning), where tree growth is halted before the tree becomes overly complex.
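As a minimal sketch of pre-pruning (early stopping), the toy tree builder below stops splitting once a depth limit is reached, the node is pure, or too few samples remain. The 1-D threshold dataset, function names, and stopping rules are illustrative assumptions, not from any particular library:

```python
def grow_tree(xs, ys, depth=0, max_depth=2, min_samples=2):
    """Grow a threshold-split tree, stopping early (pre-pruning) when
    max_depth is reached, the node is pure, or too few samples remain."""
    majority = max(set(ys), key=ys.count)
    # Early-stopping (pre-pruning) conditions: emit a leaf instead of splitting.
    if depth >= max_depth or len(set(ys)) == 1 or len(ys) < min_samples:
        return {"leaf": majority}
    # Pick the threshold with the lowest classification error.
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        err = sum(y != max(set(left), key=left.count) for y in left) \
            + sum(y != max(set(right), key=right.count) for y in right)
        if best is None or err < best[0]:
            best = (err, t)
    if best is None:
        return {"leaf": majority}
    t = best[1]
    lx, ly = zip(*[(x, y) for x, y in zip(xs, ys) if x <= t])
    rx, ry = zip(*[(x, y) for x, y in zip(xs, ys) if x > t])
    return {"threshold": t,
            "left": grow_tree(list(lx), list(ly), depth + 1, max_depth, min_samples),
            "right": grow_tree(list(rx), list(ry), depth + 1, max_depth, min_samples)}

def predict(node, x):
    # Walk down the tree until a leaf is reached.
    while "leaf" not in node:
        node = node["left"] if x <= node["threshold"] else node["right"]
    return node["leaf"]

# With max_depth=1, growth stops after a single split.
tree = grow_tree([1, 2, 3, 4, 5, 6], [0, 0, 0, 1, 1, 1], max_depth=1)
```

The depth limit here plays the same role as hyperparameters such as a maximum depth or a minimum number of samples per leaf in standard tree learners: the tree is kept simple by never growing the complex version in the first place.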
The data is divided into 3 subsets: (1) training a complete tree; (2) pruning; (3) testing the final model.
You train a complete tree on subset (1) and apply it to subset (2) to calculate its accuracy. Then you prune the tree at a node, apply the pruned tree to subset (2) to calculate another accuracy, and keep the prune only if the accuracy does not decrease.
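This procedure (reduced-error pruning) can be sketched as follows. The nested-dict tree representation, the stored majority labels, and the pruning-set data are illustrative assumptions standing in for a tree trained on subset (1):

```python
def predict(node, x):
    # Walk down the tree until a leaf is reached.
    while "leaf" not in node:
        node = node["left"] if x <= node["threshold"] else node["right"]
    return node["leaf"]

def accuracy(tree, data):
    return sum(predict(tree, x) == y for x, y in data) / len(data)

def prune(tree, node, data):
    """Bottom-up reduced-error pruning: collapse a subtree into a leaf
    (its majority label) whenever that does not reduce accuracy on the
    pruning subset (2)."""
    if "leaf" in node:
        return
    prune(tree, node["left"], data)
    prune(tree, node["right"], data)
    before = accuracy(tree, data)
    saved = dict(node)                 # remember the subtree
    node.clear()
    node["leaf"] = saved["majority"]   # tentatively replace it with a leaf
    if accuracy(tree, data) < before:  # pruning hurt on subset (2): undo it
        node.clear()
        node.update(saved)

# A complete tree grown on subset (1); the x <= 6 split merely
# memorizes a noisy training point.
tree = {"threshold": 3, "majority": 0,
        "left": {"leaf": 0},
        "right": {"threshold": 5, "majority": 1,
                  "left": {"leaf": 1},
                  "right": {"threshold": 6, "majority": 1,
                            "left": {"leaf": 0},
                            "right": {"leaf": 1}}}}

pruning_set = [(1, 0), (2, 0), (4, 1), (5, 1), (6, 1), (7, 1)]
prune(tree, tree, pruning_set)  # the noisy branch collapses away
```

Note that ties go to the simpler tree: a prune that leaves accuracy on subset (2) unchanged is kept, since the smaller tree is preferred when performance is equal. Subset (3) is touched only once, to report the final model's accuracy.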