The RandomForestClassifier is trained using bootstrap aggregation, where each new tree is fit on a bootstrap sample of the training observations. The out-of-bag (OOB) error is the average prediction error for each training observation, calculated using only the trees that did not include that observation in their bootstrap sample. This allows the RandomForestClassifier to be both fit and validated during training, without a separate hold-out set.
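A minimal sketch of this in scikit-learn: setting `oob_score=True` makes the fitted model expose its out-of-bag accuracy as `oob_score_`. The dataset here is synthetic (`make_classification`) purely for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic data, just for demonstration
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# bootstrap=True draws a bootstrap sample per tree; oob_score=True
# scores each observation using only the trees that did not see it
clf = RandomForestClassifier(n_estimators=100, bootstrap=True,
                             oob_score=True, random_state=0)
clf.fit(X, y)
print(f"OOB accuracy: {clf.oob_score_:.3f}")
```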
In this video, I show how each tree ends up being trained on only about 63% of your training data when the OOB option is set to true in a Random Forest: a bootstrap sample of size n contains, on average, roughly 1 − 1/e ≈ 63.2% of the unique training observations.
To view the video:
- Click here
- Click on the image below