ageron / handson-ml3

A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.
Apache License 2.0

[BUG] Chapter 7 (07_ensemble_learning_and_random_forests) in Jupyter Notebook #117

Closed DaveBhatt closed 9 months ago

DaveBhatt commented 10 months ago

Describe the bug

In 07_ensemble_learning_and_random_forests notebook

Exercise Solutions

  1. If your AdaBoost ensemble underfits the training data, you can try increasing the number of estimators or reducing the regularization hyperparameters of the base estimator. You may also try slightly decreasing the learning rate.
  2. If your Gradient Boosting ensemble overfits the training set, you should try increasing the learning rate. You could also use early stopping to find the right number of predictors (you probably have too many).
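The early-stopping part of solution 2 (finding the right number of predictors) can be sketched with scikit-learn's `staged_predict`; the dataset and hyperparameters below are made up for illustration:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic regression data, purely for illustration
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=42)

gbrt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.1,
                                 random_state=42)
gbrt.fit(X_train, y_train)

# staged_predict yields predictions after each boosting stage,
# so we can measure the validation error for every tree count
errors = [mean_squared_error(y_val, y_pred)
          for y_pred in gbrt.staged_predict(X_val)]
best_n = int(np.argmin(errors)) + 1  # number of trees with lowest val error
print(best_n)
```

Once you know `best_n`, you can retrain with `n_estimators=best_n`, or use the `n_iter_no_change` parameter to stop training automatically.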

Expected behavior

To reduce underfitting, I think we should try increasing the learning rate (not decreasing it), and vice versa. Please correct me if I'm wrong.

Screenshots

(screenshot of the exercise solutions, dated 2023-12-26)

ageron commented 9 months ago

Thanks for your question. Good catch, that's a mistake: those words should be reversed. Decreasing the learning rate means that each estimator gets less weight, so the model is less likely to overfit. I'll fix that now.
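A quick sketch of the corrected point, with a made-up noisy dataset: a lower `learning_rate` shrinks each tree's contribution, which typically narrows the gap between training and test scores (i.e., less overfitting):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Noisy synthetic data so overfitting is easy to provoke
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# learning_rate=1.0: each tree gets full weight, more prone to overfit
gb_fast = GradientBoostingRegressor(learning_rate=1.0, n_estimators=200,
                                    random_state=42).fit(X_train, y_train)

# learning_rate=0.1: each tree contributes less, reducing overfitting
gb_slow = GradientBoostingRegressor(learning_rate=0.1, n_estimators=200,
                                    random_state=42).fit(X_train, y_train)

# Train/test R^2 gap as a rough overfitting measure
gap_fast = gb_fast.score(X_train, y_train) - gb_fast.score(X_test, y_test)
gap_slow = gb_slow.score(X_train, y_train) - gb_slow.score(X_test, y_test)
print(gap_fast, gap_slow)  # the low-learning-rate gap should be smaller
```

The same intuition applies in reverse to the AdaBoost case: if the ensemble underfits, a higher learning rate (or more estimators) lets it fit the training data more closely.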