AdaBoost vs. Gradient Boosting
AdaBoost was the first boosting algorithm, designed around one particular loss function (exponential loss); it reweights training samples so each new weak learner focuses on the examples the previous ones misclassified. Gradient Boosting is a generic algorithm that fits each new learner to the gradient of any differentiable loss function, approximating the model that minimizes that loss.
Because the loss can be swapped out, Gradient Boosting is more flexible than AdaBoost.
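A minimal sketch of the comparison above using scikit-learn's two ensemble classes on a synthetic dataset (the dataset and hyperparameters are illustrative assumptions, not from the notes):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy dataset, purely for illustration.
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# AdaBoost: reweights samples; tied to the exponential loss.
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Gradient Boosting: fits each tree to the gradient of a chosen
# differentiable loss (the `loss` parameter can be swapped, which is
# the flexibility point made above).
gbm = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print("AdaBoost test accuracy:", ada.score(X_te, y_te))
print("GradientBoosting test accuracy:", gbm.score(X_te, y_te))
```

Both models expose the same fit/score interface; the conceptual difference is in how each round's weak learner is targeted.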
Gradient
The gradient of a function is the vector of all its partial derivatives at a given point (a repeat, I know; just trying to nail down the wording).
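A quick numerical check of that definition. For f(x, y) = x² + 3y the partial derivatives are (2x, 3), so the gradient at (2, 1) should be (4, 3); the helper below (a hypothetical name, not from any library) approximates it with central differences:

```python
import numpy as np

def f(v):
    # f(x, y) = x^2 + 3y
    x, y = v
    return x**2 + 3*y

def numerical_gradient(func, v, h=1e-6):
    # Approximate each partial derivative with a central difference,
    # perturbing one coordinate at a time.
    grad = np.zeros_like(v, dtype=float)
    for i in range(len(v)):
        step = np.zeros_like(v, dtype=float)
        step[i] = h
        grad[i] = (func(v + step) - func(v - step)) / (2 * h)
    return grad

point = np.array([2.0, 1.0])
print(numerical_gradient(f, point))  # approximately [4., 3.]
```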
Adjusted R2
Agglomerative Clustering
AIC
This is great! Good fodder for interview questions. AUC is one I typically poke at; R-squared and adjusted R-squared as well.
Discuss flashcards reviewed and list any questions.