-
## Description
With all parameters the same except **data_sample_strategy**, the training time with `data_sample_strategy = bagging` is 194s, while that with parameter `data_sample_strat…
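For context, a minimal sketch of the kind of timing comparison described above, assuming a LightGBM version (>= 4.0) that supports the `data_sample_strategy` parameter, a synthetic dataset in place of the real one, and `goss` as the second strategy being compared (an assumption, since the excerpt is cut off):

```python
import time

import numpy as np
import lightgbm as lgb

# Synthetic stand-in for the original dataset (assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 50))
y = rng.normal(size=100_000)

base_params = {"objective": "regression", "num_leaves": 63, "verbosity": -1}

for strategy in ("bagging", "goss"):
    params = dict(base_params, data_sample_strategy=strategy)
    train_set = lgb.Dataset(X, label=y)
    start = time.perf_counter()
    lgb.train(params, train_set, num_boost_round=200)
    print(f"{strategy}: {time.perf_counter() - start:.1f}s")
```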
-
I wasn't quite sure whether I understood this sentence correctly:
> Gradient boosting machines were applied with 10-fold cross-validation such that every sample was left out once.
If I understand …
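My current reading is the standard 10-fold setup sketched below, where the folds partition the data so that every sample lands in the held-out fold exactly once; the dataset and model here are placeholders, not taken from the paper:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import KFold

X, y = make_regression(n_samples=200, n_features=10, random_state=0)
kf = KFold(n_splits=10, shuffle=True, random_state=0)

oof_pred = np.empty_like(y)
for train_idx, test_idx in kf.split(X):
    model = GradientBoostingRegressor().fit(X[train_idx], y[train_idx])
    # Predictions for the held-out fold only.
    oof_pred[test_idx] = model.predict(X[test_idx])

# Each sample receives exactly one out-of-fold prediction,
# i.e. every sample is "left out" exactly once across the 10 folds.
```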
-
Hi there,
I'm working on a regression problem with LightGBM 2.2.3, and I encountered a weird issue:
If I add a constant to all features, or multiply them all by a positive constant, the predicted value woul…
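For reproducibility, a minimal sketch of the kind of experiment I mean, using synthetic data (my real data and parameters differ, and the shift constant 10.0 is arbitrary):

```python
import numpy as np
import lightgbm as lgb

rng = np.random.RandomState(42)
X = rng.rand(1000, 5)
y = X @ rng.rand(5) + 0.1 * rng.randn(1000)

params = dict(random_state=42, n_estimators=100)

# Model trained and evaluated on the original features.
pred_a = lgb.LGBMRegressor(**params).fit(X, y).predict(X)

# Same data with every feature shifted by a constant; since tree splits are
# invariant to monotone feature transforms, I expected matching predictions.
X_shift = X + 10.0
pred_b = lgb.LGBMRegressor(**params).fit(X_shift, y).predict(X_shift)

print(np.abs(pred_a - pred_b).max())
```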
-
File "C:\************\.venv\Lib\site-packages\Cython\Build\Dependencies.py", line 1125, in cythonize
result.get(99999) # seconds
^^^^^^^^^^^^^^^^^
File "C:\************\Python\…
-
### Aim
To predict housing prices in Boston using the CatBoost Regression model.
### Details
The project utilizes the CatBoost Regressor, a powerful gradient boosting algorithm, to predict housing …
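A minimal sketch of the core training step, assuming the Boston housing data is available as a local CSV with `MEDV` as the target column (file name, column name, and hyperparameters are illustrative):

```python
import pandas as pd
from catboost import CatBoostRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Placeholder path and target column for the Boston housing data.
df = pd.read_csv("boston_housing.csv")
X, y = df.drop(columns=["MEDV"]), df["MEDV"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

model = CatBoostRegressor(iterations=500, learning_rate=0.05, depth=6, verbose=100)
model.fit(X_train, y_train, eval_set=(X_test, y_test))

print("Test RMSE:", mean_squared_error(y_test, model.predict(X_test)) ** 0.5)
```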
-
Train various ML models, such as Polynomial Regression and Neural Networks, on the dataset divided in a 70:30 ratio for training and testing.
Report your observations to me in the form of a presentation …
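To make the expected setup concrete, a minimal sketch of the 70:30 split and one candidate model (polynomial regression); the dataset path and target column are placeholders:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Placeholder dataset and target column name.
df = pd.read_csv("data.csv")
X, y = df.drop(columns=["target"]), df["target"]

# 70:30 split for training and testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

poly_reg = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
poly_reg.fit(X_train, y_train)
print("Polynomial regression R^2:", r2_score(y_test, poly_reg.predict(X_test)))
```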
-
I see
The author introduces a simple alternative to XGBoost
from
https://mltechniques.com/product/ebook-synthetic-data/
could you clarify what you mean?
Is it
2.2.1 How hidden decision tre…
-
## Description
I'm building a main classifier model (LightGBM), together with an adversary model. I tried to use lightgbm.LGBMRegressor to build the model tree by tree, so that I can update my main classifier …
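The pattern I'm attempting looks roughly like the sketch below, growing the booster one tree at a time by passing the previous booster as `init_model` (this assumes a LightGBM version whose scikit-learn interface accepts `init_model` in `fit`; the data and the adversary step are placeholders):

```python
import numpy as np
import lightgbm as lgb

rng = np.random.RandomState(0)
X = rng.rand(500, 10)
y = rng.rand(500)

model = None
for step in range(10):
    booster = lgb.LGBMRegressor(n_estimators=1)
    # Continue training from the previous booster, adding one more tree.
    booster.fit(X, y, init_model=None if model is None else model.booster_)
    model = booster

    # Placeholder: here the adversary model would be updated using the
    # current predictions before the next tree is added.
    current_pred = model.predict(X)
```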
-
## Summary
Partial dependencies can be computed directly from the data distribution in the leaves of the trees in the booster, rather than by calling `predict`; see references below. This tends to …
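For comparison, the current `predict`-based approach looks roughly like the sketch below (data, grid, and parameters are illustrative); the proposal is to replace these repeated `predict` calls with a single weighted walk over the trees using the per-leaf training-data counts:

```python
import numpy as np
import lightgbm as lgb

rng = np.random.RandomState(0)
X = rng.rand(2000, 5)
y = X[:, 0] ** 2 + 0.1 * rng.rand(2000)
booster = lgb.train({"objective": "regression", "verbosity": -1},
                    lgb.Dataset(X, label=y), num_boost_round=50)

# Brute-force partial dependence for feature 0: for each grid value,
# overwrite that feature in every row and average the predictions.
grid = np.linspace(X[:, 0].min(), X[:, 0].max(), 20)
pd_values = []
for v in grid:
    X_mod = X.copy()
    X_mod[:, 0] = v
    pd_values.append(booster.predict(X_mod).mean())

# The leaf-based method would instead traverse each tree once, weighting
# leaf values by the fraction of training data falling into each leaf.
```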
-
Try to use this model to make predictions; this is explained better [here](https://docs.google.com/document/d/19UnRkW-gWVjOWSk2-YzcHqdsUt9YW9NfKujpAW1ApMk/edit). Ideally we would be using [this](https://githu…