Nixtla / mlforecast

Scalable machine 🤖 learning for time series forecasting.
https://nixtlaverse.nixtla.io/mlforecast
Apache License 2.0

Use XGBoost DMatrix to train and predict #219

Closed sankara-nps closed 10 months ago

sankara-nps commented 11 months ago

Description

It would be nice to have support for passing data as a DMatrix to XGBoost models, which could potentially help improve the XGBoost model predictions.

Use case

import xgboost as xgb

# Assuming you have your features and target variable ready (X and y)
# Combine them into a DMatrix
dtrain = xgb.DMatrix(X, label=y)

# Define XGBoost parameters with Pseudo-Huber loss
params = {
    'objective': 'reg:pseudohubererror',  # Pseudo-Huber loss
    'alpha': 1.2,  # L1 regularization term on weights
    'eta': 0.1,  # Learning rate
    'max_depth': 6,  # Maximum depth of the tree
    'subsample': 0.8,  # Fraction of data used for building trees
    'colsample_bytree': 0.8,  # Fraction of features used for building trees
    'eval_metric': 'rmse'  # Evaluation metric (can be 'rmse' or other appropriate metrics)
}

# Train the XGBoost model using the entire dataset
num_round = 100  # Number of boosting rounds (you can adjust this)
model = xgb.train(params, dtrain, num_round)

# Make predictions on the same data used for training (no separate test set here)
predictions = model.predict(dtrain)

# predictions now contain the predicted values for your entire dataset

I want to pass this kind of DMatrix-based model to mlforecast.

jmoralez commented 11 months ago

Hey @sankara-nps, thanks for using mlforecast. Can you provide more details on what you mean with the following:

potentially help in improving the XGBoost model predictions

github-actions[bot] commented 10 months ago

This issue has been automatically closed because it has been awaiting a response for too long. When you have time to work with the maintainers to resolve this issue, please post a new comment and it will be re-opened. If the issue has been locked for editing by the time you return to it, please open a new issue and reference this one.