e-eight opened this issue 3 years ago
Hi @e-eight, thanks for reporting! Could you provide an example of how to use the LBFGS optimizer, for example:

```python
for batch_idx, (data, target) in enumerate(dataloader):
    # Here is your code
```
According to this introduction, it looks like using the LBFGS optimizer differs from using other optimizers.
You can try it this way:
```python
for batch_idx, (data, target) in enumerate(dataloader):
    # Code for sampling with replacement for bagging, or the corresponding
    # code for other models.

    # Optimization: LBFGS requires a closure that re-evaluates the model
    # and returns the loss, since it may call it several times per step.
    def closure():
        if torch.is_grad_enabled():
            optimizer.zero_grad()
        sampling_output = estimator(*sampling_data)
        loss = criterion(sampling_output, sampling_target)
        if loss.requires_grad:
            loss.backward()
        return loss

    optimizer.step(closure)

    # If you want to calculate the loss for monitoring:
    loss = closure()  # Re-evaluates the model; use this however preferred.
```
This way of optimizing should work with both LBFGS and other optimizers, such as Adam; at least it has worked for me with single estimators. You can find more details on the LBFGS optimizer here.
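To make the closure pattern above concrete, here is a tiny self-contained sketch (the data and the single-weight model are made up for illustration, not taken from the thread) that fits y = 2x with `torch.optim.LBFGS`:

```python
import torch

# Toy data: 20 points on the line y = 2x (purely illustrative).
x = torch.linspace(-1, 1, 20).unsqueeze(1)
y = 2.0 * x

# A single learnable weight standing in for the estimator.
w = torch.zeros(1, requires_grad=True)

optimizer = torch.optim.LBFGS([w], lr=0.5, max_iter=20)

def closure():
    # LBFGS may invoke this multiple times per step, so it must
    # re-evaluate the model and return the loss each call.
    if torch.is_grad_enabled():
        optimizer.zero_grad()
    loss = ((x * w - y) ** 2).mean()
    if loss.requires_grad:
        loss.backward()
    return loss

initial = closure().item()
optimizer.step(closure)
final = closure().item()
print(initial, final, w.item())
```

On this quadratic problem LBFGS converges essentially in one step, so the final loss should be near zero and `w` close to 2.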
Thanks for your explanation. After reading the introduction, I think there should be no problem supporting the LBFGS optimizer. Would you be interested in working on this feature request? ;-)
Sure, I will be happy to work on it! I will get started on it then, and comment here if I face any problems.
Glad to hear that 😄. Here are some instructions on what to do next:

- Update CHANGELOG.rst.
- Add the LBFGS optimizer to the set_optimizer method in torchensemble/utils/set_module.py.
- Update __set_optimizer_doc in torchensemble/_constants.py.
- Run torchensemble/fusion.py to see if the LBFGS optimizer and other optimizers both work as expected.

After that, we could modify the other ensembles similarly. Feel free to ask me anything in this issue or your pull request.
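For the set_optimizer step, the change likely boils down to extending a name-to-optimizer lookup. This is only a hedged sketch of that pattern; the actual code in torchensemble/utils/set_module.py may differ, and the registry below is a stand-in (dotted-path strings instead of real classes) so the sketch runs without torch installed:

```python
# Hypothetical whitelist of accepted optimizer names, mapping each name
# to the dotted path of its torch.optim class. The real implementation
# may instead call getattr(torch.optim, name) directly.
SUPPORTED_OPTIMIZERS = {
    "Adam": "torch.optim.Adam",
    "SGD": "torch.optim.SGD",
    "LBFGS": "torch.optim.LBFGS",  # newly added entry for this feature
}

def resolve_optimizer(name):
    """Return the dotted path of the optimizer class, or raise."""
    if name not in SUPPORTED_OPTIMIZERS:
        raise NotImplementedError(f"Unsupported optimizer: {name}")
    return SUPPORTED_OPTIMIZERS[name]
```

With the new entry in place, `resolve_optimizer("LBFGS")` succeeds instead of raising, which mirrors the whitelist behavior the issue describes.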
@all-contributors please add @e-eight for code
@xuyxu
I've put up a pull request to add @e-eight! :tada:
@e-eight it would be better to open a PR on your own.
I have written the code, but I am not sure about the best way to test it. I was thinking about running the Year Prediction example in the examples folder, but with the LBFGS optimizer. Do you have any suggestions? Thanks!
Hi @e-eight, I am not sure if I understand your problem correctly. Perhaps you could open a pull request based on your current code, and we can then have a discussion there. For now, there is no need to pass all checks, simply upload your code, so that I can take a look and better understand your problem ;-)
Added pull request #81.
Thanks @e-eight for your PR. Kind of busy these two days; I will get back to you soon.
Hi,

I am trying to use the BaggingRegressor model, with shallow estimators, on a small dataset for which the LBFGS optimizer usually gives good results with a single estimator. However, I see that the LBFGS optimizer in PyTorch is not included in the accepted list of optimizers for torchensemble. Would it be possible to add the LBFGS optimizer to the accepted list, or is there any way that I can use the LBFGS optimizer with torchensemble for my work?

Thanks!