Closed jmaspons closed 1 year ago
Thanks, how about passing parameters to the loss_function as well? Would it interfere?
I think it's not a problem as long as there are no matching parameter names. I'll add the dots to loss_function in this PR if you agree.
From ?DALEX::loss_default I don't see a use case for ... in loss_function, but here it is. We can revert the last commit before merging if needed.
Sure, I meant the case where someone passes a custom loss_function.
It would be useful to have a unit test for this custom predict/loss function with the ... argument.
I plan to merge this PR over the weekend.
I've been testing the changes and they don't work. It needs something like https://community.rstudio.com/t/dots-vs-arg-lists-for-function-forwarding/4995/2 to pass the dots to two functions.
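The linked thread's suggestion can be sketched roughly as follows: instead of one shared ..., the caller supplies an explicit argument list per callee. All function and argument names below are illustrative, not DALEX's actual API.

```r
# Hypothetical sketch: one `...` cannot be cleanly forwarded to two
# functions, so accept an explicit argument list for each callee instead.
run_both <- function(model, data, y,
                     predict_function, loss_function,
                     predict_args = list(), loss_args = list()) {
  predicted <- do.call(predict_function, c(list(model, data), predict_args))
  do.call(loss_function, c(list(y, predicted), loss_args))
}

# Toy callees standing in for a real predict/loss pair
pf <- function(model, newdata, batch_size = 32) rep(batch_size, nrow(newdata))
lf <- function(observed, predicted, na.rm = FALSE)
  mean(abs(observed - predicted), na.rm = na.rm)

run_both(NULL, data.frame(x = 1:3), c(1, 2, 3), pf, lf,
         predict_args = list(batch_size = 64))  # mean(abs(1:3 - 64)) = 62
```

With separate lists there is no ambiguity when the two functions happen to share a parameter name.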
Reverted the last commit that also passed ... to loss_function. It can be implemented in another PR following https://community.rstudio.com/t/dots-vs-arg-lists-for-function-forwarding/4995/2
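Alternatively, a single ... could be split between the two callees by matching names against each function's formals, roughly as discussed in the linked thread. This is only a sketch with made-up names, and it inherits the caveat above: an argument whose name matches a formal of both functions would be sent to both.

```r
# Hypothetical sketch: route each named argument in `...` to whichever
# callee declares a formal argument of that name.
split_dots <- function(model, data, y, predict_function, loss_function, ...) {
  dots <- list(...)
  pf_args <- dots[names(dots) %in% names(formals(predict_function))]
  lf_args <- dots[names(dots) %in% names(formals(loss_function))]
  predicted <- do.call(predict_function, c(list(model, data), pf_args))
  do.call(loss_function, c(list(y, predicted), lf_args))
}

pf <- function(model, newdata, batch_size = 32) rep(batch_size, nrow(newdata))
lf <- function(observed, predicted, na.rm = FALSE)
  mean(abs(observed - predicted), na.rm = na.rm)

# batch_size reaches pf, na.rm reaches lf
split_dots(NULL, data.frame(x = 1:3), c(1, 2, NA), pf, lf,
           batch_size = 10, na.rm = TRUE)  # mean(c(9, 8), na.rm) = 8.5
```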
@hbaniecki shall we merge this one?
Use case: speed up predictions with keras by passing a larger batch_size parameter (the default is 32).
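As a self-contained illustration of that use case, a custom predict_function can accept batch_size and forward it to predict(). A stub S3 class stands in for a real keras model here, since keras itself is not loaded; the stub's predict method just echoes batch_size so the forwarding is visible.

```r
# Hypothetical custom predict_function forwarding batch_size to predict()
predict_function <- function(model, newdata, batch_size = 32) {
  predict(model, newdata, batch_size = batch_size)
}

# Stub model standing in for a keras model (illustrative only)
stub <- structure(list(), class = "stub_model")
predict.stub_model <- function(object, newdata, batch_size = 32, ...) {
  rep(batch_size, nrow(newdata))
}

predict_function(stub, data.frame(x = 1:2), batch_size = 64)  # c(64, 64)
```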