Context for the issue:

With batching enabled, we expect the predict method to return a list of predictions. When the user doesn't implement LitAPI.unbatch, we wrap the output with list(output) before sending it to the encode_response method.

In this case list(prediction_output) was called on a string, which got split character by character. So we need to warn users about this case.
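For illustration, a minimal sketch of the fallback described above (the actual wrapping happens inside LitServe's batching loop):

```python
# Sketch of the default unbatch fallback: when LitAPI.unbatch is not
# implemented, the output of predict() is wrapped with list(output)
# before encode_response is called.
prediction_output = "hello"  # predict() returned a single string

# Iterating a string yields characters, so each "prediction" in the
# batch becomes one character instead of the full string.
unbatched = list(prediction_output)
print(unbatched)  # ['h', 'e', 'l', 'l', 'o']
```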
I'd like to fix that. @aniketmaurya
Will raise a PR. Thanks.
@grumpyp looking forward! pls let me know if you have any questions
Would you want to prevent a string output from being split like this, or can you think of cases where this is needed? If so, I'll only add a warning when the output is a string and batching is enabled. I could additionally introduce a parameter to enforce list-like outputs.
@grumpyp let's just print a warning for now and observe any new issues on this. You can add the logic here.
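A minimal sketch of what that warning could look like; _unbatch_fallback is a hypothetical stand-in for the spot in LitServe's batching loop where the wrapping happens, not an existing function:

```python
import warnings

def _unbatch_fallback(output):
    # Hypothetical stand-in for the place where the server wraps the
    # prediction output when LitAPI.unbatch is not implemented.
    if isinstance(output, str):
        warnings.warn(
            "predict() returned a string while batching is enabled; "
            "list(output) will split it character by character. Return "
            "a list of predictions or implement LitAPI.unbatch.",
            UserWarning,
        )
    return list(output)
```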
🐛 Bug
To Reproduce
Without batching everything works as expected, but with batching enabled the server returns just the first character of the prediction.
Code sample
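The original code sample was not preserved here; below is a minimal repro sketch based on the description above. EchoAPI and the "hello" payload are illustrative, not from the report:

```python
import litserve as ls

class EchoAPI(ls.LitAPI):
    # Illustrative API whose predict() returns a single string for the
    # whole batch instead of one prediction per request.
    def setup(self, device):
        pass

    def decode_request(self, request):
        return request["input"]

    def predict(self, inputs):
        # Should return a list with one prediction per batched request;
        # returning a bare string makes the default unbatch split it.
        return "hello"

    def encode_response(self, output):
        return {"output": output}

if __name__ == "__main__":
    # max_batch_size > 1 enables batching and the default unbatch path.
    server = ls.LitServer(EchoAPI(), max_batch_size=2, batch_timeout=0.05)
    server.run(port=8000)
```

With two concurrent requests, each response would then contain a single character (e.g. {"output": "h"}), matching the behavior described under To Reproduce.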
Expected behavior

With batching enabled, each request should receive the full prediction, not a single character.
Environment
If you published a Studio with your bug report, we can automatically get this information. Otherwise, please describe:
How you installed the library (conda, pip, source):

Additional context