JochenGuckSnk opened 1 year ago
Hi! Is there any news on, or plans for, adding the ability to pass transformers config parameters such as `max_length` through the Seldon HuggingFace server?
There are plans, but we haven't got around to it yet. We'll provide an update on this issue once it's done.
Alternatively, contributions are always welcome :wink:
@adriangonz Do you have any updates? Or approximate dates?
Hey @coctel99, I'm no longer working for Seldon and no longer contributing to MLServer so I don't have any extra context on this one.
@sakoush has taken the reins of the project, so it'd be best to check with him.
@ramonpzg for reference
Hi @coctel99 -- I am taking care of this one for the next release :)
If you are in the community, I'll send you a message once it is merged.
I am interested in using MLServer with HuggingFace models. To get the full potential of these models, it is essential to be able to modify the generation parameters (https://huggingface.co/docs/transformers/main_classes/text_generation). I have tried to change these parameters, but it was not straightforward. It might therefore be a good idea to extend the documentation with a section on how to modify parameters like `temperature` or `num_return_sequences`.
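For context, here is a sketch of what I'm working from. The `implementation`, `task`, and `pretrained_model` keys in this `model-settings.json` follow the documented HuggingFace runtime config; the generation settings at the bottom are hypothetical and only show where values like `max_length` or `temperature` would conceptually need to land. As far as I can tell they are not currently picked up by the runtime, which is exactly what this issue is asking for:

```json
{
  "name": "text-generator",
  "implementation": "mlserver_huggingface.HuggingFaceRuntime",
  "parameters": {
    "extra": {
      "task": "text-generation",
      "pretrained_model": "distilgpt2",
      "_comment": "hypothetical: generation kwargs like the ones below are the ask of this issue, not a supported feature",
      "max_length": 50,
      "temperature": 0.7,
      "num_return_sequences": 2
    }
  }
}
```

If someone knows of a working way to forward these kwargs today (e.g. per-request rather than in the settings file), a documented example of that would also resolve my use case.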