With `create_inference_endpoint`, it is possible to set a custom Docker image to run in the Inference Endpoint (typically a TGI container). When adding this in https://github.com/huggingface/huggingface_hub/issues/1861, I forgot to add it to `update_inference_endpoint`. This PR fixes that for consistency, and also saves poor @datavistics from regularly having bad dreams about it.
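For reference, the `custom_image` spec has the same shape on update as on create. A minimal sketch (the values are illustrative placeholders, and the actual call is shown as a comment since it needs a token and an existing endpoint):

```python
# Sketch: `custom_image` spec, same shape as accepted by
# `create_inference_endpoint` (values below are illustrative placeholders).
custom_image = {
    "health_route": "/health",
    "url": "ghcr.io/huggingface/text-generation-inference:latest",
    "env": {"MODEL_ID": "/repository"},
}

# With this PR, the same spec can now be passed on update as well
# (requires authentication and a deployed endpoint, so not executed here):
#
# from huggingface_hub import HfApi
# HfApi().update_inference_endpoint("my-endpoint", custom_image=custom_image)
```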
Fixes https://github.com/huggingface/huggingface_hub/issues/2301.