Description
The Hugging Face PyTorch DLCs for Inference support a wide variety of tasks, and the I/O payloads differ depending on which `HF_TASK` value is set. So, the `README.md` should be extended to at least mention the available environment variables and add some pointers to the I/O payloads for each task.

Additionally, within hf.co/docs/google-cloud (once published) we could add a section on how to run inference for all the supported scenarios, as right now the documentation is limited to the existing examples, which may not cover every single use case.
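As an illustration of why per-task payload pointers would help, here is a minimal sketch of how the request body changes with `HF_TASK`. It assumes a DLC running locally at `http://localhost:8080` with a POST `/predict` route (both hypothetical here) and payload shapes that mirror the corresponding `transformers` pipelines; the exact endpoint and field names should be confirmed against the container documentation.

```python
# Minimal sketch (assumptions: a Hugging Face PyTorch DLC for Inference running
# locally at http://localhost:8080 with a POST /predict route; payload shapes
# follow the corresponding `transformers` pipelines and should be verified
# against the container docs for each HF_TASK).
import requests

BASE_URL = "http://localhost:8080/predict"  # hypothetical local endpoint

# HF_TASK=text-classification: a plain string (or list of strings) as input
text_classification_payload = {"inputs": "I love this product!"}

# HF_TASK=question-answering: a dict with `question` and `context`
question_answering_payload = {
    "inputs": {
        "question": "What does HF_TASK control?",
        "context": "HF_TASK selects the transformers pipeline used by the DLC.",
    }
}

for payload in (text_classification_payload, question_answering_payload):
    response = requests.post(BASE_URL, json=payload)
    response.raise_for_status()
    print(response.json())
```

Collecting small examples like these per `HF_TASK` value is essentially what the proposed README section and docs page would cover.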