phuochungtr opened 2 weeks ago
Also, the Selector is missing the field "inputs" in the payload. Please help us verify the Selector method.
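For context, Hugging Face LLM containers on SageMaker (such as TGI) typically expect a JSON body with a top-level `inputs` field, which is the field reported missing above. A minimal sketch of the expected payload shape (the endpoint name and parameter values are hypothetical, not taken from this thread):

```python
import json

# Payload shape commonly expected by Hugging Face TGI containers on SageMaker.
# The top-level "inputs" field is the one reported missing from the Selector payload.
payload = {
    "inputs": "Hello, how are you?",
    "parameters": {"max_new_tokens": 256, "temperature": 0.7},
}

body = json.dumps(payload)

# Invoking the endpoint would then look roughly like this
# (endpoint name "my-llm-endpoint" is hypothetical):
# import boto3
# client = boto3.client("sagemaker-runtime")
# response = client.invoke_endpoint(
#     EndpointName="my-llm-endpoint",
#     ContentType="application/json",
#     Body=body,
# )

print("inputs" in json.loads(body))
```

If the provider builds a payload without this field, the container-side validation fails, which matches the symptom described here.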
How did you deploy your LLM? vLLM on SageMaker?
I got the model artifact from Hugging Face and created a SageMaker endpoint using that artifact.
https://github.com/aws-samples/llm_model_hub is the official way to deploy an LLM that is compatible with the Dify SageMaker LLM provider.
@phuochungtr I ran into the same issue before; it might be caused by an old version of the Docker image. You can try the method @ybalbert001 provided above. Otherwise, you can build your own Docker image and edit the Dockerfile in your Dify project.
When I tried to use our LLM model in a basic chatbot and "talk to bot", it says << LLMResult model input should be a valid string (input_value=None, input_type=None) >> regardless of which model is selected.
Can you tell us how you deployed Dify?