aws-samples / dify-aws-tool


Can't load LLM model #35

Open phuochungtr opened 2 weeks ago

phuochungtr commented 2 weeks ago

When I tried to use our LLM model in a basic chatbot and clicked "talk to bot", it failed with `LLMResult model input should be a valid string (input_value=None, input_type=None)`, regardless of which model was selected.
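For context, the quoted message looks like a pydantic v2 validation error: Dify's `LLMResult` object received `None` where a string was required, which usually means the provider got an unexpected response from the endpoint. A minimal sketch of that failure mode, using a simplified stand-in for the real class:

```python
# Minimal reproduction of this class of error, assuming it is a pydantic v2
# validation failure. `LLMResult` here is an illustrative stand-in, not
# Dify's actual definition.
from pydantic import BaseModel, ValidationError

class LLMResult(BaseModel):
    model: str  # the model name must be a string; None is rejected

try:
    LLMResult(model=None)  # provider passed None instead of the model name
except ValidationError as e:
    print(e)
    # 1 validation error for LLMResult
    # model
    #   Input should be a valid string [type=string_type, input_value=None, ...]
```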

phuochungtr commented 2 weeks ago

Also, the Selector is missing the field "inputs" in the payload. Please help us verify the Selector method.
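One way to narrow this down is to call the endpoint directly with the standard Hugging Face-style payload, bypassing Dify entirely. A minimal sketch with boto3 (endpoint name, region, and parameters are placeholders):

```python
# Check whether the SageMaker endpoint itself accepts a payload with an
# "inputs" field, independent of the Dify provider. Names are placeholders.
import json
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

response = runtime.invoke_endpoint(
    EndpointName="my-llm-endpoint",       # placeholder endpoint name
    ContentType="application/json",
    Body=json.dumps({
        "inputs": "Hello, who are you?",  # the field reported as missing
        "parameters": {"max_new_tokens": 128},
    }),
)
print(response["Body"].read().decode("utf-8"))
```

If this call also fails, the problem is in how the endpoint was deployed rather than in Dify's Selector.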

ybalbert001 commented 2 weeks ago

How did you deploy your LLM? vLLM on SageMaker?

phuochungtr commented 2 weeks ago

I got the model artifact from Hugging Face and created a SageMaker endpoint from that artifact.
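For reference, a minimal sketch of that kind of deployment with the SageMaker Python SDK, assuming the artifact is packaged as a model.tar.gz in S3 (bucket, role ARN, framework versions, and instance type are placeholders):

```python
# Deploy a Hugging Face model artifact from S3 to a SageMaker endpoint.
# All names and versions below are placeholders for illustration.
from sagemaker.huggingface import HuggingFaceModel

model = HuggingFaceModel(
    model_data="s3://my-bucket/model.tar.gz",             # artifact in S3
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # execution role
    transformers_version="4.37",
    pytorch_version="2.1",
    py_version="py310",
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",  # GPU instance for LLM inference
)
```

Note that an endpoint deployed this way serves the default Hugging Face inference handler, whose request/response format may not match what the Dify SageMaker provider expects.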

ybalbert001 commented 2 weeks ago

https://github.com/aws-samples/llm_model_hub is the official way to deploy an LLM that is compatible with the Dify SageMaker LLM provider.

warren830 commented 1 week ago

> https://github.com/aws-samples/llm_model_hub is the official way to deploy an LLM that is compatible with the Dify SageMaker LLM provider.

@phuochungtr I ran into the same issue before; it might be caused by an old version of the Docker image. You can try the method @ybalbert001 suggested above. Otherwise, you can build your own Docker image and edit the Dockerfile in your Dify project.

warren830 commented 1 week ago

> When I tried to use our LLM model in a basic chatbot and clicked "talk to bot", it failed with `LLMResult model input should be a valid string (input_value=None, input_type=None)`, regardless of which model was selected.

Can you tell us how you deployed Dify?