crewAIInc / crewAI-examples


Local LLMs without using Ollama - with vLLM or Hugging Face #83

Closed rajeshkochi444 closed 1 week ago

rajeshkochi444 commented 5 months ago

Hi,

Can we use local LLMs through vLLM or Hugging Face, without using Ollama?

Thanks,
Rajesh
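For anyone finding this later: since vLLM exposes an OpenAI-compatible server, one common approach is to point a CrewAI agent at it through LangChain's ChatOpenAI wrapper. This is only a minimal sketch, not an official answer from the maintainers; the endpoint URL, port, and model name below are assumptions about a typical local setup.

```python
# Minimal sketch: CrewAI agent backed by a local vLLM server.
# Assumes vLLM is already serving its OpenAI-compatible API, e.g.:
#   python -m vllm.entrypoints.openai.api_server --model mistralai/Mistral-7B-Instruct-v0.2
from crewai import Agent, Task, Crew
from langchain_openai import ChatOpenAI

# vLLM ignores the API key, but the OpenAI client requires a non-empty value.
local_llm = ChatOpenAI(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # must match the model vLLM serves (assumed)
    openai_api_base="http://localhost:8000/v1",  # default vLLM server address (assumed)
    openai_api_key="EMPTY",
)

researcher = Agent(
    role="Researcher",
    goal="Answer technical questions concisely",
    backstory="An assistant backed entirely by a locally hosted model.",
    llm=local_llm,  # CrewAI agents accept a LangChain chat model here
)

task = Task(
    description="Explain in two sentences what vLLM is.",
    expected_output="A two-sentence explanation.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task])
print(crew.kickoff())
```

The same pattern should work for any server that speaks the OpenAI API, which is why no Ollama-specific integration is needed.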

FBR65 commented 3 months ago

I'm also interested in vLLM connections.
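For the Hugging Face half of the original question, a model can also be loaded in-process with LangChain's HuggingFacePipeline wrapper, with no server involved at all. Again a sketch under assumptions: the checkpoint name and generation settings are illustrative, and it presumes CrewAI accepts a LangChain completion LLM the same way it accepts a chat model.

```python
# Minimal sketch: CrewAI agent running a Hugging Face model in-process.
from crewai import Agent
from langchain_community.llms import HuggingFacePipeline

# Loads the tokenizer and model locally via transformers; runs fully offline.
local_llm = HuggingFacePipeline.from_model_id(
    model_id="mistralai/Mistral-7B-Instruct-v0.2",  # assumed checkpoint
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 512},
)

assistant = Agent(
    role="Assistant",
    goal="Answer questions using only a locally loaded model",
    backstory="Runs entirely on local hardware.",
    llm=local_llm,
)
```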

github-actions[bot] commented 1 week ago

This issue is stale because it has been open for 30 days with no activity. Remove the stale label or comment, or this will be closed in 5 days.

github-actions[bot] commented 1 week ago

This issue was closed because it has been stale for 5 days with no activity.