crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

Running on Google Colab with local models? #64

Closed · timtensor closed this 7 months ago

timtensor commented 8 months ago

Hi, thanks for the Ollama integration for local models. I was wondering if it's possible to use local models in the Colab environment? The reason is that not everyone has access to machines with good processing power. If I understand correctly, Ollama just downloads a model, doesn't it? A notebook example would be great.

I think for creative purposes, for example, there are some good Mistral 7B models.

Thanks in advance.
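
For anyone searching later, a minimal sketch of how Ollama could be started from a Colab-style notebook. The install script URL, the `serve`/`pull` commands, and the default port are Ollama's documented ones, but running it this way in Colab is untested here; treat it as a starting point, not a recipe.

```python
# Sketch: install and run Ollama inside a Colab-style notebook (untested).
import subprocess
import time

# Install Ollama using its documented Linux install script.
subprocess.run("curl -fsSL https://ollama.com/install.sh | sh",
               shell=True, check=True)

# Start the server in the background; it listens on http://localhost:11434
# by default.
server = subprocess.Popen(["ollama", "serve"])
time.sleep(5)  # give the server a moment to come up

# Download a model, e.g. an OpenHermes (Mistral 7B based) build.
subprocess.run(["ollama", "pull", "openhermes"], check=True)
```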

jbdatascience commented 8 months ago

> Hi, thanks for the Ollama integration for local models. I was wondering if it's possible to use local models in the Colab environment? The reason is that not everyone has access to machines with good processing power. If I understand correctly, Ollama just downloads a model, doesn't it? A notebook example would be great.
>
> I think for creative purposes, for example, there are some good Mistral 7B models.
>
> Thanks in advance.

Yesterday I tried a demo example in the free version of Colab and ran into all kinds of problems (the trip advisor example using OpenHermes as the LLM with Ollama, since that was advised here in the GitHub issues section). At one point it seemed to be working, but then it complained that the API endpoint was already in use. I restarted the whole Colab notebook from scratch: same problem.

At that point I stopped, not having a clue.

Perhaps it will run in a local Jupyter notebook on my local PC, with the OpenHermes LLM being served by LM Studio? I will try it out next week or so!
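
On the "API endpoint already in use" error: that message usually means a second `ollama serve` is trying to bind a port that a first instance already holds. A small sketch, assuming Ollama's default port 11434, that avoids starting a duplicate server:

```python
import socket
import subprocess

def port_in_use(host: str = "127.0.0.1", port: int = 11434) -> bool:
    """True if something already listens on Ollama's default port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0

# Only launch a server if none is running yet; otherwise reuse the
# existing one instead of colliding with it.
if not port_in_use():
    subprocess.Popen(["ollama", "serve"])
```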

joaomdmoura commented 7 months ago

I haven't tried running it in Google Colab, but it sounds like an infra/setup problem? You can use LM Studio; crew would support that, and it would also support external APIs like Hugging Face. Let me know how that goes. Meanwhile I'm closing this, but happy to reopen if I can help.
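
For the LM Studio route, a minimal sketch: LM Studio exposes an OpenAI-compatible server (by default at http://localhost:1234/v1), so a LangChain `ChatOpenAI` client pointed at that base URL can be handed to a CrewAI agent via its `llm` parameter. The model name and the role/goal/backstory strings below are placeholders, and the import path assumes a recent LangChain split into `langchain_openai`.

```python
from langchain_openai import ChatOpenAI
from crewai import Agent

# Point an OpenAI-compatible client at LM Studio's local server.
# LM Studio ignores the API key, but it must be non-empty.
local_llm = ChatOpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",
    model="local-model",  # whichever model is loaded in LM Studio
)

# Hand the local LLM to an agent instead of the default OpenAI client.
researcher = Agent(
    role="Researcher",
    goal="Research a travel destination",
    backstory="An analyst who works fully offline.",
    llm=local_llm,
)
```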

timtensor commented 7 months ago

Thanks. Unfortunately I cannot run LM Studio at the moment.

Do you by any chance have a guide for how non-OpenAI models can be used?
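
On the non-OpenAI question generally: CrewAI accepts any LangChain-compatible LLM through the agent's `llm` parameter, so the Ollama wrapper works the same way as the LM Studio example above. A sketch, assuming a local Ollama server and a previously pulled `openhermes` model:

```python
from langchain_community.llms import Ollama
from crewai import Agent

# Talks to a local Ollama server on its default port (11434).
ollama_llm = Ollama(model="openhermes")

writer = Agent(
    role="Writer",
    goal="Draft a short travel itinerary",
    backstory="A creative writer running on a local model.",
    llm=ollama_llm,
)
```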

timtensor commented 7 months ago

> Yesterday I tried a demo example in the free version of Colab and ran into all kinds of problems (the trip advisor example using OpenHermes as the LLM with Ollama, since that was advised here in the GitHub issues section). At one point it seemed to be working, but then it complained that the API endpoint was already in use. I restarted the whole Colab notebook from scratch: same problem.
>
> At that point I stopped, not having a clue.
>
> Perhaps it will run in a local Jupyter notebook on my local PC, with the OpenHermes LLM being served by LM Studio? I will try it out next week or so!

Hi, did you have any luck running it locally?