litanlitudan / skyagi

SkyAGI: Emerging human-behavior simulation capability in LLM
https://skyagi.ai
Apache License 2.0

how to use open-source models? #52

Open basicmi opened 1 year ago

basicmi commented 1 year ago

Please add examples using local open-source models, like LLaMA or ChatGLM. Thanks!

yuyuan223 commented 1 year ago

Thanks for the suggestion. Just trying to understand your needs better: do you want local open-source models because of GPT API cost concerns, for privacy reasons, or something else?

litanlitudan commented 1 year ago

I think we should add this. It's an easy win and opens the door to many other efforts. @yuyuan223 @basicmi

basicmi commented 1 year ago

> Thanks for the suggestion. Just trying to understand your needs better: do you want local open-source models because of GPT API cost concerns, for privacy reasons, or something else?

More than that: I think we could run different models in the same game and compare them.

renmengjie7 commented 1 year ago

In the video I saw that open-source models can be selected, but when I ran skyagi that option was not available. Is it not supported yet?

gaocegege commented 1 year ago

We are working on support for local and cloud-hosted open-source LLMs, with the help of https://github.com/tensorchord/modelz-llm.

It should be available next week.

gaocegege commented 1 year ago

I successfully ran SkyAGI with lmsys/fastchat-t5-3b-v1.0 on my PC, without a GPU:

# Terminal 1: start the modelz-llm server (an OpenAI-compatible API) on CPU
modelz-llm -m lmsys/fastchat-t5-3b-v1.0 --device cpu
# Terminal 2: point skyagi at the local server instead of api.openai.com
export OPENAI_API_BASE=http://localhost:8000
export OPENAI_API_KEY="anystring"  # any placeholder value works here
skyagi
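
If you want to confirm the local endpoint is actually serving completions before launching skyagi, here is a minimal sanity check. It is only a sketch: it assumes the pre-1.0 `openai` Python package and that modelz-llm exposes the standard OpenAI chat-completions route at the base URL above.

# Minimal sanity check of the local modelz-llm endpoint (a sketch, not skyagi code).
# Assumes the pre-1.0 `openai` Python package (pip install "openai<1").
import openai

openai.api_base = "http://localhost:8000"  # same value as OPENAI_API_BASE above
openai.api_key = "anystring"               # placeholder; the local server is the target

resp = openai.ChatCompletion.create(
    model="lmsys/fastchat-t5-3b-v1.0",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(resp["choices"][0]["message"]["content"])

If this prints a reply, the server side is working and skyagi should be able to use it through the environment variables above.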

I have some workarounds to make it work.

gaocegege commented 1 year ago

I will write a doc for it.

j2l commented 1 year ago

Thank you @gaocegege for writing a doc, because so far I can't make it work:

pip install modelz-llm transformers sentencepiece accelerate
modelz-llm -m lmsys/fastchat-t5-3b-v1.0 --device cpu

works, but skyagi doesn't add fastchat-t5 to its model list.
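
One way to narrow this down is to query the local endpoint through LangChain, independent of skyagi's model picker. The snippet below is a troubleshooting sketch under assumptions: a langchain 0.0.x-era install, the modelz-llm server from the commands above still running on localhost:8000, and skyagi building on LangChain's ChatOpenAI client (so a working call here means the server side is fine).

# Hypothetical troubleshooting snippet, not skyagi code: talk to the local
# modelz-llm server through LangChain's ChatOpenAI to see whether the endpoint
# works even though skyagi's model list does not show fastchat-t5.
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = ChatOpenAI(
    model_name="lmsys/fastchat-t5-3b-v1.0",   # forwarded to the local server
    openai_api_base="http://localhost:8000",  # same as OPENAI_API_BASE above
    openai_api_key="anystring",               # placeholder value
    temperature=0,
)
print(llm([HumanMessage(content="Reply with one word: pong")]).content)

If this returns a response, the remaining issue is only that skyagi's interactive model list does not yet include locally served models.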