talhaanwarch / streamlit-llama

Streamlit chatbot with Llama-2-7B-chat
https://chatdemo.talhaanwar.com/

Questions... #1

Closed bbartling closed 2 days ago

bbartling commented 1 year ago

@talhaanwarch nice job on the app! Looks great.

talhaanwarch commented 1 year ago

It is using ctransformers. You can read how to use the GPU there. For your particular use case, I would suggest you look at this repo of mine. It's currently based on OpenAI, but I am working on using an open-source model. Instead of a Flask frontend, a Streamlit frontend can be used.
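For reference, loading a quantized Llama-2-7B-chat model with ctransformers and offloading layers to the GPU can be sketched roughly as below. The model repo name, `gpu_layers` value, and the prompt-formatting helper are illustrative assumptions, not taken from this project's code:

```python
# Sketch: loading a quantized Llama-2 chat model with ctransformers.
# Assumptions: model name and gpu_layers are illustrative, not the app's actual config.

def build_llama2_prompt(system_msg: str, user_msg: str) -> str:
    """Format a single-turn prompt using the standard Llama-2 chat template."""
    return f"[INST] <<SYS>>\n{system_msg}\n<</SYS>>\n\n{user_msg} [/INST]"

def load_model(gpu_layers: int = 50):
    """Download and load the quantized model (requires `pip install ctransformers`)."""
    from ctransformers import AutoModelForCausalLM
    return AutoModelForCausalLM.from_pretrained(
        "TheBloke/Llama-2-7B-Chat-GGML",  # hypothetical quantized weights repo
        model_type="llama",
        gpu_layers=gpu_layers,            # layers offloaded to GPU; 0 = CPU only
    )

prompt = build_llama2_prompt("You are a helpful assistant.", "Hello!")
```

With `gpu_layers=0` the model runs entirely on CPU, which is the slower but dependency-free path.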

bbartling commented 1 year ago

Nice work! How did you set up the demo of your app? Is that hosted on GitHub?

Very cool.


talhaanwarch commented 1 year ago

It's hosted on a VPS, though the model is hosted locally.

bbartling commented 1 year ago

Any chance you could send me a link for that hosting service?

Would you have any advice on choosing between llama-cpp-python and ctransformers for better results?

talhaanwarch commented 1 year ago

The hosting service plays no role here; the model is hosted locally. I don't think there will be much difference between the two.
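For what it's worth, the two libraries expose very similar loading APIs for quantized Llama models, so switching between them is largely a one-line change. A minimal hedged sketch, where the paths and layer counts are illustrative and neither function is invoked here (both need downloaded weights):

```python
# Sketch comparing the two loaders; model paths and parameters are illustrative.
# Neither function is called at import time, since both require model weights on disk.

def load_with_ctransformers(model_path: str):
    """Load a quantized Llama model via ctransformers."""
    from ctransformers import AutoModelForCausalLM
    # gpu_layers controls how many transformer layers are offloaded to the GPU
    return AutoModelForCausalLM.from_pretrained(
        model_path, model_type="llama", gpu_layers=50
    )

def load_with_llama_cpp(model_path: str):
    """Load the same kind of model via llama-cpp-python."""
    from llama_cpp import Llama
    # n_gpu_layers is the llama-cpp-python equivalent of gpu_layers
    return Llama(model_path=model_path, n_gpu_layers=50)
```

Both wrap the same underlying llama.cpp-style inference, which is consistent with the expectation that results won't differ much between them.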

bbartling commented 1 year ago

What is the VPS where you say it's hosted? I can't find that vendor.

talhaanwarch commented 1 year ago

The Llama model is hosted locally on my PC.