linux-leo closed this issue 1 year ago
Implemented. Please wait for me to publish a new version.
pip3 install hugchat --upgrade
You can invoke switch_llm(1) to change the model to meta-llama/Llama-2-70b-chat-hf, or switch_llm(0) to switch to OpenAssistant/oasst-sft-6-llama-30b.
Details:
switch_llm(self, to: int) -> bool:
'''
Attempts to change the current conversation's Large Language Model.
Requires an index indicating the model you want to switch to.
For now, 0 is `OpenAssistant/oasst-sft-6-llama-30b-xor` and 1 is `meta-llama/Llama-2-70b-chat-hf` :)
* llm 1 is the latest LLM.
* REMEMBER: For flexibility, the effect of a switch is limited to the *current conversation*. You must manually switch the LLM again when you change conversations.
'''
Can anyone give an example of how to switch LLM? In Inference Code
Hi! Here is some code that may help you.
from hugchat import hugchat
from hugchat.login import Login

# Log in to Hugging Face and grant authorization to HuggingChat
email = "your@email.com"    # your Hugging Face account email
passwd = "your_password"    # your Hugging Face password
sign = Login(email, passwd)
cookies = sign.login()

# Save cookies to a local directory
cookie_path_dir = "./cookies_snapshot"
sign.saveCookiesToDir(cookie_path_dir)

# Create a ChatBot
chatbot = hugchat.ChatBot(cookies=cookies.get_dict())  # or cookie_path="usercookies/<email>.json"
print(chatbot.chat("Hello"))

# Create a new conversation and switch to it
conversation_id = chatbot.new_conversation()
chatbot.change_conversation(conversation_id)

# Switch the LLM for the current conversation
chatbot.switch_llm(0)  # switch to OpenAssistant/oasst-sft-6-llama-30b-xor
chatbot.switch_llm(1)  # switch to meta-llama/Llama-2-70b-chat-hf
Can someone help with providing the mapping of the models currently available in HuggingChat, so that I can use switch_llm() to query them?
- meta-llama/Llama-2-70b-chat-hf
- codellama/CodeLlama-34b-Instruct-hf
- tiiuae/falcon-180B-chat
- mistralai/Mistral-7B-Instruct-v0.2
- openchat/openchat_3.5
- mistralai/Mixtral-8x7B-Instruct-v0.1
You can use chatbot.get_available_llm_models() to get a list of the available models.
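Since the index-to-model mapping shifts whenever HuggingChat adds or removes models, it can be safer to look the index up at runtime instead of hard-coding numbers. A minimal sketch: `find_model_index` is a hypothetical helper (not part of hugchat) that searches a list of model names for a fragment. The model list below is a snapshot of the names mentioned in this thread; on a live bot you would fetch it with `chatbot.get_available_llm_models()`.

```python
# Hypothetical helper (not part of hugchat): find the switch_llm index
# for a model by (partial) name in a list of available model names.
def find_model_index(models, name_fragment):
    for index, model in enumerate(models):
        # str(model) covers both plain strings and model objects
        if name_fragment.lower() in str(model).lower():
            return index
    raise ValueError(f"No model matching {name_fragment!r}")

# With a live ChatBot you would do something like:
#   models = chatbot.get_available_llm_models()
#   chatbot.switch_llm(find_model_index(models, "Mistral-7B"))

# Snapshot of the model names mentioned in this thread:
models = [
    "meta-llama/Llama-2-70b-chat-hf",
    "codellama/CodeLlama-34b-Instruct-hf",
    "tiiuae/falcon-180B-chat",
    "mistralai/Mistral-7B-Instruct-v0.2",
    "openchat/openchat_3.5",
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
]
print(find_model_index(models, "Mixtral-8x7B"))  # → 5
```

Matching by name fragment keeps the code working when the order of the list changes between library versions.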
I also have the same issue with chatbot.switch_llm(). I want to change the model from 'meta-llama/Llama-2-70b-chat-hf' to 'mistralai/Mistral-7B-Instruct-v0.2', so I use chatbot.switch_llm(3), but it is still 'meta-llama/Llama-2-70b-chat-hf'. Any suggestions?
Try creating a new conversation with chatbot.new_conversation() after you switch to the Mistral model. You can also initialize the chatbot with a specific model like this: chatbot = hugchat.ChatBot(cookie_path="...", default_llm=MODEL_INDEX)
Thanks for your response. When I use chatbot = hugchat.ChatBot(cookie_path="...", default_llm=3), or create a new conversation after switching models, it still shows the error below. Any suggestions?
I'm not getting the same error. Could you please share a copy of your code?
Thanks for your response. You can see the picture below:
After some testing, it seems that the error comes from the fact that we are trying to create a conversation with the model mistralai/Mistral-7B-Instruct-v0.1, which doesn't show up in the HuggingChat GUI. I'll open a new issue on this topic (#148).
You should still be able to use the other models by changing the default_llm index to something other than 3.
Thank you for your help.
The ability to select between meta-llama/Llama-2-70b-chat-hf and OpenAssistant/oasst-sft-6-llama-30b would be nice.