Soulter / hugging-chat-api

HuggingChat Python API🤗

How to choose Llama 3 70B as the model? Index 1 or 2 only gives CohereForAI/c4ai-command-r-plus #204

Closed vuemquan closed 4 months ago

vuemquan commented 6 months ago

The model list I get is:

{
  "models": [
    {
      "datasetName": null, "datasetUrl": null,
      "description": "Command R+ is Cohere's latest LLM and is the first open weight model to beat GPT4 in the Chatbot Arena!",
      "displayName": "CohereForAI/c4ai-command-r-plus", "id": "CohereForAI/c4ai-command-r-plus",
      "modelUrl": "https://huggingface.co/CohereForAI/c4ai-command-r-plus", "name": "CohereForAI/c4ai-command-r-plus",
      "parameters": { "max_new_tokens": 4096, "stop": [ 91 ], "stop_sequences": [ 91 ], "temperature": 0.3, "truncate": 28672 },
      "preprompt": "",
      "promptExamples": [ { "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)", "title": "Write an email from bullet list" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://docs.cohere.com/docs/command-r-plus"
    },
    {
      "datasetName": null, "datasetUrl": null,
      "description": "Generation over generation, Meta Llama 3 demonstrates state-of-the-art performance on a wide range of industry benchmarks and offers new capabilities, including improved reasoning.",
      "displayName": "meta-llama/Meta-Llama-3-70B-Instruct", "id": "meta-llama/Meta-Llama-3-70B-Instruct",
      "modelUrl": "https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct", "name": "meta-llama/Meta-Llama-3-70B-Instruct",
      "parameters": { "max_new_tokens": 2047, "stop": [ 106 ], "stop_sequences": [ 106 ], "truncate": 6144 },
      "preprompt": "",
      "promptExamples": [ { "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)", "title": "Write an email from bullet list" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://llama.meta.com/llama3/"
    },
    {
      "datasetName": null, "datasetUrl": null,
      "description": "Zephyr 141B-A35B is a fine-tuned version of Mistral 8x22B, trained using ORPO, a novel alignment algorithm.",
      "displayName": "HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1", "id": "HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1",
      "modelUrl": "https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1", "name": "HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1",
      "parameters": { "max_new_tokens": 8192, "stop_sequences": null, "truncate": 24576 },
      "preprompt": "You are Zephyr, an assistant developed by KAIST AI, Argilla, and Hugging Face. You should give concise responses to very simple questions, but provide thorough responses to more complex and open-ended questions. You are happy to help with writing, analysis, question answering, math, coding, and all sorts of other tasks.",
      "promptExamples": [ { "prompt": "Write a poem to help me remember the first 10 elements on the periodic table, giving each element its own line.", "title": "Write a poem" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1"
    },
    {
      "datasetName": null, "datasetUrl": null,
      "description": "The latest MoE model from Mistral AI! 8x7B and outperforms Llama 2 70B in most benchmarks.",
      "displayName": "mistralai/Mixtral-8x7B-Instruct-v0.1", "id": "mistralai/Mixtral-8x7B-Instruct-v0.1",
      "modelUrl": "https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1", "name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
      "parameters": { "max_new_tokens": 8192, "repetition_penalty": 1.2, "stop": [ 134 ], "stop_sequences": [ 134 ], "temperature": 0.6, "top_k": 50, "top_p": 0.95, "truncate": 24576 },
      "preprompt": "",
      "promptExamples": [ { "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)", "title": "Write an email from bullet list" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://mistral.ai/news/mixtral-of-experts/"
    },
    {
      "datasetName": null, "datasetUrl": null,
      "description": "Nous Hermes 2 Mixtral 8x7B DPO is the new flagship Nous Research model trained over the Mixtral 8x7B MoE LLM.",
      "displayName": "NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO", "id": "NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO",
      "modelUrl": "https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO", "name": "NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO",
      "parameters": { "max_new_tokens": 2048, "repetition_penalty": 1, "stop": [ 152 ], "stop_sequences": [ 152 ], "temperature": 0.7, "top_k": 50, "top_p": 0.95, "truncate": 24576 },
      "preprompt": "",
      "promptExamples": [ { "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)", "title": "Write an email from bullet list" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://nousresearch.com/"
    },
    {
      "datasetName": null, "datasetUrl": null,
      "description": "Gemma 7B 1.1 is the latest release in the Gemma family of lightweight models built by Google, trained using a novel RLHF method.",
      "displayName": "google/gemma-1.1-7b-it", "id": "google/gemma-1.1-7b-it",
      "modelUrl": "https://huggingface.co/google/gemma-1.1-7b-it", "name": "google/gemma-1.1-7b-it",
      "parameters": { "do_sample": true, "max_new_tokens": 1024, "stop": [ 168 ], "stop_sequences": [ 168 ], "truncate": 7168 },
      "preprompt": "",
      "promptExamples": [ { "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)", "title": "Write an email from bullet list" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://blog.google/technology/developers/gemma-open-models/"
    },
    {
      "datasetName": null, "datasetUrl": null,
      "description": "Mistral 7B is a new Apache 2.0 model, released by Mistral AI that outperforms Llama2 13B in benchmarks.",
      "displayName": "mistralai/Mistral-7B-Instruct-v0.2", "id": "mistralai/Mistral-7B-Instruct-v0.2",
      "modelUrl": "https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2", "name": "mistralai/Mistral-7B-Instruct-v0.2",
      "parameters": { "max_new_tokens": 1024, "repetition_penalty": 1.2, "stop": [ 134 ], "stop_sequences": [ 134 ], "temperature": 0.3, "top_k": 50, "top_p": 0.95, "truncate": 3072 },
      "preprompt": "",
      "promptExamples": [ { "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)", "title": "Write an email from bullet list" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://mistral.ai/news/announcing-mistral-7b/"
    },
    {
      "datasetName": null, "datasetUrl": null,
      "description": "Phi-3 Mini-4K-Instruct is a 3.8B parameters, lightweight, state-of-the-art open model built upon datasets used for Phi-2.",
      "displayName": "microsoft/Phi-3-mini-4k-instruct", "id": "microsoft/Phi-3-mini-4k-instruct",
      "modelUrl": "https://huggingface.co/microsoft/Phi-3-mini-4k-instruct", "name": "microsoft/Phi-3-mini-4k-instruct",
      "parameters": { "max_new_tokens": 1024, "stop": [ 194, 195, 196 ], "stop_sequences": [ 194, 195, 196 ], "truncate": 3071 },
      "preprompt": "",
      "promptExamples": [ { "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)", "title": "Write an email from bullet list" }, { "prompt": "Code a basic snake game in python, give explanations for each step.", "title": "Code a snake game" }, { "prompt": "How do I make a delicious lemon cheesecake?", "title": "Assist in a task" } ],
      "websiteUrl": "https://azure.microsoft.com/en-us/blog/introducing-phi-3-redefining-whats-possible-with-slms/"
    }
  ]
}

Soulter commented 6 months ago

Hello, currently index 1 is Llama 3. If you want to use this model in your conversation, use the following code:

chatbot.new_conversation(1, switch_to=True)

I tried it and it works well.

[screenshot: model index list]
0. CohereForAI/c4ai-command-r-plus
1. meta-llama/Meta-Llama-3-70B-Instruct
2. HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1
3. mistralai/Mixtral-8x7B-Instruct-v0.1
4. NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
5. google/gemma-1.1-7b-it
6. mistralai/Mistral-7B-Instruct-v0.2
7. microsoft/Phi-3-mini-4k-instruct
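
For later readers, a minimal sketch (not from the thread): it assumes a logged-in hugchat ChatBot instance named chatbot and that get_remote_llms() returns objects exposing an id attribute, as the JSON above suggests. It resolves the Llama 3 index by model id instead of hard-coding 1, since the index can shift whenever HuggingChat reorders its model list.

```python
# Sketch: select meta-llama/Meta-Llama-3-70B-Instruct by id rather than by position.
# Assumes `chatbot` is an already logged-in hugchat.ChatBot instance.
models = chatbot.get_remote_llms()   # remote model list, in the same order as the indices above
for i, m in enumerate(models):
    print(i, m.id)                   # e.g. "1 meta-llama/Meta-Llama-3-70B-Instruct"

llama3_index = next(
    i for i, m in enumerate(models) if "Meta-Llama-3-70B-Instruct" in m.id
)
chatbot.new_conversation(llama3_index, switch_to=True)  # same call as above, index resolved by name
```
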
raffieeey commented 6 months ago

Hi @Soulter, thank you for the solution, it works. But is it possible to just switch to the model instead of creating a new conversation? Sometimes it ends up creating multiple new chats.
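
One possible workaround, sketched here as an assumption rather than a confirmed answer: create the Llama 3 conversation once, keep the handle that new_conversation() returns, and switch back to it with change_conversation() instead of opening a new chat each time. change_conversation() and get_conversation_list() are described in the hugchat README, but their exact argument and return types vary between releases, so verify against the version you have installed.

```python
# Sketch: reuse one Llama 3 conversation instead of creating a new chat every time.
# Assumes a logged-in hugchat.ChatBot instance `chatbot`; helper names follow the hugchat README,
# and the exact type returned by new_conversation() may differ between hugchat versions.
llama3_conv = chatbot.new_conversation(1, switch_to=True)  # create once (index 1 = Llama 3, see the list above)

# ... later, after working in other conversations ...
chatbot.change_conversation(llama3_conv)  # switch back instead of creating yet another chat
```
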

github-actions[bot] commented 4 months ago

This issue was marked as stale because of inactivity.

github-actions[bot] commented 4 months ago

This issue was closed because of inactivity.