Closed · Faolain closed this 6 months ago
Thanks for asking, this is an interesting use case. I believe you can use OpenAI-style prompts (and just use the original llama3 conv template) via:
````python
user_prompt = f"""
Generate a SQL query to answer this question: `{user_question}`
{instructions}
DDL statements:
{create_table_statements}
"""

# Partial assistant message: hints the model to continue with the SQL body.
assistant_prompt = f"""
The following SQL query best answers the question `{user_question}`:
```sql
"""

engine.chat.completions.create(
    messages=[
        {"role": "system", "content": ""},
        {"role": "user", "content": user_prompt},
        {"role": "assistant", "content": assistant_prompt},
    ]
)
````
"""
We do need to confirm that our chat template supports a partial assistant prompt hint (which it may not yet, but that would not be hard to add). After that is added, this should work.
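To make the partial hint concrete: with the llama3 format, a trailing assistant message would be appended without a closing `<|eot_id|>`, so the model continues it instead of opening a new turn. A rough sketch of the rendering (the real template lives in conversation_template.py; this is just to show the idea):

```python
# Sketch of llama3-style rendering with a partial assistant hint.
def render_llama3(messages):
    prompt = "<|begin_of_text|>"
    for i, m in enumerate(messages):
        prompt += f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n{m['content']}"
        # Every turn is closed with <|eot_id|> EXCEPT a trailing assistant
        # message, which is left open so generation continues from it.
        if not (i == len(messages) - 1 and m["role"] == "assistant"):
            prompt += "<|eot_id|>"
    return prompt
```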
Thanks @tqchen, really appreciate it! Will await confirmation regarding the partial assistant prompt 🙏
Edit: It appears to work as is :)
Thanks @Faolain for confirming, glad it works.
❓ General Questions
I have converted https://huggingface.co/defog/llama-3-sqlcoder-8b (based on llama3, as the name indicates) following the instructions here: https://llm.mlc.ai/docs/deploy/javascript.html#bring-your-own-model-variant. However, I am now at the step of generating the MLC Chat Config, and the system prompt is more involved than the default llama3 prompt, as it requires three inputs to be passed by the user, as indicated here.
My question is: what is the best way to create a new conversation template, based on the ones in https://github.com/mlc-ai/mlc-llm/blob/main/python/mlc_llm/conversation_template.py, that allows passing more than one user input (the question, additional optional instructions, and the create_table_statements)? Additionally, how would one call the model with these additional user-provided inputs within web-llm?
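For reference, I imagine registering something like the following in conversation_template.py, modeled on the existing llama3 entry (untested; the module paths and field names are taken from that file, and the system message is just a placeholder, since templates appear to support only a single system string, so the three inputs would presumably still have to be formatted into one user message):

```python
from mlc_llm.conversation_template import ConvTemplateRegistry
from mlc_llm.protocol.conversation_protocol import Conversation, MessagePlaceholders

ConvTemplateRegistry.register_conv_template(
    Conversation(
        name="llama-3-sqlcoder",
        system_template=(
            "<|start_header_id|>system<|end_header_id|>\n\n"
            f"{MessagePlaceholders.SYSTEM.value}<|eot_id|>"
        ),
        # Placeholder only: the question, instructions, and DDL would still
        # be string-formatted into the user turn, not the system message.
        system_message="Generate a SQL query to answer the user's question.",
        roles={
            "user": "<|start_header_id|>user",
            "assistant": "<|start_header_id|>assistant",
        },
        seps=["<|eot_id|>"],
        role_content_sep="<|end_header_id|>\n\n",
        role_empty_sep="<|end_header_id|>\n\n",
        stop_str=["<|end_of_text|>", "<|eot_id|>"],
        stop_token_ids=[128001, 128009],
        system_prefix_token_ids=[128000],
    )
)
```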