mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0
17.95k stars · 1.43k forks

[Question] How to generate conversational template with more than one input #2383

Closed Faolain closed 2 months ago

Faolain commented 2 months ago

❓ General Questions

I have converted https://huggingface.co/defog/llama-3-sqlcoder-8b (based on Llama 3, as the name indicates) following the instructions at https://llm.mlc.ai/docs/deploy/javascript.html#bring-your-own-model-variant. However, I am now at the step of generating the MLC Chat Config, and the system prompt is more involved than the default Llama 3 prompt: it requires three inputs to be supplied by the user, as indicated here.

````
<|begin_of_text|><|start_header_id|>user<|end_header_id|>

Generate a SQL query to answer this question: `{user_question}`
{instructions}

DDL statements:
{create_table_statements}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

The following SQL query best answers the question `{user_question}`:
```sql
````
My question is: what is the best way to create a new conversation template, based on the ones in https://github.com/mlc-ai/mlc-llm/blob/main/python/mlc_llm/conversation_template.py, that allows the user to pass more than one input (the question, optional additional instructions, and the `create_table_statements`)? Additionally, how would one call it with these additional user-provided inputs within web-llm?

tqchen commented 2 months ago

Thanks for asking, this is an interesting use case. I believe you can use OpenAI-style prompts (and just keep the original llama3 conv template) via:

````python
user_prompt = f"""
Generate a SQL query to answer this question: `{user_question}`
{instructions}

DDL statements:
{create_table_statements}
"""

assistant_prompt = f"""
The following SQL query best answers the question `{user_question}`:
```sql
"""

engine.chat.completions.create(
    messages=[
        {"role": "system", "content": ""},
        {"role": "user", "content": user_prompt},
        {"role": "assistant", "content": assistant_prompt},
    ]
)
````

We do need to confirm that our chat template supports a partial assistant prompt hint (which we may not yet, but it would not be hard to add). Once that is added, this should work.
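For concreteness, the approach above could be wrapped in a small helper that folds the three user-supplied fields into an OpenAI-style `messages` list. This is only a sketch: the function name `build_sqlcoder_messages` is hypothetical (not part of mlc-llm or web-llm), and only the `engine.chat.completions.create` call comes from the snippet above.

```python
def build_sqlcoder_messages(user_question, create_table_statements, instructions=""):
    """Hypothetical helper: assemble OpenAI-style messages for
    defog/llama-3-sqlcoder-8b from the three user-provided fields,
    including the partial assistant prompt hint."""
    user_prompt = (
        f"Generate a SQL query to answer this question: `{user_question}`\n"
        f"{instructions}\n"
        "\n"
        f"DDL statements:\n{create_table_statements}\n"
    )
    assistant_prompt = (
        f"The following SQL query best answers the question `{user_question}`:\n"
        "```sql\n"
    )
    return [
        {"role": "system", "content": ""},
        {"role": "user", "content": user_prompt},
        {"role": "assistant", "content": assistant_prompt},
    ]

# Usage (engine construction omitted; works the same from the Python
# MLCEngine or, translated to JS objects, from web-llm):
# engine.chat.completions.create(
#     messages=build_sqlcoder_messages(
#         "How many users signed up last week?",
#         "CREATE TABLE users (id INT, created_at DATE);",
#     )
# )
```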

Faolain commented 2 months ago

Thanks @tqchen really appreciate it! Will await confirmation regarding partial assistant prompt 🙏

Edit: It appears to work as is :)

tqchen commented 2 months ago

thanks @Faolain for confirming, glad it works