Hi, the model_lib value has the format {model_type}_{quantization}. You can find both of these fields inside the mlc-chat-config.json file in the model directory. You also need a matching entry in model_list for each model lib key inside model_lib_path_for_prepare_libs. Please refer to the base app-config.json file for reference. You can also check out the docs here.
@LumenScopeAI what value should I set for the --conv-template parameter when using Yi-6B-Chat?
I used these JSON files but ran into some problems, so how should I set model_lib? The problems I hit are the same as in https://github.com/mlc-ai/mlc-llm/issues/1517.