FMInference / FlexLLMGen

Running large language models on a single GPU for throughput-oriented scenarios.
Apache License 2.0

Error while splitting the model name #133

Open neomi-tenenbaum-huawei opened 8 months ago

neomi-tenenbaum-huawei commented 8 months ago

In `flexgen/opt_config.py`, in the function `get_opt_config` (around line 54), the line `name = name.split("/")[1]` breaks when the model name contains more than one slash. For example, if the model name is "folder_name/folder_name1/opt-125m", the code returns "folder_name1" instead of "opt-125m". The line should be changed to `name = name.split("/")[-1]`.
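
A minimal sketch of the suggested fix, using a hypothetical standalone helper rather than the actual `get_opt_config` body, to show why the last split component is the right choice:

```python
def get_model_basename(name: str) -> str:
    """Return the last path component of a model name.

    Hypothetical helper illustrating the reported fix: indexing with [-1]
    handles model names with any number of "/" separators, while the
    original [1] only works for names with exactly one slash.
    """
    # Original code: name.split("/")[1] -> "folder_name1" for
    # "folder_name/folder_name1/opt-125m", which is wrong.
    # Suggested fix: take the final component instead.
    return name.split("/")[-1]


# Usage:
print(get_model_basename("facebook/opt-125m"))                  # opt-125m
print(get_model_basename("folder_name/folder_name1/opt-125m"))  # opt-125m
```

With `[-1]`, names that already contain a single slash (such as "facebook/opt-125m") still resolve to the same result, so the change is backward compatible with the existing behavior.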