turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

[REQUEST] create the output directory during the quantization process #653

Closed Nexesenex closed 6 days ago

Nexesenex commented 1 week ago

Problem

The output directory is not created automatically on Windows 11, which crashes the quantization process.

Example:

Q:\GitHub\exllamav2.fks>convert.py -i E:\text-generation-webui\models\mlx-community_Mistral-Large-Instruct-2407-bf16 -o D:\text-generation-webui\models\Mistral-Large-Instruct-2407-EXO2 -cf X:\text-generation-webui\models\Mistral-Large-Instruct-2407-3.9bpw-h6-exl2 -m Q:\measurements\Mistral_Large_DbMe_measurements.json -b 3.9 -hb 6

Error: Directory not found: D:\text-generation-webui\models\Mistral-Large-Instruct-2407-EXO2

Solution

Unless I'm mistaken, could the script create the directory instead, please?
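A minimal sketch of what the fix could look like: create each output directory (and any missing parents) before quantization starts. The helper name `ensure_output_dirs` is hypothetical, not the actual convert.py code.

```python
import os

def ensure_output_dirs(*paths):
    # Hypothetical helper: create each output directory (and parents)
    # if missing; exist_ok=True avoids an error when it already exists.
    for path in paths:
        if path:
            os.makedirs(path, exist_ok=True)

# e.g. call with the -o and -cf arguments before quantization starts
ensure_output_dirs("out/work_dir", "out/compiled_model")
```

The call is idempotent, so it is safe to run even when the directories were created manually beforehand.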

Alternatives

No response

Explanation

Simple convenience: I use several output directories, so it would save creating each one manually.

Examples

No response

Additional context

No response

Acknowledgements

iamwavecut commented 1 week ago

I encounter this a lot too, so I proposed this little PR to resolve the issue.

turboderp commented 6 days ago

Merged into dev now. :+1: