Open · kdsull opened this issue 1 year ago
Hi @kdsull, you're welcome, glad you found the project useful.
Thank you for reporting this issue, I didn't know Windows does not support color codes! I wanted the package to be dependency-free, which is why I didn't want to pull in another package just for coloring.
To fix this, just go to this line and remove the colors. Or you can use WSL on Windows, it should work I guess.
If you run into any issues doing that, I could add another option to the CLI for you to disable colors?
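In the meantime, a generic workaround is to strip the ANSI codes before printing when the terminal can't render them. This is just a sketch, not pyllamacpp's actual code, and the helper names are mine:

```python
import os
import re
import sys

# Matches ANSI CSI color/style sequences such as "\x1b[96m" and "\x1b[0m".
ANSI_RE = re.compile(r"\x1b\[[0-9;]*m")

def strip_ansi(text: str) -> str:
    """Remove ANSI color codes from a string."""
    return ANSI_RE.sub("", text)

def supports_color(stream=sys.stdout) -> bool:
    """Rough heuristic: colors only on a TTY, and not on a plain Windows console."""
    return stream.isatty() and os.name != "nt"

colored = "\x1b[94mAI: \x1b[0m\x1b[96mZen Buddhism is a branch of Mahayana Buddhism.\x1b[0m"
print(colored if supports_color() else strip_ansi(colored))
```

Newer Windows 10+ terminals can also be switched into VT-processing mode so the codes render natively, but the TTY check above is the simplest portable fix.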
Aha.. Yes, WSL seems to be the right answer. I need to use Docker anyway, so I installed WSL2, and it works fine. The tricky part was finding the model files on the Windows file system, but '/mnt/e/models/Wizard~' did the trick, along with allocating more memory to WSL by creating a '.wslconfig' file with the 'memory=26GB' option. Out of curiosity, have you heard about 'ggcc' for the Falcon 40B model, and any idea how to apply it? I'm addicted to pyllamacpp and want no other implementation... I mean, it's so easy, clean and robust!! Many thanks and Cheers!
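For reference, the '.wslconfig' tweak mentioned above lives at `%UserProfile%\.wslconfig` on the Windows side and looks like this (26GB is just the value used here; size it to your machine):

```ini
[wsl2]
memory=26GB
```

WSL needs a restart (`wsl --shutdown` from Windows) for the setting to take effect.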
I only use Linux, so to be honest I don't know the details of how things work on Windows :sweat_smile: .. but glad it worked; WSL should be a Linux machine at the end of the day.
About 'ggcc', are you talking about this?
I didn't know about it until now. It seems really interesting, as llama.cpp does not support Falcon yet, AFAIK.
I wish I could integrate it into pyllamacpp, but I think I might run into several conflicts, as both llama.cpp and ggllm.cpp depend on a modified version of ggml.
But I will look into it when I have some time.
Many thanks to you as well :), @kdsull
Yup, looks like 'ggllm.cpp' is the one. I heard about it from TheBloke's card on Hugging Face, whose huge ggml works keep me busy running them on your pyllamacpp! I wish you could find some time to implement it, because this Falcon stuff seems pretty hot recently. In the meantime, I'll have to try running ggllm.cpp and study your code. Many, many domos again!
@kdsull, as you are really interested in using it, I got some time and decided to implement it for you. Hope it will be useful.
Luckily it is a fork of llama.cpp, so I made a unified backend around both models, and you can run ggcc models as well as other ggml models with the same old pyllamacpp API (I tried it with Falcon and Wizard-Vicuna from TheBloke, and they both work).
You can find the implementation in the ggllm branch.
Clone and install it, or just go with pip install git+..
(if you are on Windows, don't forget to use WSL :))
Give it a try and let me know how it goes?
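To show the idea behind a unified backend (this is not the branch's actual code): ggml-family files start with a 4-byte magic stored as a little-endian uint32, where 'ggml' and 'ggjt' are real llama.cpp magics, and the 'ggcc' value below is my assumption about how ggllm.cpp tags its Falcon files. A loader could peek at the magic and pick a backend:

```python
import os
import struct
import tempfile

# 'ggml' and 'ggjt' are real llama.cpp file magics (as little-endian uint32);
# 'ggcc' here is an assumption about the ggllm.cpp Falcon format -- check the
# actual file headers before relying on it.
MAGICS = {
    0x67676D6C: "ggml backend",  # b'ggml' read as LE uint32
    0x67676A74: "ggml backend",  # b'ggjt' read as LE uint32
    0x67676363: "ggcc backend",  # b'ggcc' read as LE uint32 (assumed)
}

def detect_backend(path: str) -> str:
    """Peek at the first 4 bytes of a model file and map the magic to a backend."""
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))
    return MAGICS.get(magic, "unknown")

# Demo: write a fake 4-byte header and detect it.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(struct.pack("<I", 0x67676363))  # fake 'ggcc' header
    fake_model = tmp.name
print(detect_backend(fake_model))  # -> "ggcc backend"
os.unlink(fake_model)
```

Dispatching on the file header like this is what lets one API front both model families without the caller having to say which format a file is in.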
Hi, many thanks for the great work. pyllamacpp is my favorite llama.cpp binding, and I love using it on my Mac. But on Windows, the color codes don't seem to work; the output shows raw escape sequences, like "You: What's the point of Zen Buddhism ? [94mAI: [0m[96mZ[0m[96men[0m[96m Buddh[0m[96mism[0m[96m is[0m[96m a[0m[96m branch[0m[96m of[0m[96m Mah[0m[96may[0m[96mana[0m[96m Buddh[0m[96mism[0m[...]". Any idea how to fix this, please?