Closed: jonathonbarton closed this issue 8 months ago
There's a .bat file included for installing llama-cpp-python correctly on your Windows install. If you're on Linux and want to use llama-cpp-python, you can find the install instructions here: https://github.com/abetlen/llama-cpp-python
However, I'll add a fix so that it isn't required unless you're actually trying to use llama-cpp-python. Thanks for the heads up!
Edit: https://github.com/art-from-the-machine/Mantella/commit/4f1d1de797a27d67fcf84b2cfdf91aca49393983 Let me know if this works. <3
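For what it's worth, a common pattern for making a dependency like this optional is to guard the import and only register the backend when it succeeds. A minimal sketch under that assumption (the `LLM_Types` name is borrowed from the traceback in this thread; the commit may well do this differently):

```python
# Hypothetical sketch: make llama-cpp-python optional by guarding the
# import and only registering the backend when the import succeeds.
LLM_Types = {}

try:
    import llama_cpp  # only needed when running a local GGUF model
    LLM_Types["llama_cpp"] = llama_cpp
except ImportError:
    # llama-cpp-python isn't installed: simply don't register the backend
    pass

# The openai backend has no optional dependency, so it is always registered
# and can safely serve as the default.
LLM_Types["openai"] = "openai-backend-placeholder"
LLM_Types["default"] = LLM_Types["openai"]
```

With this shape, a machine without llama-cpp-python still starts up; only selecting the `llama_cpp` engine would need the extra install.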
Both batch files open and close instantaneously. I made the code skip that by adding to the beginning of it (so it masquerades as an init file or whatever), expecting the openai engine to get picked up... and that didn't work either; it errored out in the same way. So I __'ed the openai version to JUST leave the default... and that gave me a slightly different error message:
```
File "C:\Projects\MantellaPathos\main.py", line 2, in <module>
    import src.conversation_manager as cm
File "C:\Projects\MantellaPathos\src\conversation_manager.py", line 11, in <module>
    import src.language_model as language_models
File "C:\Projects\MantellaPathos\src\language_model.py", line 17, in <module>
    LLM_Types["default"] = LLM_Types[default]
```
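That last line is where it dies: `LLM_Types[default]` raises a `KeyError` when the name stored in `default` was never registered, e.g. because its optional dependency didn't import or the default reference was edited. A hedged sketch of a friendlier lookup (`resolve_default` is my name for illustration, not Mantella's):

```python
def resolve_default(registry, default_name):
    """Look up the default inference engine, failing with a clear message
    when the requested backend was never registered (for example because
    its optional dependency is missing, or config.ini names an unknown
    engine)."""
    if default_name not in registry:
        raise KeyError(
            f"inference engine {default_name!r} is not registered; "
            f"available engines: {sorted(registry)}"
        )
    return registry[default_name]

# A registry where only the openai backend was registered:
backends = {"openai": "openai-backend"}
print(resolve_default(backends, "openai"))
```

Asking it for a missing engine like `"llama_cpp"` would then name the problem and list what is available, instead of a bare `KeyError`.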
And finally... w/r/t llama_cpp and the batch files to 'correctly install' llama_cpp - I already have a local LLM that suits me just fine; I don't feel like I need another one. :-)
The openai inference engine is the default; if you changed the reference, it won't find anything to default to.
Other settings from the config.ini that I see...
It appears (at first glance) that even when inference_engine is default or openai, llama_cpp is still expected. I took the (somewhat) obvious steps: I tried to just pip install llama_cpp on my machine (not found), and I tried the two batch files in the main directory. They just open and close immediately.
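One note on the pip attempt: the PyPI package is named llama-cpp-python, while the module it installs is imported as llama_cpp, so `pip install llama_cpp` coming back "not found" is expected. A small way to check whether the module is actually importable, without triggering the import itself (`has_llama_cpp` is my helper name, not Mantella's):

```python
import importlib.util

def has_llama_cpp() -> bool:
    """Return True when llama-cpp-python is installed; the module it
    provides is importable as llama_cpp, even though the PyPI package
    name is llama-cpp-python."""
    return importlib.util.find_spec("llama_cpp") is not None

print("llama_cpp available:", has_llama_cpp())
```

`find_spec` just consults the import machinery, so this works whether or not the package is present.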