lastrosade opened this issue 5 months ago (Open)
Sure, I'll try to get it running today; should be easy. I added Anthropic as a worldsim.main init var, and I'll do the same for this. It will probably land in beta first. I can probably backport to main, though it might take a bit to get to that. I may merge beta into main at the same time, since it's looking pretty solid.
worldsim.main(W)
Oh wow, thanks a ton!
Well, the initial round trip was trivial, but the text quality isn't good. I'll keep looking into it later today. I suspect it's a problem with chat-template formatting.
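For context on the chat-template suspicion: Llama 3 instruct models expect each turn wrapped in special header tokens, and feeding an unformatted prompt to a raw completion endpoint degrades output badly. A minimal sketch of building that prompt by hand (token names follow the published Llama 3 template; verify against the model's own `tokenizer_config.json` before relying on this):

```python
def format_llama3_chat(messages):
    """Build a Llama 3 instruct prompt from [{'role': ..., 'content': ...}] turns.

    Uses the Llama 3 special tokens; other model families use different
    templates, which is exactly why untemplated prompts produce poor text.
    """
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    # Open an assistant header so the model knows to respond next.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = format_llama3_chat([
    {"role": "system", "content": "You are a world simulator."},
    {"role": "user", "content": "Begin."},
])
```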
Pushed a commit with some llama.cpp code. It works, but at least with Llama3-8B the text quality is poor. Ideas welcome; I'm probably doing something stupid.
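One likely culprit, if the client is posting a plain prompt to the server's raw completion endpoint, is that no chat template gets applied at all. The llama.cpp server also exposes an OpenAI-compatible `/v1/chat/completions` endpoint that applies the model's own chat template server-side. A sketch of using it (the base URL is the llama.cpp server default and is an assumption here):

```python
import json
import urllib.request

def build_payload(messages, temperature=0.8):
    """Request body for the OpenAI-compatible chat endpoint."""
    return {"messages": messages, "temperature": temperature}

def chat_completion(messages, base_url="http://localhost:8080"):
    """POST to the llama.cpp server's /v1/chat/completions endpoint.

    The server formats the prompt with the model's chat template, so the
    client never has to assemble special tokens itself.
    """
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_payload(messages)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Letting the server own the template also means the same client code works unchanged across model families, which would be handy if worldsim ends up supporting more than one backend.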
I still haven't gotten around to trying it; it seems I have a bit more setting up to do than just spinning up a llama.cpp server.
Sorry. If/when I get this performing well enough, I'll spend more time packaging it for a simpler install.
Will you consider supporting the llama.cpp server API for inference?