Hello,
Thank you very much for your work.
It's not really an issue, but would it be possible to add an option to type text as user input with talk-llama?
Another thing: llama.cpp has an argument to control how much of the LLM is loaded on each GPU in a multi-GPU setup (--tensor-split, or -ts). Would it be possible to add this argument to talk-llama as well?
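For context, this is roughly how the flag is used with llama.cpp's own CLI; the talk-llama invocation below is hypothetical, assuming the flag were wired through with the same syntax:

```shell
# llama.cpp: split tensors across two GPUs in a 3:1 ratio
./main -m models/llama-7b.gguf -ts 3,1 -p "Hello"

# hypothetical talk-llama equivalent, if the flag were added
./talk-llama -mw models/ggml-base.en.bin -ml models/llama-7b.gguf -ts 3,1
```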
'Text as input' is on my todo list.
tensor-split - maybe sometime in the future. Or you can examine the code yourself and open a PR with your suggestions - that would help speed up the process.