Closed — RedemptionC closed this issue 5 months ago
Did you manually install it? It seems you are missing the deployed Modelfiles. For that, please run:

```shell
tlm deploy
```

This will deploy the context (system prompt) and parameter tunings on codellama:7b.
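For reference, a deployment like this boils down to registering Modelfiles with Ollama. A minimal sketch of such a Modelfile (the system prompt wording and parameter values here are illustrative assumptions, not tlm's actual tunings):

```
# Illustrative Modelfile — not tlm's real configuration
FROM codellama:7b
SYSTEM """You are a shell command assistant. Reply with a single command only."""
PARAMETER temperature 0.2
PARAMETER top_p 0.9
```

`FROM`, `SYSTEM`, and `PARAMETER` are standard Ollama Modelfile directives; `ollama create <name> -f <Modelfile>` would register such a model.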
Here is a demo of the command. Thanks for using tlm! 😊
@yusufcanb Yes, I installed it with `go install`, and thanks a lot, it works!
By the way, while using it I was thinking: in suggest mode, after you choose "explain", would it be better to ask the user whether or not to execute the command? In the current setting, you have to type the command manually if, after reading the explanation, you decide it's what you want.
That's actually a very nice idea. I'll try to include it in upcoming releases. Thanks again!
It's sad that there is no way to use custom models. The weights are heavy, and if a 3B model can do the job, why would you need to install another one?
I also noticed that there is a "UNIX or Windows command" placeholder in the prompt. It's not difficult to detect the current OS, but using a Modelfile as an initializer instead of regular API + prompt calls makes everything less flexible.
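Detecting the OS at runtime and filling the prompt accordingly could be as simple as checking `runtime.GOOS` (a hypothetical sketch, not tlm's implementation; the label strings are assumptions):

```go
package main

import (
	"fmt"
	"runtime"
)

// shellHint maps a GOOS value to the wording used in the prompt.
// The labels are illustrative assumptions.
func shellHint(goos string) string {
	switch goos {
	case "windows":
		return "Windows (PowerShell) command"
	default:
		return "UNIX (bash/zsh) command"
	}
}

func main() {
	// runtime.GOOS is fixed at build time for the target platform.
	prompt := fmt.Sprintf("Suggest a %s for the following task:", shellHint(runtime.GOOS))
	fmt.Println(prompt)
}
```

Because the hint is interpolated per request, this would work with plain API + prompt calls and avoid baking the platform into a Modelfile.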