yusufcanb / tlm

Local CLI Copilot, powered by CodeLLaMa. 💻🦙

how to set up the models? I have installed CodeLlama, but it's not working #8

Closed · RedemptionC closed 5 months ago

RedemptionC commented 5 months ago
$ ollama list
NAME                            ID              SIZE    MODIFIED
codellama:7b                    8fdf8f752f6e    3.8 GB  About a minute ago
wizard-vicuna-uncensored:13b    6887722b6618    7.4 GB  2 weeks ago
$ tlm suggest 'echo hello'
[err] error getting suggestion: model 'suggest:7b' not found, try pulling it first
┃ > Thinking... (1.000666ms)
panic: runtime error: index out of range [0] with length 0

goroutine 1 [running]:
github.com/yusufcanb/tlm/suggest.(*Suggest).action(0x140001a85a0, 0x140001e1640)
        /Users/chenggeng/go/pkg/mod/github.com/yusufcanb/tlm@v0.0.0-20240228141854-71b92e9e89fe/suggest/cli.go:51 +0xa78
github.com/urfave/cli/v2.(*Command).Run(0x140001f0580, 0x140001e1640, {0x140001d7a00, 0x2, 0x2})
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/v2@v2.27.1/command.go:279 +0x7b0
github.com/urfave/cli/v2.(*Command).Run(0x140001f0f20, 0x140001e1500, {0x1400019e180, 0x3, 0x3})
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/v2@v2.27.1/command.go:272 +0x9d0
github.com/urfave/cli/v2.(*App).RunContext(0x1400029a000, {0x100d72550?, 0x140001ae008}, {0x1400019e180, 0x3, 0x3})
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/v2@v2.27.1/app.go:337 +0x61c
github.com/urfave/cli/v2.(*App).Run(...)
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/v2@v2.27.1/app.go:311
main.main()
        /Users/chenggeng/go/pkg/mod/github.com/yusufcanb/tlm@v0.0.0-20240228141854-71b92e9e89fe/main.go:16 +0x5c
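My guess from the trace: the lookup for suggest:7b fails, the suggestion list comes back empty, and cli.go:51 indexes its first element anyway. A hypothetical Go sketch of that pattern (made-up helper names, not the actual source):

// Hypothetical sketch of the failing pattern, not the real suggest/cli.go.
package main

import (
	"errors"
	"fmt"
)

// generateSuggestions stands in for the Ollama call; when the suggest:7b
// model is missing it returns an error and an empty slice.
func generateSuggestions(prompt string) ([]string, error) {
	return nil, errors.New("model 'suggest:7b' not found, try pulling it first")
}

// suggest shows the guarded version: check the error and the slice length
// before indexing element 0, which is what panics in the trace above.
func suggest(prompt string) (string, error) {
	suggestions, err := generateSuggestions(prompt)
	if err != nil {
		return "", fmt.Errorf("error getting suggestion: %w", err)
	}
	if len(suggestions) == 0 {
		return "", errors.New("empty response; was `tlm deploy` run?")
	}
	return suggestions[0], nil // safe only after the length check
}

func main() {
	if _, err := suggest("echo hello"); err != nil {
		fmt.Println("[err]", err)
	}
}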
yusufcanb commented 5 months ago

Did you install it manually? It looks like the Modelfiles haven't been deployed yet.

To fix that, please run:

tlm deploy

This deploys the system context and parameter tunings on top of codellama:7b. Here is a demo of the command:

[demo GIF: tlm deploy]
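Under the hood it builds small purpose-specific Ollama models (like the suggest:7b one from your error) on top of codellama:7b. Roughly along these lines — illustrative only, the real Modelfiles, system prompt, and parameters ship with tlm:

# Illustrative sketch; tlm's actual Modelfiles and parameters differ.
cat > Modelfile.suggest <<'EOF'
FROM codellama:7b
PARAMETER temperature 0.2
SYSTEM Reply with a single shell command that fulfils the user's request.
EOF

ollama create suggest:7b -f Modelfile.suggest
ollama list    # suggest:7b should now show up next to codellama:7b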

Thanks for using tlm! 😊

RedemptionC commented 5 months ago

@yusufcanb yes, I installed it with go install. Thanks a lot, it works now!

BTW, while using it I was wondering whether it would be better, in suggest mode, to ask the user whether to execute the command after they choose "explain". In the current setup, if you read the explanation and decide the command is what you want, you have to type it in manually.
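Something along these lines (just a rough Go sketch with made-up names, not tlm's actual code):

// Rough sketch of the idea: after "explain", offer to run the suggested
// command instead of making the user retype it. Hypothetical helper names.
package main

import (
	"bufio"
	"fmt"
	"os"
	"os/exec"
	"strings"
)

func maybeExecute(command string) error {
	fmt.Printf("Execute `%s`? [y/N]: ", command)
	answer, _ := bufio.NewReader(os.Stdin).ReadString('\n')
	if strings.TrimSpace(strings.ToLower(answer)) != "y" {
		return nil // user declined; nothing to run
	}
	cmd := exec.Command("sh", "-c", command)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	return cmd.Run()
}

func main() {
	if err := maybeExecute("echo hello"); err != nil {
		fmt.Fprintln(os.Stderr, "command failed:", err)
	}
}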

yusufcanb commented 5 months ago

That's actually a very nice idea. I'll try to include it in upcoming releases. Thanks again!

remixer-dec commented 5 months ago

Sad that there is no way to use custom models. The weights are heavy, and if a 3B model can do the job, why install another one? I also noticed that the prompt asks for either a UNIX or a Windows command; it's not difficult to detect the current OS. And using a Modelfile as an initializer instead of regular API + prompt calls makes everything less flexible.
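For example, detecting the host OS in Go is basically a one-liner (sketch, not tlm's code):

// Sketch: pick the prompt wording per platform instead of hard-coding
// "UNIX or Windows command" into the Modelfile.
package main

import (
	"fmt"
	"runtime"
)

func main() {
	shellHint := "a UNIX shell command"
	if runtime.GOOS == "windows" {
		shellHint = "a Windows (PowerShell) command"
	}
	fmt.Println("ask the model for", shellHint, "- detected OS:", runtime.GOOS)
}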