bakks / butterfish

A shell with AI superpowers
https://butterfi.sh
MIT License

[Bug] panic on interface conversion when using the verbose log option #33

Closed: jtslear closed this issue 4 months ago

jtslear commented 4 months ago

Problem Description

When spinning up butterfish with the verbose option, such as:

butterfish shell -u 'http://localhost:8080/v1' -m llama3-8b-instruct -v -A

We land with a stack trace that appears to come from the logging mechanism. Output reformatted for readability:

 projects % butterfish shell -u 'http://localhost:8080/v1' -m llama3-8b-instruct -v -A
Logging to /var/tmp/butterfish.log
 projects % PS1=$'%{\033Q%}'$PS1$'šŸ %{ %?\033R%} '
 projects % šŸ  cd scratchpad
 scratchpad % šŸ  Hi.
panic: interface conversion: interface {} is []string, not string
goroutine 11 [running]:
github.com/bakks/butterfish/butterfish.LogCompletionRequest({{0x14000176d80, 0x12}, {0x104f54d60, 0x1400000f710}, {0x0, 0x0}, 0x800, 0x3f333333, 0x0, 0x0, ...}) /Users/skarbek/projects/pkg/mod/github.com/bakks/butterfish@v0.2.13/butterfish/gpt.go:120 +0x188
github.com/bakks/butterfish/butterfish.(*GPT).InstructCompletionStream(0x1400006a120, 0x140001b6fd0, {0x1050687a8, 0x140000afb00}) /Users/skarbek/projects/pkg/mod/github.com/bakks/butterfish@v0.2.13/butterfish/gpt.go:402 +0x120
github.com/bakks/butterfish/butterfish.(*GPT).CompletionStream(0x1400006a120, 0x140001b6fd0, {0x1050687a8, 0x140000afb00}) /Users/skarbek/projects/pkg/mod/github.com/bakks/butterfish@v0.2.13/butterfish/gpt.go:366 +0x134
github.com/bakks/butterfish/butterfish.CompletionRoutine(0x140001b6fd0, {0x10506ccb8, 0x1400006a120}, {0x1050687a8, 0x140000afb00}, 0x1400002a360, {0x104d0bdcb?, 0x1045220b8?}, {0x104d0bde1, 0xb}, ...) /Users/skarbek/projects/pkg/mod/github.com/bakks/butterfish@v0.2.13/butterfish/shell.go:1732 +0xa4
  created by github.com/bakks/butterfish/butterfish.(*ShellState).SendPrompt in goroutine 1 /Users/skarbek/projects/pkg/mod/github.com/bakks/butterfish@v0.2.13/butterfish/shell.go:1715 +0x43c
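
(For context on the panic itself: this is Go's standard failure mode when a value stored in an interface{} is type-asserted to the wrong concrete type. Below is a minimal sketch of the pattern and of a type switch that handles both shapes; the variable name and setup are illustrative only, not butterfish's actual LogCompletionRequest code.)

```go
package main

import "fmt"

func main() {
	// In an OpenAI-style completion request the prompt may be either a
	// single string or a list of strings, so it is often carried around
	// as an interface{} value.
	var prompt interface{} = []string{"Hi."}

	// An unchecked type assertion panics when the dynamic type is
	// []string rather than string, the same failure mode as the stack
	// trace above:
	//   panic: interface conversion: interface {} is []string, not string
	// _ = prompt.(string)

	// A type switch (or the two-value assertion form) handles both shapes:
	switch p := prompt.(type) {
	case string:
		fmt.Println("prompt:", p)
	case []string:
		fmt.Println("prompts:", p)
	default:
		fmt.Printf("unexpected prompt type %T\n", p)
	}
}
```
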
bakks commented 4 months ago

Thanks for the bug report, will take a look. Anything I need to know about how you're running llama? (Curious in general about your setup.)

jtslear commented 4 months ago

Hey @bakks! I feel like I'm late to the AI party. So there could be something incorrect with my configuration or setup, but I'm learning and exploring.

Currently I'm running https://github.com/mudler/LocalAI ...locally. I am using their local build option documented here in order to take full advantage of the M2 chip in my workstation. Other than that, it's pretty basic. I forget how I landed on the model I chose. There's still a bunch of stuff related to models and configuration that I've yet to learn.

Butterfish appears to play quite nicely with it overall.

bakks commented 4 months ago

Ok this should be fixed by 129ced22398f2e1aa22109884a1f14b7c520f064, please try it out. I will cut a new release soon.

Some FYI details -- I was a little confused here because I didn't expect you to be on the code path where you saw that error. Because the model you're using has an -instruct suffix, that means butterfish is sending requests to /completions rather than /chat/completions on your LocalAI server. This is hardcoded based on the OpenAI interface. I was surprised this is working at all šŸ˜ƒ. But looking closer, it is working, it's just not actually sending the system prompt when you do a shell prompt, so the completions may be lower quality than you would otherwise get. Maybe try it with a model not suffixed with -instruct and see if it's different.
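
(A rough sketch of the model-name check described above; the function name and exact suffix test are illustrative, not butterfish's actual code, which hardcodes this mapping around OpenAI's model naming.)

```go
package main

import (
	"fmt"
	"strings"
)

// endpointForModel sketches the routing described above: model names with an
// "-instruct" style suffix are treated as plain completion models and sent to
// /completions, while everything else goes to /chat/completions.
func endpointForModel(model string) string {
	if strings.HasSuffix(model, "-instruct") {
		return "/completions"
	}
	return "/chat/completions"
}

func main() {
	fmt.Println(endpointForModel("llama3-8b-instruct")) // /completions
	fmt.Println(endpointForModel("gpt-4"))              // /chat/completions
}
```
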

Also, btw, if you're so inclined to write some instructions on making butterfish work with LocalAI, I would love to merge them and include that in the readme; I think it would help out other people. I haven't had time to look at this closely.

jtslear commented 4 months ago

@bakks

> Ok this should be fixed by https://github.com/bakks/butterfish/commit/129ced22398f2e1aa22109884a1f14b7c520f064, please try it out. I will cut a new release soon.

This indeed works now!

> Maybe try it with a model not suffixed with -instruct and see if it's different.

These are some of the many things I'm still learning at the moment.

> ...if you want to write some instructions on making butterfish work with LocalAI...

Certainly will!


You appear to have fixed the heart of this issue, so we can close this out.