i-am-bee / bee-api

API backend for Bee
Apache License 2.0

fix(code-interpreter): use llama for preprocessing #22

Closed JanPokorny closed 4 weeks ago

JanPokorny commented 1 month ago

(screenshot) It does this in certain cases; I have to improve the prompt.

jezekra1 commented 4 weeks ago

Can we make this configurable through env variable so that you can switch between llama and granite?

Also, does it make sense for llama to fix code written by llama?
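A minimal sketch of the env-variable switch jezekra1 is asking about, assuming a single setting that selects the preprocessing model. The variable name `CODE_INTERPRETER_PREPROCESS_MODEL` and the model labels are hypothetical, not taken from the repo:

```typescript
// Hypothetical: choose the code-interpreter preprocessing model from an env
// variable so llama/granite can be swapped without a code change.
type PreprocessingModel = "llama" | "granite";

export function pickPreprocessingModel(
  env: Record<string, string | undefined> = process.env
): PreprocessingModel {
  // Default to llama, the model this PR switches to.
  const value = (env.CODE_INTERPRETER_PREPROCESS_MODEL ?? "llama").toLowerCase();
  if (value === "llama" || value === "granite") return value;
  throw new Error(`Unsupported preprocessing model: ${value}`);
}
```

Defaulting to the new model while keeping the old one selectable would also give the "easier rollback" path mentioned below.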

JanPokorny commented 4 weeks ago

@jezekra1
1) Potentially, but why? Easier rollback if this one shows issues? There are known issues with Granite (like mangling certain filenames), so I don't think we want to keep it as an option.
2) Yes, LLM output can often be improved simply by feeding it back in and saying "fix your mistakes", so I don't see an issue with the fixer being the same LLM.
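The "feed it back in and say fix your mistakes" loop described above can be sketched generically. The `generate` callback stands in for whatever chat-completion call the API uses; the prompt wording and the `rounds` parameter are illustrative assumptions, not the repo's actual implementation:

```typescript
// Self-refinement sketch: re-prompt the same model with its own output and an
// instruction to fix mistakes, optionally for several rounds.
export async function refine(
  generate: (prompt: string) => Promise<string>,
  code: string,
  rounds = 1
): Promise<string> {
  let current = code;
  for (let i = 0; i < rounds; i++) {
    // The model sees its previous answer and is asked to correct it.
    current = await generate(
      `Fix any mistakes in the following code and return only the corrected code:\n\n${current}`
    );
  }
  return current;
}
```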