SeungyounShin / Llama2-Code-Interpreter

Make Llama2 use Code Execution, Debug, Save Code, Reuse it, Access to Internet

working principle #5

Closed · lucasjinreal closed this issue 10 months ago

lucasjinreal commented 11 months ago

When does it generate code? On my side the output doesn't have any code involved... (screenshot attached)

SeungyounShin commented 11 months ago

This is actually the expected behavior of the current version of my project. Code generation relies on in-context learning, and code execution is only triggered when the large language model (LLM) actually emits a code block.
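
To make the mechanism concrete, here is a minimal sketch of that trigger logic. The function name and regex are illustrative assumptions on my part, not the project's actual API:

```python
import re
import subprocess

def maybe_execute_generated_code(llm_output: str):
    """Run the first fenced python block found in the model output, if any.

    Illustrative sketch only; not the actual Llama2-Code-Interpreter API.
    """
    match = re.search(r"```python\n(.*?)```", llm_output, re.DOTALL)
    if match is None:
        # The model answered in plain text, so there is nothing to execute.
        return None
    code = match.group(1)
    result = subprocess.run(
        ["python", "-c", code],
        capture_output=True, text=True, timeout=30,
    )
    return result.stdout or result.stderr
```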

I'm working on enhancing this feature by training it with data collected from GPT-4. This is an ongoing effort and you can follow its progress in Issue #1.

For now, you may want to start with simple tasks like calculating "10!" and make sure to explicitly state "use code" or specify the library (for example, "use beautifulsoup") in your instructions.
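
As a rough illustration (the exact wording here is my own, not an official example set), prompts along these lines are the kind that reliably produce a code block:

```python
# Hypothetical prompts for illustration only.
prompts = [
    "Calculate 10!. Use code.",
    "Extract all links from this HTML snippet. Use beautifulsoup.",
]
```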

In short, include "use code" in your instructions and start with easier tasks. Please note that the current model is optimized for chat-based applications. I'm in the process of gathering data for supervised fine-tuning (SFT).

Also, I plan to add examples of best practices for prompting soon.

I hope this clears up your query. Stay tuned for updates on supervised fine-tuning (SFT) and data collection. Let me know if you have any more questions. 😀😀

lucasjinreal commented 11 months ago

@SeungyounShin So code generation is actually triggered only on certain questions. How? What kinds of questions will trigger code gen?

SeungyounShin commented 10 months ago

Closing: this is resolved by the released fine-tuned model.