MichelNivard / gptstudio

GPT RStudio addins that enable GPT assisted coding, writing & analysis
https://michelnivard.github.io/gptstudio/

[FR]: save chat history in a json for training #200

Closed: MichelNivard closed this issue 5 months ago

MichelNivard commented 6 months ago

What would you like to have?

Hi, good to be back :). In regular use I find that the (local) models I use sometimes make errors, but when I tell them an error was made and what it was, they self-correct. I have had at least 100-150 conversations like that. It occurs to me these would be excellent for fine-tuning the model not to make those mistakes again.

So I want to start saving those to train the local model "overnight". My goal is this loop: use a light local model, help it out when it errors or dreams up packages that do not exist, and if it then self-corrects based on my feedback, save the full conversation. Then, once I have ~20-40 conversations, fine-tune the model overnight: fine-tune on these conversations directly, but also use LLAMA3 to generate new training data about the functions and packages used in the chats ("generate 5 common user questions about 'geom_smooth()' and generate answers to each"). This would (hopefully) create an iterative loop where the small model gets trained in a user/task-specific manner. I'd use parameter-efficient training so it runs on my MacBook overnight.
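For illustration, a rough sketch of that prompt-generation step (none of this is gptstudio code; `ask_cloud_model()` is a hypothetical placeholder for whatever client sends the prompt to Llama 3 / Mixtral):

```r
# Rough sketch of the synthetic-data step described above (not gptstudio code).
# ask_cloud_model() is a hypothetical placeholder for a Llama 3 / Mixtral client.
make_training_prompts <- function(fn_names, n_questions = 5) {
  vapply(
    fn_names,
    function(fn) sprintf(
      "Generate %d common user questions about '%s' and write a correct answer to each.",
      n_questions, fn
    ),
    character(1)
  )
}

prompts <- make_training_prompts(c("geom_smooth()", "dplyr::across()"))
# synthetic <- lapply(prompts, ask_cloud_model)  # hypothetical cloud call
```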

So what I need from gptstudio:

  1. a way to export chats either automatically or with code

Then, and this is a bit of a stretch I know... it would really help this process if, when the local model fails, I could hit a "get help from the cloud" button, which would just feed the ongoing conversation to LLAMA 3, GPT-3 or Claude3 and let it suggest the correct answer.

So what I'd need from gptstudio:

  2. a "help me out here" button that falls back to a much stronger model mid-chat.

I am currently working on the backend that would extract all functions and packages mentioned in people's chats in an anonymity-preserving way, and based on those would generate synthetic training data from models in the cloud (Llama 3 and Mixtral), in order to support local user-specific training.
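Roughly, the extraction step could look like this (just a sketch; it assumes `chats` is a character vector of message texts, and only bare identifiers ever leave the machine):

```r
# Sketch of anonymity-preserving extraction: pull package and function names out
# of chat text with regexes, keeping only the identifiers, never the prose.
extract_identifiers <- function(chats) {
  text <- paste(chats, collapse = "\n")
  pkgs <- regmatches(text, gregexpr("\\b[a-zA-Z][a-zA-Z0-9.]*(?=::)", text, perl = TRUE))[[1]]
  fns  <- regmatches(text, gregexpr("\\b[a-zA-Z._][a-zA-Z0-9._]*(?=\\()", text, perl = TRUE))[[1]]
  list(packages = unique(pkgs), functions = unique(fns))
}

extract_identifiers(c("Use dplyr::mutate() then ggplot2::geom_smooth()"))
#> packages: "dplyr" "ggplot2"; functions: "mutate" "geom_smooth"
```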


MichelNivard commented 6 months ago

Thinking about this more, I might just need to write a separate package for this, based on the bones of gptstudio...

calderonsamuel commented 6 months ago

We have chat history functionality. We could add an export button for the whole chat collection or for individual chats. The second one might be out of scope 🤔 I imagine the end goal is to have a better model for working specifically with R?

calderonsamuel commented 5 months ago

Right now, you can use gptstudio:::read_conversation_history() to get the whole chat history as an R object, or gptstudio:::conversation_history_path() to get the file path. These are non-exported functions and might change in the future, though. Let me know if that closes this one.
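For example, something like this would dump it to JSON (again, these are internals, so no guarantees about the structure of the returned object):

```r
# Export the current conversation history to a JSON file.
# Both functions are gptstudio internals (:::), so this may break in future versions.
history <- gptstudio:::read_conversation_history()
jsonlite::write_json(history, "chat_history.json", auto_unbox = TRUE, pretty = TRUE)
```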