Closed by mikestaub 1 year ago
Fine-tuning a custom model on a user's knowledge base is an interesting direction, since the context one can squeeze into a single prompt is limited to roughly 4,000 tokens (and in practice less than that). It's not impossible in theory (although about 6x more expensive), but at the moment it's out of my scope. The best I can offer right now is the context contained in the note that is currently used for chatting with Jarvis.
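For concreteness, here is a minimal sketch (not part of the plugin) of the budgeting problem described above: checking whether a note's text fits within a ~4,000-token context window before sending it as chat context. It uses OpenAI's `tiktoken` tokenizer; the model name and budget values are illustrative assumptions, not the plugin's actual settings.

```python
import tiktoken

MODEL = "gpt-3.5-turbo"    # assumed model; adjust to whichever one is in use
CONTEXT_LIMIT = 4000       # approximate context window cited above
RESERVED_FOR_REPLY = 1000  # leave room for the question and the response

def fit_note_to_budget(note_text: str) -> str:
    """Truncate the note so it fits within the remaining token budget."""
    enc = tiktoken.encoding_for_model(MODEL)
    budget = CONTEXT_LIMIT - RESERVED_FOR_REPLY
    tokens = enc.encode(note_text)
    if len(tokens) <= budget:
        return note_text
    # Keep only the first `budget` tokens; in practice this is why only a
    # single note (or part of one) can be used as context per prompt.
    return enc.decode(tokens[:budget])
```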
I'm filing this as a feature request, in case anyone wants to work on it at some point.
Is it possible to train the plugin on all the notes in the notebook so it has context?
https://medium.com/@rdcolema7/deploying-gpt-2-models-in-custom-applications-f8117e482837