bojone / NBCE

Naive Bayes-based Context Extension

Real-life use #4

Open StrangeTcy opened 1 year ago

StrangeTcy commented 1 year ago

I've tested this approach on a single-language (English) LLaMA, and it worked great, except:

  1. it didn't get the LinkedIn layoff answer right
  2. it didn't output any spaces between words

But the thing I wonder about is real-life use: when you address a question to an LLM, you don't normally provide the context as well. Is there a way to provide it anyway? Also, is there any specific finetuning procedure that would make the model better at using this approach?

bojone commented 1 year ago

You can use the nearby text as the query and divide the distant text into multiple shorter contexts with a sliding window.
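A minimal sketch of what that suggestion might look like in practice. The window and stride sizes, the character-based splitting, and the choice to treat the document tail as the "nearby" query are all illustrative assumptions, not part of NBCE itself:

```python
def split_into_windows(text: str, window: int = 2000, stride: int = 1000) -> list[str]:
    """Divide a long string into overlapping sliding windows.

    Window/stride are in characters here for simplicity; in practice
    you would likely split on tokens instead.
    """
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + window])
        if start + window >= len(text):
            break  # this window already reaches the end of the text
        start += stride
    return chunks or [text]


def build_contexts(document: str, query_chars: int = 500) -> tuple[str, list[str]]:
    """Treat the last `query_chars` characters as the nearby text (query)
    and chunk the remaining distant text into shorter contexts,
    which NBCE can then combine."""
    query = document[-query_chars:]
    distant = document[:-query_chars]
    contexts = split_into_windows(distant)
    return query, contexts
```

Each context (plus the query) would then be fed to the model separately, with the per-context predictions merged by the naive Bayes combination rule.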