Closed · leavestylecode closed this 9 months ago
I don't have a good answer yet, but you can take a look at "Embedding texts that are longer than the model's maximum context length" to see if it helps.
In langchain-java, the corresponding code is OpenAIEmbeddings#getLenSafeEmbeddings.
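The core idea behind that method is to split text that would exceed the model's token limit into smaller chunks before embedding each one. Below is a minimal sketch of the chunking step only, not the actual langchain-java implementation: it approximates tokens with whitespace-separated words (a real implementation would count tokens with the model's tokenizer), and the class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

public class LenSafeChunker {

    /**
     * Split text into chunks of at most maxTokens "tokens".
     * Tokens are approximated here by whitespace-separated words;
     * swap in a real tokenizer for production use.
     */
    public static List<String> chunk(String text, int maxTokens) {
        String[] words = text.trim().split("\\s+");
        List<String> chunks = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        int count = 0;
        for (String word : words) {
            if (count == maxTokens) {
                // current chunk is full: flush it and start a new one
                chunks.add(current.toString());
                current.setLength(0);
                count = 0;
            }
            if (count > 0) current.append(' ');
            current.append(word);
            count++;
        }
        if (count > 0) chunks.add(current.toString());
        return chunks;
    }

    public static void main(String[] args) {
        // "one two three four five six seven" with a limit of 3 words
        // yields three chunks, each within the limit
        System.out.println(chunk("one two three four five six seven", 3));
    }
}
```

Each chunk is then embedded separately, and the per-chunk vectors are combined (typically a length-weighted average) into a single embedding for the original text, which is what the length-safe approach in the linked article does.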
I want to use langchain + GraphQL for question answering. When the graph has many data nodes, the prompt langchain sends to OpenAI becomes too long, and the API returns a token-limit-exceeded error. Is there a good way to solve this?