langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Summarization chain #737

Closed juzarantri closed 1 year ago

juzarantri commented 1 year ago

Using this chain takes a lot of time to generate the output, i.e. the summary.

Here is the reference I used: https://js.langchain.com/docs/modules/chains/other_chains/summarization

arronKler commented 1 year ago

If your content is not too long, try this:

const chain = loadSummarizationChain(model, { type: 'stuff' });
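
For reference, a full stuff run could look roughly like this (a minimal sketch; the OpenAI model, the sample document, and the import paths are assumptions and vary by langchainjs version):

import { OpenAI } from "langchain/llms/openai";
import { loadSummarizationChain } from "langchain/chains";
import { Document } from "langchain/document";

// Hypothetical setup: any LLM accepted by loadSummarizationChain works here.
const model = new OpenAI({ temperature: 0 });
const docs = [
  new Document({ pageContent: "Text that fits comfortably in the model's context window." }),
];

// "stuff" puts the whole text into a single prompt, so there is only one LLM
// call, which is why it is fast, but the content must fit in the context limit.
const chain = loadSummarizationChain(model, { type: "stuff" });
const res = await chain.call({ input_documents: docs });
console.log(res.text);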

juzarantri commented 1 year ago

Thanks @arronKler for this, but what should I do if the content is long, say 6,000 words?

arronKler commented 1 year ago

Due to the max token limit of the LLM, the summarization chain's default behavior is type: 'map_reduce', which means it summarizes each part of your document with a separate LLM call, until it reaches the maximum number of iterations (which is 10) or the token count is reduced enough for the final summarization call.
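
As an illustration, here is a minimal sketch of that flow, assuming a RecursiveCharacterTextSplitter and a ~6,000-word string named longText (both are placeholders; import paths vary by langchainjs version):

import { OpenAI } from "langchain/llms/openai";
import { loadSummarizationChain } from "langchain/chains";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";

const model = new OpenAI({ temperature: 0 });

// Split the long text into chunks small enough for the model's context window.
const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 2000, chunkOverlap: 200 });
const docs = await splitter.createDocuments([longText]); // longText is your ~6,000-word string

// map_reduce summarizes each chunk with its own LLM call, then combines the
// partial summaries into a final summary, hence the extra latency.
const chain = loadSummarizationChain(model, { type: "map_reduce" });
const res = await chain.call({ input_documents: docs });
console.log(res.text);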

So it does take time to run a summarization call on every part of your doc. You can try passing a prompt parameter to loadSummarizationChain with your own summarization prompt; your template should look like this:

import { PromptTemplate } from "langchain/prompts";
import { loadSummarizationChain } from "langchain/chains";

const template = `Write a concise summary of the following in 300 words:

"{text}"

CONCISE SUMMARY:`;

const myPrompt = new PromptTemplate({
  template,
  inputVariables: ["text"],
});

// `model` is whatever LLM instance you are already using.
const chain = loadSummarizationChain(model, { prompt: myPrompt });

This will limit the word count of each summarization output, but it may also reduce summarization precision.
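
One thing to double-check: in the langchainjs typings, the prompt option belongs to the stuff type, while the default map_reduce type takes its own prompt options (combineMapPrompt and combinePrompt, if I read SummarizationChainParams correctly). A sketch of passing the custom prompt in map_reduce mode:

// Assumed option names for the map_reduce variant; verify against your
// installed version's SummarizationChainParams before relying on them.
const chain = loadSummarizationChain(model, {
  type: "map_reduce",
  combineMapPrompt: myPrompt, // prompt applied to each chunk
  combinePrompt: myPrompt,    // prompt applied when merging the chunk summaries
});
const res = await chain.call({ input_documents: docs });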

juzarantri commented 1 year ago

Thanks for your contribution

matijagrcic commented 1 year ago

@arronKler Do you know how I can see the intermediate steps using LangChain JS? I'm not sure my prompt is being applied; I think the default prompt is always applied for some reason.

arronKler commented 1 year ago

You may need this:

https://js.langchain.com/docs/production/tracing
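
A rough sketch of how that can look in code, assuming the environment-variable switch from the linked tracing docs (the exact variable and setup differ between langchainjs versions and the newer LangSmith tracing):

process.env.LANGCHAIN_TRACING = "true"; // assumed variable name; check the tracing docs for your version

const chain = loadSummarizationChain(model, { type: "map_reduce" });
const res = await chain.call({ input_documents: docs });
// Each map and reduce step is then recorded as a separate run, so you can
// inspect exactly which prompt text was sent on every intermediate LLM call.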