audreyeternal opened this issue 1 month ago
@audreyeternal The DeepSeek-V2 API currently supports only a 32K context length (input + output), with a maximum of 4K tokens for the output.

PS: Although the DeepSeek-V2 model itself supports a 128K context length, we have restricted the API to 32K to keep our service efficient.
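To make the stated limits concrete, here is a small sketch of the token budget they imply. The exact constants (whether "32K" means 32,768 and "4K" means 4,096) are assumptions on my part; the relationship — output capped at 4K and input + output together capped at 32K — is what the reply above describes.

```python
# Assumed numeric values for the limits quoted above (32K / 4K).
API_CONTEXT_LIMIT = 32_768   # total token budget: input + output
API_MAX_OUTPUT = 4_096       # hard cap on max_tokens for the completion

def max_output_budget(input_tokens: int) -> int:
    """Largest max_tokens the API should accept for a prompt of this length."""
    if input_tokens >= API_CONTEXT_LIMIT:
        raise ValueError("prompt alone exceeds the 32K context window")
    # Output is limited both by the 4K cap and by whatever is left of the window.
    return min(API_MAX_OUTPUT, API_CONTEXT_LIMIT - input_tokens)
```

So a 1K-token prompt still only gets 4K of output, while a 30K-token prompt leaves less than 4K of room in the window.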
I followed the instructions in the README on how to use DeepSeek in LangChain. However, it seems that `max_tokens` is still restricted to 4K, and an error is raised when I try to integrate the model into a chain and invoke it:
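For reference, a minimal sketch of the kind of configuration involved, with a client-side guard for the 4K output cap. This assumes DeepSeek's OpenAI-compatible endpoint (`https://api.deepseek.com`, model `deepseek-chat`) and that the resulting kwargs are passed to LangChain's `ChatOpenAI`; treat the parameter names as assumptions if your LangChain version differs.

```python
# Assumed cap from the reply above: the API rejects max_tokens above 4K.
DEEPSEEK_MAX_OUTPUT = 4_096

def deepseek_chat_kwargs(max_tokens: int) -> dict:
    """Build kwargs for an OpenAI-compatible client pointed at DeepSeek.

    Raises early instead of letting the API reject an oversized max_tokens.
    """
    if max_tokens > DEEPSEEK_MAX_OUTPUT:
        raise ValueError(
            f"max_tokens={max_tokens} exceeds the API's 4K output limit"
        )
    return {
        "model": "deepseek-chat",              # DeepSeek-V2 chat model name
        "base_url": "https://api.deepseek.com",  # OpenAI-compatible endpoint
        "max_tokens": max_tokens,
    }
```

Asking for more than 4K output tokens (e.g. `max_tokens=8192`) is what triggers the error described in the issue, since the API enforces the cap server-side.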