airockchip / rknn-llm


RKNN-LLM Context length expansion #93

Closed vincenzodentamaro closed 2 weeks ago

vincenzodentamaro commented 2 months ago

Because embeddings extraction is not available for building a vector database, I need to increase the context length to 32k or more if possible. I see that some models, such as Qwen2, support up to 128k context. How can I increase it?

Do I need to update rknn_api.h:

    #define RKNN_MAX_DIMS                32     /* increased maximum dimension of tensor. */
    #define RKNN_MAX_NUM_CHANNEL         20     /* increased maximum channel number of input tensor. */
    #define RKNN_MAX_NAME_LEN            512    /* increased maximum name length of tensor. */
    #define RKNN_MAX_DYNAMIC_SHAPE_NUM   32768  /* increased maximum number of dynamic shape for each input. */

Or is it enough to set `param.max_context_len = 32768;`?
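
For reference, a minimal sketch of setting the context length through the runtime parameters rather than by editing rknn_api.h. It assumes the `RKLLMParam` struct, `rkllm_createDefaultParam()`, `rkllm_init()`, and `rkllm_destroy()` from the public rkllm.h header; exact signatures (for example, whether `rkllm_init` takes the param struct by value or by pointer) vary between SDK releases, and the model path below is hypothetical:

```c
/* Minimal sketch: configuring the context length through RKLLM runtime
 * parameters instead of editing rknn_api.h. Names follow the public
 * rkllm.h header; exact signatures may differ between releases. */
#include <stdio.h>
#include "rkllm.h"

/* Result callback stub; the exact signature is defined in rkllm.h. */
void on_result(RKLLMResult *result, void *userdata, LLMCallState state) {
    (void)userdata;
    (void)state;
    if (result != NULL && result->text != NULL) {
        printf("%s", result->text);
    }
}

int main(void) {
    RKLLMParam param = rkllm_createDefaultParam();
    param.model_path      = "./qwen2.rkllm";  /* hypothetical model path */
    param.max_context_len = 4096;             /* runtime currently caps this at 4096 */
    param.max_new_tokens  = 256;

    LLMHandle handle = NULL;
    /* Note: some releases take the param struct by value, others by pointer. */
    if (rkllm_init(&handle, param, on_result) != 0) {
        fprintf(stderr, "rkllm_init failed\n");
        return 1;
    }

    /* ... run inference with a prompt here ... */

    rkllm_destroy(handle);
    return 0;
}
```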

@airockchip @waydong

waydong commented 2 weeks ago

Hi, currently the maximum supported context length is 4096.