ztxz16 / fastllm

A pure C++ LLM acceleration library for all platforms, callable from Python. ChatGLM-6B-class models can reach 10000+ tokens/s on a single GPU. Supports glm, llama, and moss base models, and runs smoothly on mobile devices.
Apache License 2.0

What does the fastllm_lib.make_history_llm_model() call in the get_prompt method do? #295

Open zhzspace opened 1 year ago

zhzspace commented 1 year ago

Does anyone know what the fastllm_lib.make_history_llm_model() call in the following code does?

def get_prompt(self,
               query: str,
               history: List[Tuple[str, str]] = None) -> str:
    if not history:
        history = []
    prompt = ""
    # Fold each past (query, response) round into the running prompt.
    for i, (old_query, response) in enumerate(history):
        prompt = fastllm_lib.make_history_llm_model(
            self.model, prompt.encode(), i,
            old_query.encode(), response.encode()).decode()
    # Append the current query in the model's expected input format.
    prompt = fastllm_lib.make_input_llm_model(
        self.model, prompt.encode(), len(history), query.encode()).decode()
    return prompt
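
For reference, fastllm_lib is the compiled C++ library loaded through ctypes, so strings cross the language boundary as bytes; that is why every argument is .encode()d and the returned C string is .decode()d. A minimal sketch of how such bindings could be declared (the library name, the integer model handle, and the exact argtypes are assumptions for illustration; fastllm's llm.py contains the real declarations):

import ctypes

# Illustrative library path; fastllm's Python wrapper locates its own
# built shared library.
fastllm_lib = ctypes.cdll.LoadLibrary("libfastllm_tools.so")

# Assumed signatures, inferred from how the snippet above uses the calls.
fastllm_lib.make_history_llm_model.argtypes = [
    ctypes.c_int,     # model handle (self.model)
    ctypes.c_char_p,  # prompt built so far
    ctypes.c_int,     # round index i
    ctypes.c_char_p,  # old_query of round i
    ctypes.c_char_p,  # response of round i
]
fastllm_lib.make_history_llm_model.restype = ctypes.c_char_p  # returns bytes

fastllm_lib.make_input_llm_model.argtypes = [
    ctypes.c_int, ctypes.c_char_p, ctypes.c_int, ctypes.c_char_p
]
fastllm_lib.make_input_llm_model.restype = ctypes.c_char_p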
TylunasLi commented 1 year ago

make_history_llm_model builds the historical dialogue (the conversation context so far), and make_input_llm_model builds the input for the current round. You can find all of this by reading pytools.cpp -> basellm.cpp.
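
To make that concrete: make_history_llm_model folds one past (query, response) pair into the running prompt, and make_input_llm_model appends the new query in the same per-model format. Here is a pure-Python sketch of the equivalent logic, assuming a ChatGLM-style round template (the actual template strings live in each model's C++ class under basellm.cpp, so the format below is illustrative, not authoritative):

from typing import List, Tuple

def get_prompt_equivalent(query: str,
                          history: List[Tuple[str, str]] = None) -> str:
    # Assumed ChatGLM-style template; other base models (llama, moss, ...)
    # build different strings in their C++ implementations.
    if history is None:
        history = []
    prompt = ""
    for i, (old_query, response) in enumerate(history):
        # What make_history_llm_model effectively does for one round:
        prompt += "[Round {}]\n问：{}\n答：{}\n".format(i, old_query, response)
    # What make_input_llm_model effectively does: append the current
    # query and leave the answer slot open for generation.
    prompt += "[Round {}]\n问：{}\n答：".format(len(history), query)
    return prompt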

zhzspace commented 1 year ago

Got it, thanks for the explanation. I'll go take a look at pytools.cpp -> basellm.cpp.