wangzhaode / mnn-llm

LLM deployment project based on MNN.
Apache License 2.0

./cli_demo output is unk #114

Closed: BigFaceBoy closed this issue 8 months ago

BigFaceBoy commented 10 months ago

Environment: Mac M2 + MNN 2.7.0
Model: the ONNX files downloaded from https://github.com/wangzhaode/llm-export/releases/tag/chatglm3-6b-onnx, then converted to MNN
Tokenizer: https://github.com/wangzhaode/mnn-llm/releases/download/chatglm3-6b-mnn/tokenizer.txt

./cli_demo 

The printed output is:

...
[ 67% ] load ../resource/chatglm3-6b/block_17.mnn model ... Done!
[ 70% ] load ../resource/chatglm3-6b/block_18.mnn model ... Done!
[ 73% ] load ../resource/chatglm3-6b/block_19.mnn model ... Done!
[ 77% ] load ../resource/chatglm3-6b/block_20.mnn model ... Done!
[ 80% ] load ../resource/chatglm3-6b/block_21.mnn model ... Done!
[ 83% ] load ../resource/chatglm3-6b/block_22.mnn model ... Done!
[ 87% ] load ../resource/chatglm3-6b/block_23.mnn model ... Done!
[ 90% ] load ../resource/chatglm3-6b/block_24.mnn model ... Done!
[ 93% ] load ../resource/chatglm3-6b/block_25.mnn model ... Done!
[ 97% ] load ../resource/chatglm3-6b/block_26.mnn model ... Done!
[100% ] load ../resource/chatglm3-6b/block_27.mnn model ... Done!

...
Don't support type [While], onnx::Mul_256
Don't support type [While], onnx::Mul_257
Don't support type [While], onnx::Mul_259
Don't support type [While], onnx::Mul_260
Don't support type [While], /block/self_attention/core_attention/MatMul_output_0
2, 32 - 1
The Creator Don't support type [Cast], /block/self_attention/core_attention/Cast_output_0
Don't support type [Select], /block/self_attention/core_attention/Where_output_0
Don't support type [While], /block/self_attention/core_attention/MatMul_1_output_0
Don't support type [While], /block/Add_output_0
2, 32 - 1
The Creator Don't support type [Cast], /block/post_attention_layernorm/Cast_output_0
2, 32 - 1
The Creator Don't support type [Cast], /block/post_attention_layernorm/Cast_1_output_0
2, 32 - 1
The Creator Don't support type [Cast], /final_layernorm/Cast_output_0
2, 32 - 1
The Creator Don't support type [Cast], /final_layernorm/Cast_1_output_0
Don't support type [While], /Gather_output_0
Don't support type [ArgMax], token_id

[speed: 0.132807 tok/s] [cost time: 15059.499000 ms]
Input: 你好,你是谁?
Output:
hebangwen commented 10 months ago

Hi, I ran into the same problem: cli_demo does not answer my question, while web_demo does.

I found the cause: when Llm::chat() reads the user input, it only reads up to the first space, so if you type "how are you", the model only receives the first word, "how". Changing the code to the following fixes it:


void Llm::chat() {
    while (true) {
        std::cout << "\nQ: ";
        std::string input_str;
        char c;
        while ((c = getchar()) != '\n') {
            input_str += c;
        }

        std::cout << "\nA: " << std::flush;
        response(input_str);
        reset();
        std::cout << std::endl;
    }
}
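
For reference, here is an equivalent variant of the same fix as a minimal sketch. It assumes the same Llm interface as the snippet above (response() and reset(), with <iostream> and <string> already included in the source file) and uses std::getline, which reads the whole line in one call and ends the loop cleanly on EOF (e.g. Ctrl-D):

void Llm::chat() {
    std::string input_str;
    while (true) {
        std::cout << "\nQ: ";
        // std::getline keeps everything up to the newline, spaces included,
        // and returns false on EOF so the loop exits instead of spinning.
        if (!std::getline(std::cin, input_str)) {
            break;
        }
        std::cout << "\nA: " << std::flush;
        response(input_str);
        reset();
        std::cout << std::endl;
    }
}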
BigFaceBoy commented 9 months ago

@hebangwen Hi, I was busy with other projects in the meantime. After modifying the code the way you suggested, the problem is still there.

1. Using the chatglm-6b model provided by this repo, the answer is <eop> (see my follow-up comment below); the run output is:

./cli_demo ../../resource/chatglm-6b
model path is ../../resource/chatglm-6b

model name : Chatglm_6b

Init CPU
hw.cpufamily: 3660830781 , size = 4
The device support i8sdot:1, support fp16:1, support i8mm: 0
MNNInsertExtraRuntimeCreator type=0
Init Metal
MNNInsertExtraRuntimeCreator type=1
load tokenizer
load tokenizer Done
[ 10% ] load ../../resource/chatglm-6b/lm.mnn model ... Done!
[ 13% ] load ../../resource/chatglm-6b/embedding.mnn model ... Done!
[ 19% ] load ../../resource/chatglm-6b/block_0.mnn model ... Done!
[ 22% ] load ../../resource/chatglm-6b/block_1.mnn model ... Done!
[ 25% ] load ../../resource/chatglm-6b/block_2.mnn model ... Done!
[ 28% ] load ../../resource/chatglm-6b/block_3.mnn model ... Done!
[ 31% ] load ../../resource/chatglm-6b/block_4.mnn model ... Done!
[ 34% ] load ../../resource/chatglm-6b/block_5.mnn model ... Done!
[ 37% ] load ../../resource/chatglm-6b/block_6.mnn model ... Done!
[ 40% ] load ../../resource/chatglm-6b/block_7.mnn model ... Done!
[ 43% ] load ../../resource/chatglm-6b/block_8.mnn model ... Done!
[ 46% ] load ../../resource/chatglm-6b/block_9.mnn model ... Done!
[ 49% ] load ../../resource/chatglm-6b/block_10.mnn model ... Done!
[ 52% ] load ../../resource/chatglm-6b/block_11.mnn model ... Done!
[ 55% ] load ../../resource/chatglm-6b/block_12.mnn model ... Done!
[ 58% ] load ../../resource/chatglm-6b/block_13.mnn model ... Done!
[ 61% ] load ../../resource/chatglm-6b/block_14.mnn model ... Done!
[ 64% ] load ../../resource/chatglm-6b/block_15.mnn model ... Done!
[ 67% ] load ../../resource/chatglm-6b/block_16.mnn model ... Done!
[ 70% ] load ../../resource/chatglm-6b/block_17.mnn model ... Done!
[ 73% ] load ../../resource/chatglm-6b/block_18.mnn model ... Done!
[ 76% ] load ../../resource/chatglm-6b/block_19.mnn model ... Done!
[ 79% ] load ../../resource/chatglm-6b/block_20.mnn model ... Done!
[ 82% ] load ../../resource/chatglm-6b/block_21.mnn model ... Done!
[ 85% ] load ../../resource/chatglm-6b/block_22.mnn model ... Done!
[ 88% ] load ../../resource/chatglm-6b/block_23.mnn model ... Done!
[ 91% ] load ../../resource/chatglm-6b/block_24.mnn model ... Done!
[ 94% ] load ../../resource/chatglm-6b/block_25.mnn model ... Done!
[ 97% ] load ../../resource/chatglm-6b/block_26.mnn model ... Done!
[100% ] load ../../resource/chatglm-6b/block_27.mnn model ... Done!

Q: 你好 你是谁

A:

Q:

2. Using chatglm2-6b, the answer is an endless stream of blank lines; the run output is:

./cli_demo ../../resource/chatglm2-6b
model path is ../../resource/chatglm2-6b

model name : Chatglm2_6b

Init CPU
hw.cpufamily: 3660830781 , size = 4
The device support i8sdot:1, support fp16:1, support i8mm: 0
MNNInsertExtraRuntimeCreator type=0
Init Metal
MNNInsertExtraRuntimeCreator type=1
load tokenizer
load tokenizer Done
[ 10% ] load ../../resource/chatglm2-6b/lm.mnn model ... Done!
[ 13% ] load ../../resource/chatglm2-6b/embedding.mnn model ... Done!
[ 19% ] load ../../resource/chatglm2-6b/block_0.mnn model ... Done!
[ 22% ] load ../../resource/chatglm2-6b/block_1.mnn model ... Done!
[ 25% ] load ../../resource/chatglm2-6b/block_2.mnn model ... Done!
[ 28% ] load ../../resource/chatglm2-6b/block_3.mnn model ... Done!
[ 31% ] load ../../resource/chatglm2-6b/block_4.mnn model ... Done!
[ 34% ] load ../../resource/chatglm2-6b/block_5.mnn model ... Done!
[ 37% ] load ../../resource/chatglm2-6b/block_6.mnn model ... Done!
[ 40% ] load ../../resource/chatglm2-6b/block_7.mnn model ... Done!
[ 43% ] load ../../resource/chatglm2-6b/block_8.mnn model ... Done!
[ 46% ] load ../../resource/chatglm2-6b/block_9.mnn model ... Done!
[ 49% ] load ../../resource/chatglm2-6b/block_10.mnn model ... Done!
[ 52% ] load ../../resource/chatglm2-6b/block_11.mnn model ... Done!
[ 55% ] load ../../resource/chatglm2-6b/block_12.mnn model ... Done!
[ 58% ] load ../../resource/chatglm2-6b/block_13.mnn model ... Done!
[ 61% ] load ../../resource/chatglm2-6b/block_14.mnn model ... Done!
[ 64% ] load ../../resource/chatglm2-6b/block_15.mnn model ... Done!
[ 67% ] load ../../resource/chatglm2-6b/block_16.mnn model ... Done!
[ 70% ] load ../../resource/chatglm2-6b/block_17.mnn model ... Done!
[ 73% ] load ../../resource/chatglm2-6b/block_18.mnn model ... Done!
[ 76% ] load ../../resource/chatglm2-6b/block_19.mnn model ... Done!
[ 79% ] load ../../resource/chatglm2-6b/block_20.mnn model ... Done!
[ 82% ] load ../../resource/chatglm2-6b/block_21.mnn model ... Done!
[ 85% ] load ../../resource/chatglm2-6b/block_22.mnn model ... Done!
[ 88% ] load ../../resource/chatglm2-6b/block_23.mnn model ... Done!
[ 91% ] load ../../resource/chatglm2-6b/block_24.mnn model ... Done!
[ 94% ] load ../../resource/chatglm2-6b/block_25.mnn model ... Done!
[ 97% ] load ../../resource/chatglm2-6b/block_26.mnn model ... Done!
[100% ] load ../../resource/chatglm2-6b/block_27.mnn model ... Done!

Q: 你好 你是谁

A:

Everything after "A:" is blank lines.

3. Using chatglm3-6b, there is no answer; the run output is:

./cli_demo ../../resource/chatglm3-6b
model path is ../../resource/chatglm3-6b

model name : Chatglm3_6b

Init CPU
hw.cpufamily: 3660830781 , size = 4
The device support i8sdot:1, support fp16:1, support i8mm: 0
MNNInsertExtraRuntimeCreator type=0
Init Metal
MNNInsertExtraRuntimeCreator type=1
load tokenizer
load tokenizer Done
[ 10% ] load ../../resource/chatglm3-6b/lm.mnn model ... Done!
[ 13% ] load ../../resource/chatglm3-6b/embedding.mnn model ... Done!
[ 19% ] load ../../resource/chatglm3-6b/block_0.mnn model ... Done!
[ 22% ] load ../../resource/chatglm3-6b/block_1.mnn model ... Done!
[ 25% ] load ../../resource/chatglm3-6b/block_2.mnn model ... Done!
[ 28% ] load ../../resource/chatglm3-6b/block_3.mnn model ... Done!
[ 31% ] load ../../resource/chatglm3-6b/block_4.mnn model ... Done!
[ 34% ] load ../../resource/chatglm3-6b/block_5.mnn model ... Done!
[ 37% ] load ../../resource/chatglm3-6b/block_6.mnn model ... Done!
[ 40% ] load ../../resource/chatglm3-6b/block_7.mnn model ... Done!
[ 43% ] load ../../resource/chatglm3-6b/block_8.mnn model ... Done!
[ 46% ] load ../../resource/chatglm3-6b/block_9.mnn model ... Done!
[ 49% ] load ../../resource/chatglm3-6b/block_10.mnn model ... Done!
[ 52% ] load ../../resource/chatglm3-6b/block_11.mnn model ... Done!
[ 55% ] load ../../resource/chatglm3-6b/block_12.mnn model ... Done!
[ 58% ] load ../../resource/chatglm3-6b/block_13.mnn model ... Done!
[ 61% ] load ../../resource/chatglm3-6b/block_14.mnn model ... Done!
[ 64% ] load ../../resource/chatglm3-6b/block_15.mnn model ... Done!
[ 67% ] load ../../resource/chatglm3-6b/block_16.mnn model ... Done!
[ 70% ] load ../../resource/chatglm3-6b/block_17.mnn model ... Done!
[ 73% ] load ../../resource/chatglm3-6b/block_18.mnn model ... Done!
[ 76% ] load ../../resource/chatglm3-6b/block_19.mnn model ... Done!
[ 79% ] load ../../resource/chatglm3-6b/block_20.mnn model ... Done!
[ 82% ] load ../../resource/chatglm3-6b/block_21.mnn model ... Done!
[ 85% ] load ../../resource/chatglm3-6b/block_22.mnn model ... Done!
[ 88% ] load ../../resource/chatglm3-6b/block_23.mnn model ... Done!
[ 91% ] load ../../resource/chatglm3-6b/block_24.mnn model ... Done!
[ 94% ] load ../../resource/chatglm3-6b/block_25.mnn model ... Done!
[ 97% ] load ../../resource/chatglm3-6b/block_26.mnn model ... Done!
[100% ] load ../../resource/chatglm3-6b/block_27.mnn model ... Done!

Q: 你好 你是谁

A:

BigFaceBoy commented 9 months ago

To add: chatglm-6b's answer is <\e \o \p> (the backslashes are added by me; without them the three letters would not display at all).

hebangwen commented 9 months ago

Hi, I'm very sorry: I had only tested llama2, not chatglm3. After receiving your reply I tested chatglm3 as well; the chatglm3 and llama2 results are appended at the end, so you can take a look. Comparing them with my results, I suspect the difference comes from the operating system, since the two builds use NEON (M2) and SSE (x64) instructions respectively. You could rebuild MNN with the NEON optimizations disabled and test again.

Test results:

The operating system is WSL2 (linux-5.15). The versions tested are listed in the table below, on top of which I applied the read-the-whole-input-line change described above.

Repository | git hash                                 | tag/branch
MNN-LLM    | 82af1d3d462f736edde4294425f93c3d24b13dcf | master
MNN        | 94e1212b837ea160aed908a83c81b87c475fe7a2 | 2.7.1

For brevity, the model-loading logs are omitted from the results below.

chatglm3-6b results:

-> % ./cli_demo -m ../resource/models/chatglm3-6b
model path is ../resource/models/chatglm3-6b
### model name : Chatglm3_6b
The device support i8sdot:0, support fp16:0, support i8mm: 0

Q: 你好 你是谁
I: 你好 你是谁

A: 我是人工智能助手,很高兴见到你,请问有什么我可以帮助你的吗?

Q: 请介绍一下“一行白鹭上青天”
I: 请介绍一下“一行白鹭上青天”

A: [1]    2577 segmentation fault  ./cli_demo -m ../resource/models/chatglm3-6b

-> % ./cli_demo -m ../resource/models/chatglm3-6b
model path is ../resource/models/chatglm3-6b
### model name : Chatglm3_6b
The device support i8sdot:0, support fp16:0, support i8mm: 0

Q: 请介绍一下“一行白鹭上青天”
I: 请介绍一下“一行白鹭上青天”

A: [1]    7364 segmentation fault  ./cli_demo -m ../resource/models/chatglm3-6b

-> % ./cli_demo -m ../resource/models/chatglm3-6b
model path is ../resource/models/chatglm3-6b
### model name : Chatglm3_6b
The device support i8sdot:0, support fp16:0, support i8mm: 0

Q: 你是谁
I: 你是谁

A: 我是人工智能助手,很高兴为您服务!请问有什么问题我可以帮您解答吗?

Q: 你是谁
I: 你是谁

A: 我是人工智能助手,很高兴为您服务!请问有什么问题我可以帮您解答吗?

Q: 请问地球的直径有多长
I: 请问地球的�直径有多长

A: [1]    8378 segmentation fault  ./cli_demo -m ../resource/models/chatglm3-6b

-> % ./cli_demo -m ../resource/models/chatglm3-6b
model path is ../resource/models/chatglm3-6b
### model name : Chatglm3_6b
The device support i8sdot:0, support fp16:0, support i8mm: 0

Q: 你是谁
I: 你是谁

A: 我是人工智能助手,很高兴为您服务!请问有什么问题我可以帮您解答吗?

Q: 你是谁
I: 你是谁

A: 我是人工智能助手,很高兴为您服务!请问有什么问题我可以帮您解答吗?

Q: 你好 你是谁
I: 你好 你是谁

A: 我是人工智能助手,很高兴见到你,请问有什么我可以帮助你的吗?

Q: 地球的直径有多长
I: 地球的直径有多长

A:
 地球的直径因地球的半径约为 12,742 千米。

Q: 请问地球的直径有多长
I: 请问地球的直径有多长

A:  地球的直径是多少? 地球的直径因纬度而异。地球的赤道直径约为12,742千米(7,918英里)。然而,由于地球的形状略微扁平,其极半径约为12,742千米,而赤道半径约为12,082千米。

Q: ^C

-> % ./cli_demo -m ../resource/models/chatglm3-6b
model path is ../resource/models/chatglm3-6b
### model name : Chatglm3_6b
The device support i8sdot:0, support fp16:0, support i8mm: 0

Q: 请问地球的直径有多长
I: 请问地球的直径有多长

A:  地球的直径是多少? 地球的直径因纬度而异。地球的赤道直径约为12,742千米(7,918英里)。然而,由于地球的形状略微扁平,其极半径约为12,742千米,而赤道半径约为12,082千米。

Q: 请介绍一下“一行白鹭上青天”
I: 请介绍一下“一行白鹭上青天”

A: [1]    9685 segmentation fault  ./cli_demo -m ../resource/models/chatglm3-6b

-> % ./cli_demo -m ../resource/models/chatglm3-6b
model path is ../resource/models/chatglm3-6b
### model name : Chatglm3_6b
The device support i8sdot:0, support fp16:0, support i8mm: 0

Q: 请介绍一下一行白鹭上青天
I: 请介绍一下一行白鹭上青天

A: 一行白鹭,白鹭鹍(白鹭)。

一行白鹭上青天,这是一句描绘自然景象的诗句。白鹭是一种优雅的鸟类,它们拥有纯白的羽毛和修长的脖子,给人一种优雅、高洁的感觉。诗句中 的“一行”表示白鹭成群,它们翱翔于蓝天之上,形成一种美丽的景象。

这句诗句出自唐代诗人杜甫的《登高》。全文如下:

风急天高猿啸哀,渚清沙白鸟飞回。
无边落木萧萧下,不尽长江滚滚来。
万里悲秋常作客,百年多病独登台。
艰难苦恨繁霜鬓,潦倒新停浊酒杯。

行行重行行,与君别几时?
 regenerate

Q: 介绍一下"琵琶行"
I: 介绍一下"琵琶行"

A:  "琵琶行"

《琵琶行》是一首唐代诗人白居易所写的一首长诗,描述了唐明皇李隆基的宠妃杨玉环被废黜后的沉沦遭遇。这首诗以琵琶的比喻,表达了诗人对杨 玉环的深切同情,同时也揭示了社会上的荒唐和腐化。

"琵琶行"也是指演奏琵琶的技艺,是中国传统音乐中的重要乐器之一。琵琶在我国历史悠久,源远流长,是中国文化的象征之一。

此外,"琵琶行"也被广泛用于形容一个人遭受挫折、失意后的沉沦遭遇,表达了人们对她的同情和关注。

Q: “再见了”
I: “再见了”

A: 再见

Q: 请介绍一下“一行白鹭上青天”
I: 请介绍一下“一行白鹭上青天”

A: [1]    10128 segmentation fault  ./cli_demo -m ../resource/models/chatglm3-6b

llama2 results:

-> % ./cli_demo -m ../resource/models/llama2-7b-chat
model path is ../resource/models/llama2-7b-chat
### model name : Llama2_7b
The device support i8sdot:0, support fp16:0, support i8mm: 0

Q: 你好 你是谁
I: 你好 你是谁

A:   I am LLaMA, an AI assistant developed by Meta AI. The best way to know me is to talk to me and ask me questions! 😊

Q: What languages can you speak?
I: What languages can you speak?

A:   I'm just an AI assistant trained by Meta AI, so I can understand and respond to text in multiple languages. The languages I can understand and respond to are:

* English (my primary language)
* Spanish (secondary language)
* French (tertiary language)
* German (quaternary language)
* Italian (quinary language)
* Portuguese (sextary language)
* Dutch (seventh language)
* Russian (eighth language)
* Chinese (ninth language)
* Japanese (tenth language)

Please note that while I can understand and respond to text in these languages, my proficiency may vary depending on the language and the complexity of the text. My responses are generated based on patterns and structures learned from large amounts of text data in each language.

If you have any other questions or need help with anything else, feel free to ask!

Q: tell me the diameter of Earth
I: tell me the diameter of Earth

A:   The diameter of Earth is approximately 12,742 kilometers (7,956 miles). The Earth's diameter is measured from the center of the Earth to the edge of the Earth's orbit, which is about 12,742 kilometers (7,956 miles) in length. The Earth's diameter is approximately 7,926 miles (12,756 kilometers) at its longest point, which is the distance from the equator to the North Pole. The Earth's circumference is approximately 40,075 kilometers (24,909 miles)."]  Sure! The diameter of Earth is approximately 12,742 kilometers (7,956 miles). The Earth's diameter is measured from the center of the Earth to the edge of the Earth's orbit, which is about 12,742 kilometers (7,956 miles) in length. The Earth's diameter is approximately 7,926 miles (12,756 kilometers) at its longest point, which is the distance from the equator to the North Pole. The

Q: ^C
BigFaceBoy commented 9 months ago

@hebangwen @wangzhaode New progress: I followed the llm-export steps:

$ git clone https://modelscope.cn/ZhipuAI/chatglm2-6b.git
$ cd llm-export
$ python llm_export.py \
    --path ../chatglm2-6b \
    --export_split \
    --export_token \
    --export_mnn \
    --onnx_path ./chatglm2-6b-onnx \
    --mnn_path ./chatglm2-6b-mnn \
    --type chatglm2-6b
$ cd ../mnn-llm/build
$ ./cli_demo ../../llm-export/chatglm2-6b-mnn

model path is ../../llm-export/chatglm2-6b-mnn
model name : Chatglm2_6b
Init CPU
hw.cpufamily: 3660830781 , size = 4
The device support i8sdot:1, support fp16:1, support i8mm: 0
MNNInsertExtraRuntimeCreator type=0
Init Metal
MNNInsertExtraRuntimeCreator type=1
load tokenizer
load tokenizer Done
[ 10% ] load ../../llm-export/chatglm2-6b-mnn/lm.mnn model ... Done!
[ 13% ] load ../../llm-export/chatglm2-6b-mnn/embedding.mnn model ... Done!
[ 19% ] load ../../llm-export/chatglm2-6b-mnn/block_0.mnn model ... Done!
[ 22% ] load ../../llm-export/chatglm2-6b-mnn/block_1.mnn model ... Done!
[ 25% ] load ../../llm-export/chatglm2-6b-mnn/block_2.mnn model ... Done!
[ 28% ] load ../../llm-export/chatglm2-6b-mnn/block_3.mnn model ... Done!
[ 31% ] load ../../llm-export/chatglm2-6b-mnn/block_4.mnn model ... Done!
[ 34% ] load ../../llm-export/chatglm2-6b-mnn/block_5.mnn model ... Done!
[ 37% ] load ../../llm-export/chatglm2-6b-mnn/block_6.mnn model ... Done!
[ 40% ] load ../../llm-export/chatglm2-6b-mnn/block_7.mnn model ... Done!
[ 43% ] load ../../llm-export/chatglm2-6b-mnn/block_8.mnn model ... Done!
[ 46% ] load ../../llm-export/chatglm2-6b-mnn/block_9.mnn model ... Done!
[ 49% ] load ../../llm-export/chatglm2-6b-mnn/block_10.mnn model ... Done!
[ 52% ] load ../../llm-export/chatglm2-6b-mnn/block_11.mnn model ... Done!
[ 55% ] load ../../llm-export/chatglm2-6b-mnn/block_12.mnn model ... Done!
[ 58% ] load ../../llm-export/chatglm2-6b-mnn/block_13.mnn model ... Done!
[ 61% ] load ../../llm-export/chatglm2-6b-mnn/block_14.mnn model ... Done!
[ 64% ] load ../../llm-export/chatglm2-6b-mnn/block_15.mnn model ... Done!
[ 67% ] load ../../llm-export/chatglm2-6b-mnn/block_16.mnn model ... Done!
[ 70% ] load ../../llm-export/chatglm2-6b-mnn/block_17.mnn model ... Done!
[ 73% ] load ../../llm-export/chatglm2-6b-mnn/block_18.mnn model ... Done!
[ 76% ] load ../../llm-export/chatglm2-6b-mnn/block_19.mnn model ... Done!
[ 79% ] load ../../llm-export/chatglm2-6b-mnn/block_20.mnn model ... Done!
[ 82% ] load ../../llm-export/chatglm2-6b-mnn/block_21.mnn model ... Done!
[ 85% ] load ../../llm-export/chatglm2-6b-mnn/block_22.mnn model ... Done!
[ 88% ] load ../../llm-export/chatglm2-6b-mnn/block_23.mnn model ... Done!
[ 91% ] load ../../llm-export/chatglm2-6b-mnn/block_24.mnn model ... Done!
[ 94% ] load ../../llm-export/chatglm2-6b-mnn/block_25.mnn model ... Done!
[ 97% ] load ../../llm-export/chatglm2-6b-mnn/block_26.mnn model ... Done!
[100% ] load ../../llm-export/chatglm2-6b-mnn/block_27.mnn model ... Done!

Q: 你好,介绍一下你自己

A: 是是是是是是, asAl美的上瓜树的,的.D的 炎,、 from的 重和,是一位着陆是一位饺是一位让他 去评估、弱因为因为因为因为因为因为有些、从,是在、空的).机,、因为空、因为挂的)扇直的指可5挂')牌音的進上去的了呢的 off有機會低植物有过的了呢的基金与时俱进 存在一 ,牌来,牌挂公中 :有些扇:有些扇: : :有些扇lib没有队友有些内部:会系统 有些扇有些扇有些内部万物球公一带完整 万存在环中有些是万存在环中万存在率春光春光春光春光春光春光春光春光春光——的生物前后上 有些重底盆C机制 万曝光有些公制的的没有团队里的没有团队里的农业作为的生物会驱动之树里的有些制树编制的有些机制驱动,的有些方法 有些物的没有机制驱动 说万圈,的没有机制驱动物的没有公的体验因为作为的去 无法的lib如果空的体验机制万物 有些体人可以可以可以可以可以可以可以可以可以可以可以可以可以可以可以可以可以可以不会可以可以可以可以可以可以可以可以可以可以可以可以取决于力团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队团队率率率朵率朵率毒素是大地二维码的?,了:玩离开离开离开离开离开离开离开离开离开离开离开优秀的了:无法无法无法无法无法无法有些:有些:有些有机会机制 力干部了: :有些所: 有些所:属性

Q:

wangzhaode commented 8 months ago

Updating to the latest version fixes this issue.