-
What is the path for deploying MobileVLM on mobile devices? I'm not familiar with how models are deployed on mobile. Which framework is used for inference? Can MNN or fastllm be used?
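For context, the usual MNN workflow is to export the trained model (e.g. to ONNX), convert it to a `.mnn` file with `MNNConvert`, and then run it on the device through the MNN runtime (C++/JNI on Android; the Python API is convenient for desktop sanity checks). Below is a minimal sketch of that inference flow using MNN's Python API; the model filename and the 1x3x224x224 input shape are placeholders, and a real MobileVLM pipeline (vision encoder plus language model) would involve more than a single session.

```python
# Minimal sketch: run a converted .mnn model with MNN's Python API.
# "model.mnn" and the 1x3x224x224 shape are placeholders for illustration.
import numpy as np
import MNN

interpreter = MNN.Interpreter("model.mnn")           # load the converted model
session = interpreter.createSession()
input_tensor = interpreter.getSessionInput(session)

# Build an NCHW float input matching the assumed shape and feed it in.
data = np.random.rand(1, 3, 224, 224).astype(np.float32)
tmp_input = MNN.Tensor((1, 3, 224, 224), MNN.Halide_Type_Float,
                       data, MNN.Tensor_DimensionType_Caffe)
input_tensor.copyFrom(tmp_input)

interpreter.runSession(session)
output_tensor = interpreter.getSessionOutput(session)

# Copy the result into a host tensor before reading it.
out_shape = output_tensor.getShape()
tmp_output = MNN.Tensor(out_shape, MNN.Halide_Type_Float,
                        np.zeros(out_shape, dtype=np.float32),
                        MNN.Tensor_DimensionType_Caffe)
output_tensor.copyToHostTensor(tmp_output)
print(np.asarray(tmp_output.getData()).reshape(out_shape))
```

On the phone itself, the same load/run steps go through the C++ `MNN::Interpreter` (typically wrapped in JNI on Android) rather than the Python module.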
-
# Platform (if cross-compiling, please also state the target platform):
# RK3588S
# GitHub version:
# master branch
# Build command:
# cmake .. -DMNN_SEP_BUILD=false -DOPEN_CL=on
-- Use Threadpool, forbid openmp
-- >>>>>>>>>>>>>
-- MNN BUILD INFO:
…
-
Platform: [Taobao Mini Program MNN plugin](https://open.taobao.com/ability#/detail?id=68)
After the model loads, setting the input data with `await input.setData` works without errors, but calling `await output.getData("NHWC")` crashes the entire mini program.
Test conclusion: this only happens on Android; iOS does not crash. MNN model used: https://github…
-
Hi everyone!
I've faced the following problem: after quantization I can't run model inference in Python correctly. The log says 'recover int8 weights error'. What can be the issue here and how to dea…
-
# Platform (if cross-compiling, please also state the target platform):
Linux admin-NUC8 5.19.0-41-generic #42~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Tue Apr 18 17:40:00 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
# GitHub version:
v2.5.0
https://github.c…
-
Not Found - GET https://registry.npmjs.org/@nut-tree%2fnut-js - Not found
npm ERR! 404
npm ERR! 404 '@nut-tree/nut-js@^3.1.1' is not in this registry.
npm ERR! 404
npm ERR! 404 Note that you ca…
-
I'm running the project in **Android Studio** on **Windows 11** and get the following error; details below:
`In file included from D:/WorkSpace/AndroidWorkSpace/AI/mnn-llm-qwen-1.8b-apk/android/app/src/main/jni/llm_mnn_jni.cpp:6:
In file included fr…
-
Hello, I quantized a model with torch.fx and then converted the quantized model to ONNX. The conversion fails with the following error:
```bash
[19:20:42] :93: These Op Not Support: ONNX::Conv | ONNX::DequantizeLinear | ONNX::QuantizeLinear
Converted Failed!
Traceback (most recent call la…
```
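For context, `QuantizeLinear` and `DequantizeLinear` are the standard ONNX quantization (QDQ) ops that a quantized export inserts, and the log above says the converter has no mapping for them. A quick way to inspect which ops the exported graph actually contains (the filename below is a placeholder):

```python
# List the op types in the exported ONNX graph to see the QDQ nodes
# the converter is rejecting. "quantized_model.onnx" is a placeholder path.
import onnx

model = onnx.load("quantized_model.onnx")
op_types = sorted({node.op_type for node in model.graph.node})
print(op_types)  # a QDQ export will include 'QuantizeLinear' / 'DequantizeLinear'
```

Whether a given MNN release can consume a QDQ-style ONNX graph depends on its converter support, so this list is worth comparing against the ops the converter reports as unsupported.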
-
Ubuntu 20.04
MNN version: 2.4.1, commit: eea434c02
The ONNX model was exported from PyTorch without dynamic input.
Conversion to MNN succeeds:
`./MNNConvert -f ONNX --modelFile ../../data/mnn_rd/model.onnx --MNNModel ../../data/mnn_rd/model.m…
-
The Android build fails with the following errors:
ld: error: ChatGLM6B/work/ChatGLM-MNN/libs/libMNN.so is incompatible with aarch64linux
ld: error: ChatGLM6B/work/ChatGLM-MNN/libs/libMNN_Express.so is incompatible with aarch64linux
…