Closed MyraBaba closed 2 years ago
See the latest README.md; you can download all the ONNX files from Docker Hub now.
Lite.AI.ToolKit now contains 70+ AI models with 500+ frozen pretrained files. Most of the files were converted by myself. You can use them through the lite::cv::Type::Class syntax, such as lite::cv::detection::YoloV5. More details can be found in the Examples for Lite.AI.ToolKit. Note: I cannot upload all the *.onnx files to Google Drive because of its storage limitation (15G).
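As an illustration of the lite::cv::Type::Class syntax, a minimal detection sketch might look like the following. This is a hedged example, not a definitive one: the header path `lite/lite.h`, the `detect` signature, the `lite::types::Boxf` box type, and both file paths are assumptions based on the toolkit's example style and should be checked against the actual Examples for Lite.AI.ToolKit.

```cpp
#include "lite/lite.h"  // assumed main header of Lite.AI.ToolKit

int main() {
  // placeholder paths -- point these at your downloaded hub files
  std::string onnx_path = "../hub/onnx/cv/yolov5s.onnx";
  std::string img_path = "../resources/test.jpg";

  // instantiate a detector via the lite::cv::Type::Class syntax
  auto *yolov5 = new lite::cv::detection::YoloV5(onnx_path);

  // run detection; boxes are returned through the output vector
  std::vector<lite::types::Boxf> detected_boxes;
  cv::Mat img_bgr = cv::imread(img_path);
  yolov5->detect(img_bgr, detected_boxes);

  delete yolov5;
  return 0;
}
```

The same pattern should apply to the other 70+ models: swap `detection::YoloV5` for another `lite::cv::Type::Class` name and reuse the constructor-then-`detect` flow.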
| File | Baidu Drive | Google Drive | Docker Hub | Hub (Docs) |
|---|---|---|---|---|
| ONNX | Baidu Drive (code: 8gin) | Google Drive | ONNX Docker v0.1.22.01.08 (28G) | ONNX Hub |
| MNN | Baidu Drive (code: 9v63) | ❔ | MNN Docker v0.1.22.01.08 (11G) | MNN Hub |
| NCNN | Baidu Drive (code: sc7f) | ❔ | NCNN Docker v0.1.22.01.08 (9G) | NCNN Hub |
| TNN | Baidu Drive (code: 6o6k) | ❔ | TNN Docker v0.1.22.01.08 (11G) | TNN Hub |
```shell
docker pull qyjdefdocker/lite.ai.toolkit-onnx-hub:v0.1.22.01.08 # (28G)
docker pull qyjdefdocker/lite.ai.toolkit-mnn-hub:v0.1.22.01.08  # (11G)
docker pull qyjdefdocker/lite.ai.toolkit-ncnn-hub:v0.1.22.01.08 # (9G)
docker pull qyjdefdocker/lite.ai.toolkit-tnn-hub:v0.1.22.01.08  # (11G)
```
Firstly, create a `share` dir on your local device:

```shell
mkdir share # any name is ok.
```

Secondly, run the container with the local `share` dir mounted, using `docker run -idt xxx`. A minimal example: write a `run_mnn_docker_hub.sh` script like:
```shell
#!/bin/bash
PORT1=6072
PORT2=6084
SERVICE_DIR=/Users/xxx/Desktop/your-path-to/share
CONTAINER_DIR=/home/hub/share
CONTAINER_NAME=mnn_docker_hub_d

docker run -idt -p ${PORT2}:${PORT1} -v ${SERVICE_DIR}:${CONTAINER_DIR} --shm-size=16gb --name ${CONTAINER_NAME} qyjdefdocker/lite.ai.toolkit-mnn-hub:v0.1.22.01.08
```
Finally, copy the model files from `/home/hub/mnn/cv` inside the container to your local `share` dir:
```shell
# start the mnn docker container.
sh ./run_mnn_docker_hub.sh
docker exec -it mnn_docker_hub_d /bin/bash
# copy the models to the share dir.
cd /home/hub
cp -rf mnn/cv share/
```
@DefTruth hi ,
I couldn't find the yolov5face-s-640x640.onnx model. Has it been added? Will other yolov5face models be added as well?
It can be downloaded from Baidu Drive now, but it is not yet included in the docker images. I will update the docker images in a few days.
Or you can download the nano version of the converted YOLO5Face from my demo project:
@DefTruth When can we download the yolov5face models? We can't reach Baidu Drive.
Best
Hello, currently my family and I are celebrating Chinese New Year. When I'm free, I'll upload all the model files to Docker Hub ~ however, you can download the nano version of the yolov5face ONNX/MNN/NCNN/TNN model files from my demo project:
You can download the converted ONNX/MNN/NCNN/TNN files of YOLO5Face from Docker Hub now ~
```shell
docker pull qyjdefdocker/lite.ai.toolkit-onnx-hub:v0.1.22.02.02 # (400M) + YOLO5Face
docker pull qyjdefdocker/lite.ai.toolkit-mnn-hub:v0.1.22.02.02  # (213M) + YOLO5Face
docker pull qyjdefdocker/lite.ai.toolkit-ncnn-hub:v0.1.22.02.02 # (197M) + YOLO5Face
docker pull qyjdefdocker/lite.ai.toolkit-tnn-hub:v0.1.22.02.02  # (217M) + YOLO5Face
```
@DefTruth thanks for the models. I ran a speed test with the "s" models: NCNN is about 4x slower and TNN about 2x slower than MNN and ONNX. I would have assumed NCNN should be faster.
@DefTruth Hi I hope all is good at your end.
I saw the yolov5m-face.pt model at https://github.com/deepcam-cn/yolov5-face and its size is 169MB. Your converted ONNX one, yolov5face-m-640x640.onnx, is 87MB. What are the differences? Are both the same quality, or is something different?
Best
@DefTruth
Hi,
I noticed that the yolov5face models always return the same score: 0.98.... Do you know about this, or have you already fixed it?
Hi,
I couldn't find the model for std::string onnx_path = "../../../hub/onnx/cv/yolov5s.640-640.v.6.0.onnx";
thx