WongKinYiu / yolov7

Implementation of paper - YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors
GNU General Public License v3.0

YOLOv7 in tensorflow.js #885

Open SkalskiP opened 2 years ago

SkalskiP commented 2 years ago

Hi! Are there any plans to support model export to tensorflow.js format?

ianZzzzzz commented 2 years ago

We are trying to deploy on the web by converting the model into ONNX format.

SkalskiP commented 2 years ago

So it is a different strategy from the one YOLOv5 used?

w-okada commented 2 years ago

I created a demo of YOLOv7 running on ONNX Runtime Web.

https://w-okada.github.io/yolov7-onnx-test/

https://user-images.githubusercontent.com/48346627/204119532-1851a6b9-ba7b-4cad-930f-28cc74818187.mp4

juanjaho commented 1 year ago

Hello,

I have created a Progressive Web App using Next.js based on the YOLOv7-tiny model at:

Feel free to explore :)

hugozanini commented 1 year ago

Hi, @SkalskiP

I have just created a tensorflow.js version of YOLOv7. I hope it helps you :)

Repo | Demo

daniil-777 commented 1 year ago

Hey! I tried to convert the ONNX model to tfjs and got an error. Below I explain it in detail:

First, I exported the ONNX model by running the script you provided:

python export.py --weights yolov7-tiny.pt --grid --end2end --simplify \
    --topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640 --max-wh 640
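
As a quick sanity check before any further conversion, the exported ONNX file can be run once with onnxruntime (a minimal sketch; the file name yolov7-tiny.onnx and the 1x3x640x640 input shape are assumptions based on the export command above):

import numpy as np
import onnxruntime as rt

# Load the exported model on CPU and inspect its declared input
sess = rt.InferenceSession("yolov7-tiny.onnx", providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
print("input:", inp.name, inp.shape)  # should report a 1x3x640x640 NCHW input

# Run once on random data; problems with the export usually surface here already
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
for meta, out in zip(sess.get_outputs(), sess.run(None, {inp.name: dummy})):
    print(meta.name, out.shape)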

Then, I converted the YOLOv7 ONNX model to a TensorFlow SavedModel with this code:

import onnx
from scc4onnx import order_conversion
from onnx_tf.backend import prepare

# Load the exported ONNX model and read its input name
onnx_model = onnx.load('.../file.onnx')
input_name = onnx_model.graph.input[0].name

# Transpose the input from NCHW to NHWC, the layout TensorFlow expects
onnx_model = order_conversion(
    onnx_graph=onnx_model,
    input_op_names_and_order_dims={f"{input_name}": [0, 2, 3, 1]},
    non_verbose=True
)

# Export the model as a TensorFlow SavedModel
tf_rep = prepare(onnx_model)
tf_model_dir = '......'
tf_rep.export_graph(tf_model_dir)
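
Before running the tfjs converter, it may be worth confirming that the SavedModel itself accepts NHWC input (a minimal sketch reusing tf_model_dir from the code above; the 640x640 size is an assumption from the export command):

import numpy as np
import tensorflow as tf

# Load the SavedModel written by onnx-tf and grab its serving signature
loaded = tf.saved_model.load(tf_model_dir)
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)  # expect a (1, 640, 640, 3) NHWC spec after order_conversion

# The signature function expects keyword arguments keyed by the input name
input_key = list(infer.structured_input_signature[1].keys())[0]
dummy = tf.constant(np.random.rand(1, 640, 640, 3).astype(np.float32))
outputs = infer(**{input_key: dummy})
print({name: tensor.shape for name, tensor in outputs.items()})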

Next, I ran the tensorflowjs converter:

tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_format=tfjs_graph_model \
    --signature_name=serving_default \
    --saved_model_tags=serve \
    ".../folder_to_saved_tf" ".../folder_out" \
    --quantize_uint8

Running the converted model then fails with:

Error: Uncaught (in promise): Error: Cannot infer the missing size in [-1,0] when there are 0 elements

I think this means something went wrong during the conversion, but I cannot see where the problem is. Could you please help with the proper conversion to tfjs? @hugozanini, how did you do the conversion?
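
For reference, the shape [-1,0] in the message suggests a Reshape whose target shape contains a zero-sized dimension. A rough way to hunt for it in the converted graph (my own sketch; the model.json path is just a placeholder for the converter output folder above):

import json

# Load the converted tfjs graph model description
with open(".../folder_out/model.json") as f:
    graph = json.load(f)

# List Reshape nodes and their inputs; the node feeding a zero-sized shape
# into one of these is the likely culprit
for node in graph["modelTopology"]["node"]:
    if node.get("op") == "Reshape":
        print(node["name"], "<-", node.get("input", []))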