ultralytics / yolov5

YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

How to deploy the model (.pt) or convert to tf model(.pb) #118

Closed: elvinest closed this issue 4 years ago

elvinest commented 4 years ago

How do I deploy a .pt file to a production environment (web or mobile), or convert it to a TensorFlow model file (.pb)?

I trained on my own dataset and generated the last.pt file. I can run detect.py perfectly to predict, but I don't know how to deploy the model (.pt) or convert it to a TensorFlow model file (.pb).

Hope you can help me! Thanks.

github-actions[bot] commented 4 years ago

Hello @elvinest, thank you for your interest in our work! Please visit our Custom Training Tutorial to get started, and see our Jupyter Notebook (Open In Colab), Docker Image, and Google Cloud Quickstart Guide for example environments.

If this is a bug report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you.

If this is a custom model or data training question, please note that Ultralytics does not provide free personal support. As a leader in vision ML and AI, we do offer professional consulting, from simple expert advice up to delivery of fully customized, end-to-end production solutions for our clients.

For more information please visit https://www.ultralytics.com.

github-actions[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

BernardinD commented 4 years ago

Any luck with this?

ShihuaiXu commented 3 years ago

Any luck with this?

sleepyheead commented 3 years ago

Any luck with this?

glenn-jocher commented 3 years ago

@sheerrrr, for exporting YOLOv5 PyTorch models to TensorFlow or TFLite, see https://github.com/ultralytics/yolov5/pull/1127

Dutra-Apex commented 10 months ago

You can export a torch model as a .onnx file using torch.onnx.export. From there you can either deploy the .onnx file directly, or convert it to a .pb file using the TensorFlow ONNX backend: https://github.com/onnx/onnx-tensorflow
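
A minimal sketch of that workflow, assuming a standard YOLOv5 checkpoint (which stores the model under the `"model"` key in half precision), a 640x640 input size, and that the `onnx` and `onnx-tf` packages are installed. Paths and tensor names are placeholders, not part of any official recipe:

```python
import torch
import onnx
from onnx_tf.backend import prepare

# Load the trained YOLOv5 weights (placeholder path for your own last.pt)
model = torch.load("last.pt", map_location="cpu")["model"].float().eval()

# Dummy input matching the training image size (640x640 assumed here)
dummy_input = torch.zeros(1, 3, 640, 640)

# 1) Export the PyTorch model to ONNX
torch.onnx.export(
    model, dummy_input, "last.onnx",
    opset_version=12,
    input_names=["images"], output_names=["output"],
)

# 2) Convert the ONNX graph to a TensorFlow SavedModel (contains saved_model.pb)
onnx_model = onnx.load("last.onnx")
tf_rep = prepare(onnx_model)       # onnx-tf backend
tf_rep.export_graph("last_saved_model")
```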

glenn-jocher commented 10 months ago

@Dutra-Apex that's correct! You can export your .pt model to ONNX format using torch.onnx.export, and then use the ONNX model in various production environments or convert it to TensorFlow's .pb format using the ONNX-TensorFlow converter. For detailed instructions on exporting to ONNX and further conversion, please refer to our documentation. If you encounter any issues or have further questions, feel free to ask here. Happy deploying! 😊🚀
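
For the deployment side, here is a hedged sketch of serving the exported .onnx file with `onnxruntime`; the input name and 640x640 shape are assumptions carried over from the export sketch above, and the raw outputs still need YOLOv5-style confidence filtering and NMS before use:

```python
import numpy as np
import onnxruntime as ort

# Create an inference session for the exported model (placeholder path)
session = ort.InferenceSession("last.onnx", providers=["CPUExecutionProvider"])

# Stand-in for a real letterboxed RGB image: normalized to [0, 1], NCHW layout
img = np.random.rand(1, 3, 640, 640).astype(np.float32)

# Run inference; "images" matches the input name used during export
outputs = session.run(None, {"images": img})
print([o.shape for o in outputs])  # raw predictions, still require filtering and NMS
```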