Open Lawrence-Leung opened 7 months ago
One thing to add: the model I mentioned is based on yolov8-bdd-v4-one-dropout-individual-n.yaml, located at /ultralytics/models/v8 relative to the root of your repository.
Hi @Lawrence-Leung, did you manage to solve this issue? I hit the same problem when exporting my model to ONNX.
Thanks
Hi @JiayuanWang-JW,

Thank you for your reply, but I'm sorry to say I never solved this issue. My teammates and I decided against adopting your solution in our project at a meeting in March 2024: rather than handling multi-tasking at the level of the inference model, we felt it was better to handle it at the level of operating-system processes (i.e., the Linux kernel), which is less experimental and more robust for our application.

Our recent work is not a big project; we are undergraduate students busy with coursework and, most importantly, striving for admission to postgraduate study at a better university. My teammates and I are participating in a national electronic-engineering competition for university students in China called the Phytium Cup (飞腾杯), a track of the Integrated Circuit Innovation and Entrepreneurship Competition (集成电路创新创业大赛; I'm not sure whether this translation is accurate). Our goal is to build a robust application on hardware provided by Phytium, with a particular focus on AMP (asymmetric multi-processing) performance.

Although we're unable to contribute improvements to your project, we sincerely thank you for your attention to this issue.

Best regards,
Lawrence
Hello,
I've been working with your repository and followed the instructions in the README.md to set up the environment. My Python and PyTorch versions are aligned with what's specified in the README.md.
I encountered an issue while attempting to export a YOLOv8 model to ONNX format. Below is the sequence of commands I used:
However, I encountered the following issue during the export process:
Could you please help me understand what might be causing this issue and how I can resolve it to successfully export my model to ONNX format?
Thank you for your time and assistance.