le-wei opened 3 years ago
I use a Tesla P100 16 GB and can run the example program with batch size 4. If the batch size is raised to 8, it also reports an out-of-memory error.
@xjtAlgo Hello, teacher. Thank you very much for your prompt answer. Following your tips, I have successfully trained the model. But when running inference on an RTX 2080 8 GB GPU, I noticed that the GPU memory used by the model grows as the number of detected frames increases, much like a memory leak. When I use it in practice, an out-of-memory error occurs after continuously detecting 30–50 frames. Since the repository has no inference code example, I wrote my own inference code following the way input data is processed during training and the way output is processed during validation, so I am not sure whether the problem is in my code. If you could send me an inference example to learn from, I would be even more grateful. lv521lup@gmail.com.
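For reference, a common cause of GPU memory growing frame by frame in a hand-written PyTorch inference loop is storing model outputs that still reference the autograd graph. A minimal sketch of a leak-free per-frame loop (the `model`/`frames` names are placeholders, not the project's actual API):

```python
import torch

def run_inference(model, frames):
    """Run a model over a sequence of frame tensors without accumulating
    autograd state or GPU-resident results."""
    model.eval()                      # disable dropout / batch-norm updates
    results = []
    with torch.no_grad():             # do not build the autograd graph
        for frame in frames:
            out = model(frame.unsqueeze(0))
            # Move each result to the CPU before keeping a reference to it,
            # so the GPU activations behind it can be freed.
            results.append(out.squeeze(0).cpu())
    return results
```

Without `torch.no_grad()`, every forward pass allocates activations that stay alive as long as the stored outputs do, which matches the "memory grows with the number of frames" symptom described above.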
Sorry, I'm a student like you, not a teacher. It's my fault I didn't change my account picture. I can't solve the problem you mentioned, but I think this answer on data loading might be useful for you: https://www.zhihu.com/question/307282137/answer/1560137140
Thank you very much for your advice; I will read up on data loading.
@xjtAlgo Hello, I can get the grasp rectangle now, but I don't know how to convert its rotation angle \theta into a pose in ROS. Thank you very much for helping a newbie who just started; many thanks. I wonder if I can use getRotationMatrix2D to get the rotation matrix first, then convert the rotation matrix into a quaternion, and pass the quaternion to the relevant ROS function.
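For what it's worth, if \theta is a planar rotation about a single axis (the camera's or gripper's z-axis), the detour through getRotationMatrix2D is not needed: the quaternion can be written in closed form as (0, 0, sin(\theta/2), cos(\theta/2)). A minimal sketch (the function name is mine, and the axis choice is an assumption that depends on the camera/gripper frame convention):

```python
import math

def yaw_to_quaternion(theta):
    """Convert a planar rotation angle theta (radians, about the z-axis)
    into an (x, y, z, w) quaternion, the convention used by
    geometry_msgs/Pose.orientation in ROS."""
    half = theta / 2.0
    return (0.0, 0.0, math.sin(half), math.cos(half))
```

For example, `yaw_to_quaternion(0.0)` gives the identity orientation `(0.0, 0.0, 0.0, 1.0)`. The resulting tuple can be assigned to a `geometry_msgs/Pose` orientation field.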
Sorry, my research is mainly on the vision side, and I haven't studied robotic arms yet.
@xjtAlgo Thank you very much. I tried another method and solved the problem.
Hello, I don't know how much GPU memory this model needs to run. I keep getting out-of-memory errors on my RTX 2060 6 GB laptop. How much GPU memory does the machine need? Thank you so much for replying.
It's not enough. Typically, running VMRN needs at least 11 GB of GPU memory. Please try a 2080 Ti or a Titan X(p) and see if it works.