justintiger opened this issue 3 years ago
Please replace the graphics card with one that has more memory.

> Please replace the graphics card with one that has more memory.

Thanks so much. What about this error?
VideoTo3dPoseAndBvh/pose_trackers/lighttrack$ python demo_video_mobile.py
Traceback (most recent call last):
File "demo_video_mobile.py", line 12, in <module>
Do you know what's the problem?

$ python videopose.py
the video is 25.051 f/s
Loading YOLO model..
outputs/inputvideo/kunkun_cut_one_second.mp4 --- elapsed time: 1.0411371119844262 s
Traceback (most recent call last):
File "videopose.py", line 332, in <module>
inference_video('outputs/inputvideo/kunkun_cut_one_second.mp4', 'alpha_pose')
File "videopose.py", line 195, in inference_video
main(args)
File "videopose.py", line 74, in main
keypoints = detector_2d(video_name)
File "/mnt/sdb1/pose_estimation/3D_pose_estimation/VideoTo3dPoseAndBvh/joints_detectors/Alphapose/gene_npz.py", line 36, in generate_kpts
final_result, video_name = handle_video(video_file)
File "/mnt/sdb1/pose_estimation/3D_pose_estimation/VideoTo3dPoseAndBvh/joints_detectors/Alphapose/gene_npz.py", line 122, in handle_video
det_loader = DetectionLoader(data_loader, batchSize=args.detbatch).start()
File "/mnt/sdb1/pose_estimation/3D_pose_estimation/VideoTo3dPoseAndBvh/joints_detectors/Alphapose/dataloader.py", line 280, in init
self.det_model.cuda()
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 307, in cuda
return self._apply(lambda t: t.cuda(device))
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 203, in _apply
module._apply(fn)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 203, in _apply
module._apply(fn)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 203, in _apply
module._apply(fn)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 225, in _apply
param_applied = fn(param)
File "/home/ihomelab/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 307, in <lambda>
return self._apply(lambda t: t.cuda(device))
RuntimeError: CUDA error: out of memory
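The traceback shows the OOM is raised while `DetectionLoader` moves the detector to the GPU (`self.det_model.cuda()`), and the batch size there comes from `args.detbatch`, so short of swapping the card you can try a smaller detection batch. Below is a generic, illustrative sketch of that retry-with-smaller-batch pattern; the `run` callable and the halving policy are assumptions for the example, not part of this repo's code.

```python
def run_with_smaller_batches(run, batch_size, min_batch=1):
    """Call run(batch_size), halving the batch on CUDA OOM (illustrative sketch)."""
    while True:
        try:
            return run(batch_size)
        except RuntimeError as exc:
            # PyTorch surfaces GPU exhaustion as a RuntimeError whose message
            # contains "out of memory"; any other error should propagate.
            if "out of memory" not in str(exc) or batch_size <= min_batch:
                raise
            batch_size //= 2  # retry with half the batch
```

If even a batch of 1 does not fit, the model itself exceeds the card's memory, and running on CPU or a larger GPU is the only option, as suggested above.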