HRNet / HRNet-Object-Detection

Object detection with multi-level representations generated from deep high-resolution representation learning (HRNetV2h). This is an official implementation for our TPAMI paper "Deep High-Resolution Representation Learning for Visual Recognition". https://arxiv.org/abs/1908.07919
Apache License 2.0

Image size! #16

Closed guaiwuguba closed 5 years ago

guaiwuguba commented 5 years ago

I use the Cascade R-CNN + HRNet config, but this network uses too much memory: with my dataset's image size of (2048, 1080) I run out of GPU memory. If I resize the images to (1333, 800), the mAP drops. What should I do? Please help!

wondervictor commented 5 years ago

Your input size (2048, 1080) is too large. You need to decrease the batch size to 1.

guaiwuguba commented 5 years ago

Yes, I have already set imgs_per_gpu=1. I'd like to know the maximum input size this config supports.
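For reference, both knobs mentioned in this thread live in the mmdetection-style Python config used by this repo. A minimal sketch of the relevant fields (key names follow the old mmdetection config API; the exact values and surrounding keys here are assumptions, not the repo's actual file):

```python
# Sketch of the memory-relevant fields in an mmdetection-style config.
# imgs_per_gpu controls the per-GPU batch size; img_scale caps the
# resized input as (long side, short side).
data = dict(
    imgs_per_gpu=1,       # one image per GPU: the main lever for memory
    workers_per_gpu=2,
    train=dict(
        img_scale=(1333, 800),  # lower this further if OOM persists
        # dataset type, annotation paths, pipelines, etc. go here
    ),
)
```

With imgs_per_gpu already at 1, the remaining options are reducing img_scale or using a GPU with more memory.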

wondervictor commented 5 years ago

We haven't determined the maximum input size a single GPU can hold; different GPUs have different amounts of memory.
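As a rough rule of thumb (a simplifying assumption, not a figure from the authors), activation memory for a fully convolutional backbone like HRNet scales roughly linearly with the input pixel count, so you can estimate what a resize buys you before trying it on hardware:

```python
def relative_activation_memory(size_a, size_b):
    """Rough ratio of activation memory between two input sizes,
    assuming memory scales linearly with width * height.
    This ignores padding, stride rounding, and the fixed cost of
    model weights, so treat it as an estimate only."""
    wa, ha = size_a
    wb, hb = size_b
    return (wa * ha) / (wb * hb)

ratio = relative_activation_memory((1333, 800), (2048, 1080))
print(f"(1333, 800) needs roughly {ratio:.0%} of the memory of (2048, 1080)")
```

By this estimate, resizing from (2048, 1080) down to (1333, 800) roughly halves activation memory, which explains why the smaller scale fits but costs mAP.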

guaiwuguba commented 5 years ago

Thank you very much! I'll give it a try later!
