VITA-Group / CADTransformer

[CVPR 2022]"CADTransformer: Panoptic Symbol Spotting Transformer for CAD Drawings", Zhiwen Fan, Tianlong Chen, Peihao Wang, Zhangyang Wang
MIT License

CUDA out of memory even if max prim is reduced #10

Open · biervat opened this issue 1 year ago

biervat commented 1 year ago

I still get an out-of-memory error at around 43%. I tried reducing args.max_prim, with many values even as low as 1, but it doesn't seem to change anything: it keeps going out of memory at the same point.

565353780 commented 1 year ago

Just edit the function in the Dataset class:

def filter_smallset(self):
    ...
    # "training" -> "train"
    if self.split == "train":
        ...
    else:
        # add self.max_prim >= len(target) for val and test data
        if self.max_prim >= len(target) >= self.filter_num:
            ...
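
For context, a minimal sketch of what the patched filter might look like, assuming the dataset stores its samples as (path, target) pairs in a list; everything here besides split, max_prim, and filter_num is an illustrative guess, not the repo's actual code:

def filter_smallset(self):
    # hypothetical sketch: self.samples as a list of (path, target)
    # pairs is an assumed layout, not the repo's actual attribute
    kept = []
    for path, target in self.samples:
        if self.split == "train":
            # training already enforces both bounds
            if self.max_prim >= len(target) >= self.filter_num:
                kept.append((path, target))
        else:
            # patched branch: val/test now also enforce the upper
            # bound, so oversized drawings never reach the GPU
            if self.max_prim >= len(target) >= self.filter_num:
                kept.append((path, target))
    self.samples = kept

Without the upper bound on the val/test branch, a single oversized drawing is enough to exhaust GPU memory no matter how small args.max_prim is set, which matches the symptom reported above.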
avinash-218 commented 1 year ago

Image size = 100, max_prim = 1200, together with the filter_smallset change above, did not rectify the CUDA error.

idoglanz commented 11 months ago

I think the issue comes from the definition of the CADDataLoader class: def __init__(self, split='train', do_norm=True, cfg=None, max_prim=12000). Here max_prim is read from its own keyword argument rather than from the cfg object, so updating cfg.max_prim from the external args never reaches the loader.

Long story short, either pass it explicitly when constructing the class, train_dataset = CADDataLoader(split='train', do_norm=cfg.do_norm, cfg=cfg, max_prim=cfg.max_prim), or set self.max_prim = cfg.max_prim at line 17 in dataset.py.
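
A minimal sketch of the second option, assuming the constructor otherwise just stores its arguments; the other attribute assignments are illustrative, not the repo's exact code:

class CADDataLoader:
    def __init__(self, split='train', do_norm=True, cfg=None, max_prim=12000):
        self.split = split      # assumed plain storage of the arguments
        self.do_norm = do_norm
        self.cfg = cfg
        # take the limit from cfg when one is supplied, so a value updated
        # via the external args actually reaches the loader; fall back to
        # the keyword argument otherwise
        self.max_prim = cfg.max_prim if cfg is not None else max_prim

Either way, the loader and the rest of the pipeline then agree on a single max_prim value.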