open-mmlab / mmagic

OpenMMLab Multimodal Advanced, Generative, and Intelligent Creation Toolbox. Unlock the magic 🪄: Generative-AI (AIGC), easy-to-use APIs, awesome model zoo, diffusion models, for text-to-image generation, image/video restoration/enhancement, etc.
https://mmagic.readthedocs.io/en/latest/
Apache License 2.0

Update `matting_tutorial.ipynb` #1179

Closed — nick-konovalchuk closed this issue 1 year ago

nick-konovalchuk commented 2 years ago

Prerequisite

🐞 Describe the bug

Hey guys, you need to update at least `demo/matting_tutorial.ipynb`. It throws errors on inference; the logs match https://github.com/open-mmlab/mmediting/issues/962.

Environment

Current Colab versions (see the attached screenshot). Tried mmcv: 1.6.2, 1.5.0, 1.4.8, 1.4.7.

Additional information

No response

zengyh1900 commented 2 years ago

@Z-Fran Please check this issue and also https://github.com/open-mmlab/mmediting/issues/962

Z-Fran commented 2 years ago

Sorry for the late reply. I was able to run `matting_tutorial.ipynb` in the environment torch 1.8.0+cu101, mmcv (mmcv-full) 1.6.2, MMEditing 0.15.2. Can you supply more information?

nick-konovalchuk commented 2 years ago

@Z-Fran What info do you need specifically? I'm running matting_tutorial.ipynb inside Google Colaboratory.

Default colab environment:

Installed with mim:

Installed from source:

Running `pred_alpha = matting_inference(model, merged_path, trimap_path) * 255` got me:

```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-10-52723053bd50> in <module>
      1 # Use the mattor to do inference
----> 2 pred_alpha = matting_inference(model, merged_path, trimap_path) * 255

4 frames

/content/mmediting/mmedit/apis/matting_inference.py in matting_inference(model, img, trimap)
     74     # forward the model
     75     with torch.no_grad():
---> 76         result = model(test_mode=True, **data)
     77 
     78     return result['pred_alpha']

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
    887             result = self._slow_forward(*input, **kwargs)
    888         else:
--> 889             result = self.forward(*input, **kwargs)
    890         for hook in itertools.chain(
    891                 _global_forward_hooks.values(),

/content/mmediting/mmedit/models/mattors/base_mattor.py in forward(self, merged, trimap, meta, alpha, test_mode, **kwargs)
    261         """
    262         if test_mode:
--> 263             return self.forward_test(merged, trimap, meta, **kwargs)
    264 
    265         return self.forward_train(merged, trimap, meta, alpha, **kwargs)

/content/mmediting/mmedit/models/mattors/indexnet.py in forward_test(self, merged, trimap, meta, save_image, save_path, iteration)
    107 
    108         pred_alpha = pred_alpha.cpu().numpy().squeeze()
--> 109         pred_alpha = self.restore_shape(pred_alpha, meta)
    110         eval_result = self.evaluate(pred_alpha, meta)
    111 

/content/mmediting/mmedit/models/mattors/base_mattor.py in restore_shape(self, pred_alpha, meta)
    128             np.ndarray: The reshaped predicted alpha.
    129         """
--> 130         ori_trimap = meta[0]['ori_trimap'].squeeze()
    131         ori_h, ori_w = meta[0]['merged_ori_shape'][:2]
    132 

TypeError: 'DataContainer' object is not subscriptable
```
nick-konovalchuk commented 2 years ago

> Sorry for the late reply. I have succeeded to run matting_tutorial.ipynb in the environment torch 1.8.0+cu101, mmcv(mmcv-full) 1.6.2, MMEditing 0.15.2. Can you supply more information?

I've installed your torch version in Colab with `pip install torch==1.8.0+cu101 torchvision==0.9.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html`. The resulting versions were:

Still got the same `TypeError: 'DataContainer' object is not subscriptable`.

nick-konovalchuk commented 2 years ago

`mmedit.apis.matting_inference` is incompatible with later mmcv versions because of the following line: https://github.com/open-mmlab/mmediting/blob/14f1ffc7740aff1769a37ad093c8a13f64da9448/mmedit/apis/matting_inference.py#L71. `collate` returns a dict with `"meta": DataContainer`. Adding `data['meta'] = data['meta'].data[0]` solves the issue, but it seems like a wrong and unorthodox way of fixing it. I'd rather say `base_mattor.py` needs to be adjusted to use `DataContainer`.
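The workaround can be sketched without mmcv installed; `DataContainer` below is a minimal stand-in for `mmcv.parallel.DataContainer` (the real class carries extra stacking/device flags), and the nesting mimics what `collate([data], samples_per_gpu=1)` produces:

```python
# Minimal stand-in for mmcv.parallel.DataContainer: it wraps a payload and
# exposes it only via the `.data` property, so indexing it raises TypeError.
class DataContainer:
    def __init__(self, data):
        self._data = data

    @property
    def data(self):
        return self._data


# After collate(), 'meta' is wrapped roughly like this (one sample, one GPU).
data = {'meta': DataContainer([[{'merged_ori_shape': (64, 64, 3)}]])}

# base_mattor.restore_shape does meta[0][...], which raises
# "TypeError: 'DataContainer' object is not subscriptable" on the wrapper.
# Unwrapping first restores the plain list of meta dicts the model expects:
data['meta'] = data['meta'].data[0]
assert data['meta'][0]['merged_ori_shape'] == (64, 64, 3)
```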

Z-Fran commented 2 years ago

It looks like a bug in Colab? I can run `pred_alpha = matting_inference(model, merged_path, trimap_path) * 255` successfully on my PC. The return of `data = collate([data], samples_per_gpu=1)` is shown in the attached screenshot.

nick-konovalchuk commented 2 years ago

@Z-Fran I've figured it out: `model = init_model(config, checkpoint, device='cuda')` works, but `model = init_model(config, checkpoint, device='cpu')` does not. Could you please make the code CPU compatible, or make the notebook start in Colab with a GPU by default? I think making it CPU compatible is the right way to go, since you allow specifying the device for the model.
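A plausible explanation for the CPU/CUDA difference: on CUDA, mmcv's `scatter()` unwraps `DataContainer`s while moving data to the device, whereas the CPU path skips `scatter()` and leaves `meta` wrapped. A device-agnostic guard could normalize both paths; this is a sketch with a hypothetical stand-in class (the `unwrap_meta` helper is not part of MMEditing):

```python
class DataContainer:
    """Hypothetical stand-in for mmcv.parallel.DataContainer."""

    def __init__(self, data):
        self.data = data


def unwrap_meta(meta):
    # CUDA path: scatter() already unwrapped the container, meta is a list.
    # CPU path: scatter() never ran, so meta is still a DataContainer and
    # must be unwrapped here before restore_shape() indexes into it.
    if isinstance(meta, DataContainer):
        return meta.data[0]
    return meta


cpu_meta = DataContainer([[{'merged_ori_shape': (4, 4, 3)}]])  # CPU path
cuda_meta = [{'merged_ori_shape': (4, 4, 3)}]                  # CUDA path
assert unwrap_meta(cpu_meta)[0]['merged_ori_shape'] == (4, 4, 3)
assert unwrap_meta(cuda_meta)[0]['merged_ori_shape'] == (4, 4, 3)
```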