tsing90 / pytorch_semantic_human_matting

This is an unofficial implementation of the paper "Semantic human matting":
https://arxiv.org/pdf/1809.01354.pdf

loading data for e2e training #19


xinggangw commented 5 years ago

Hey,

Great work! I found there is a bug when loading data for end-to-end training.

In train.py, the batch is unpacked in the following format:

img, trimap_gt, alpha_gt, bg, fg = sample_batched['image'], sample_batched['trimap'], sample_batched['alpha'], sample_batched['bg'], sample_batched['fg']

However, in dataset.py, you return the data as follows.

return (img_m, trimap_m), (img, trimap, a, bg, fg)

I think we should return the second part. Is that right? Thanks.
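
For reference, here is a minimal sketch of a dataset whose `__getitem__` returns a dict matching the keys that train.py unpacks. This is only an assumption about the intended fix, not the repo's actual dataset.py; the class name `EndToEndMattingDataset` and the `samples` argument are hypothetical placeholders for the real loading/augmentation code.

```python
from torch.utils.data import Dataset

class EndToEndMattingDataset(Dataset):
    """Sketch of a dataset for the end-to-end phase.

    The keys 'image', 'trimap', 'alpha', 'bg', 'fg' are exactly the ones
    train.py reads from sample_batched; the tensors themselves stand in for
    whatever the real dataset.py loads and augments.
    """

    def __init__(self, samples):
        # samples: list of (img, trimap, alpha, bg, fg) tensors (hypothetical)
        self.samples = samples

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, index):
        img, trimap, alpha, bg, fg = self.samples[index]
        return {
            'image': img,      # composite input image
            'trimap': trimap,  # ground-truth trimap (T-net supervision)
            'alpha': alpha,    # ground-truth alpha matte
            'bg': bg,          # background image
            'fg': fg,          # foreground image
        }
```

With a return value shaped like this, the unpacking line in train.py quoted above should work without modification.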

lan2720 commented 5 years ago

the same question...

lan2720 commented 5 years ago

Hi @tsing90, how do you train the model end-to-end? The code does not run successfully.

tobechao commented 5 years ago

@xinggangw In dataset.py, for the end-to-end phase, I tried: return (img_m, trimap_m), (img_m, trimap_m, a_m, bg_m, fg_m). With that change I can train SHM end-to-end, but the T-net and M-net performance is bad.

Tomhouxin commented 5 years ago

@tobechao When you trained end-to-end, did you run into an error like the one in the screenshot below? [screenshot: TIM截图20190927163104]

Also, by the performance problem you mentioned, do you mean that it is very slow and resource-hungry?