cherubicXN / hawp

Holistically-Attracted Wireframe Parsing [TPAMI'23] & [CVPR'20]
MIT License
291 stars · 51 forks

Negative samples from the static line sampler #6

Closed alwc closed 4 years ago

alwc commented 4 years ago

Hi @cherubicXN

I'm trying to pre-process the dataset from scratch using the train.json file from https://github.com/cherubicXN/afm_cvpr2019 and some modified code from https://github.com/zhou13/lcnn.

I can reproduce edges_positive and junctions correctly, but I can't seem to exactly match the edges_negative numbers found in your provided JSON file.

Here is my code modified from L-CNN's repo:

from itertools import combinations
import json

import cv2
import numpy as np
import skimage.draw
from scipy.ndimage import zoom

# data_dir / afm_im_dir are assumed to point at the afm_cvpr2019 dataset
afm_train_json = data_dir / 'train.json'

with afm_train_json.open('r') as jf:
    dataset = json.load(jf)

data = dataset[1]
image = cv2.imread(str(afm_im_dir / data["filename"]))
lines = np.array(data["lines"], dtype=np.float64).reshape(-1, 2, 2)

def to_int(x):
    return tuple(map(int, x))

im_scale = (image.shape[1], image.shape[0])
lmap = np.zeros(im_scale, dtype=np.float32)

lines[:, :, 0] = np.clip(lines[:, :, 0], 0, im_scale[0] - 1e-4)
lines[:, :, 1] = np.clip(lines[:, :, 1], 0, im_scale[1] - 1e-4)

junc = []
jids = {}

def jid(jun):
    jun = tuple(jun[:2])
    if jun in jids:
        return jids[jun]
    jids[jun] = len(junc)
    junc.append(np.array(jun))
    return len(junc) - 1

lnid = []
lpos = []
for v0, v1 in lines:
    lnid.append((jid(v0), jid(v1)))
    lpos.append([junc[jid(v0)], junc[jid(v1)]])
    rr, cc, value = skimage.draw.line_aa(*to_int(v0), *to_int(v1))
    lmap[rr, cc] = np.maximum(lmap[rr, cc], value)

# Read LCNN, Section "3.5. Line Sampling Module"
lneg = []
divisor = 2
llmap = zoom(lmap, [1 / divisor, 1 / divisor])
lineset = set([frozenset(l) for l in lnid])
for i0, i1 in combinations(range(len(junc)), 2):
    if frozenset([i0, i1]) not in lineset:
        v0, v1 = junc[i0], junc[i1]
        _vint0, _vint1 = to_int(v0[:2] / divisor), to_int(v1[:2] / divisor)
        vint0 = (
            np.clip(_vint0[0], 0, llmap.shape[0] - 1),
            np.clip(_vint0[1], 0, llmap.shape[1] - 1),
        )
        vint1 = (
            np.clip(_vint1[0], 0, llmap.shape[0] - 1),
            np.clip(_vint1[1], 0, llmap.shape[1] - 1),
        )
        rr, cc, value = skimage.draw.line_aa(*vint0, *vint1)
        lneg.append([v0, v1, i0, i1, np.average(np.minimum(value, llmap[rr, cc]))])

assert len(lneg) != 0
lneg.sort(key=lambda l: -l[-1]) 

Lneg = np.array([l[2:4] for l in lneg][:4000], dtype=int)  # np.int is removed in modern NumPy
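
In case it helps, the sampler above boils down to this self-contained sketch (pure NumPy; a linspace-based sampler stands in for skimage.draw.line_aa, and all helper names here are illustrative, not from either repo):

```python
import numpy as np
from itertools import combinations

def segment_score(lmap, p0, p1, n_samples=32):
    # Average the heatmap along the segment p0 -> p1; a crude,
    # dependency-free stand-in for skimage.draw.line_aa scoring.
    ts = np.linspace(0.0, 1.0, n_samples)
    pts = p0[None, :] * (1.0 - ts[:, None]) + p1[None, :] * ts[:, None]
    rr = np.clip(pts[:, 0].astype(int), 0, lmap.shape[0] - 1)
    cc = np.clip(pts[:, 1].astype(int), 0, lmap.shape[1] - 1)
    return float(lmap[rr, cc].mean())

def hard_negatives(junc, pos_pairs, lmap, top_k=4000):
    # Every junction pair that is NOT a ground-truth line is a candidate
    # negative; rank candidates by how "line-like" the heatmap says they
    # are and keep the hardest top_k.
    lineset = {frozenset(p) for p in pos_pairs}
    lneg = []
    for i0, i1 in combinations(range(len(junc)), 2):
        if frozenset((i0, i1)) in lineset:
            continue
        lneg.append((i0, i1, segment_score(lmap, junc[i0], junc[i1])))
    lneg.sort(key=lambda t: -t[-1])  # hardest (highest-scoring) first
    return np.array([t[:2] for t in lneg[:top_k]], dtype=int)
```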
cherubicXN commented 4 years ago

Actually, the negative examples come from L-CNN. I just copied the npy files from L-CNN and integrated them into the JSON file.

cherubicXN commented 4 years ago

Let me check the old file tomorrow.

cherubicXN commented 4 years ago

Hi @alwc, I have checked the edges_negative with the L-CNN's ***_0_label.npz. The length of negative edges and the values are all the same. I did not run the L-CNN's script to generate negative examples. Did you try to compare the JSON file with the L-CNN's preprocessed data?

alwc commented 4 years ago

> Hi @alwc, I have checked the edges_negative with the L-CNN's ***_0_label.npz. The length of negative edges and the values are all the same. I did not run the L-CNN's script to generate negative examples. Did you try to compare the JSON file with the L-CNN's preprocessed data?

Hi @cherubicXN , I checked the edges_negative in L-CNN's ***_0_label.npz and HAWP's JSON file and they matched like what you've said.

An additional note: I tried to use my modified code above to generate the edges_negative label and trained a new model with it. I ended up with sAP10.0 = 66.6 at epoch 27 and 28.

This could be due to (1) randomness, or (2) that I got my negative samples from a (image_w / 2 × image_h / 2) low-resolution bitmap instead of the "64 × 64 low-resolution bitmap".
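
Regarding cause (2): if L-CNN first rescales line endpoints into a fixed 128 × 128 heatmap before the 1/2 zoom, the negative-sampling bitmap is always 64 × 64 regardless of image size. A minimal sketch of that rescaling step (the heatmap_scale=128 default and function name are my assumptions about L-CNN's preprocessing, not code copied from it):

```python
import numpy as np

def rescale_lines(lines, im_w, im_h, heatmap_scale=128):
    # lines: (N, 2, 2) array of endpoints in (x, y) image coordinates.
    # Map them into a fixed heatmap_scale x heatmap_scale grid so the
    # downstream 1/2-zoomed negative-sampling bitmap is always 64 x 64,
    # independent of the original image resolution.
    fx = heatmap_scale / im_w
    fy = heatmap_scale / im_h
    out = lines.astype(np.float64).copy()
    out[:, :, 0] = np.clip(out[:, :, 0] * fx, 0, heatmap_scale - 1e-4)
    out[:, :, 1] = np.clip(out[:, :, 1] * fy, 0, heatmap_scale - 1e-4)
    return out
```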

cherubicXN commented 4 years ago

> > Hi @alwc, I have checked the edges_negative with the L-CNN's ***_0_label.npz. The length of negative edges and the values are all the same. I did not run the L-CNN's script to generate negative examples. Did you try to compare the JSON file with the L-CNN's preprocessed data?
>
> Hi @cherubicXN , I checked the edges_negative in L-CNN's ***_0_label.npz and HAWP's JSON file and they matched like what you've said.
>
> An additional note: I tried to use my modified code above to generate the edges_negative label and trained a new model with it. I ended up with sAP10.0 = 66.6 at epoch 27 and 28.
>
> This could be due to
>
>   1. randomness
>   2. I got my negative samples from the (image_w / 2 × image_h / 2) low-resolution bitmap instead of "64 × 64 low-resolution bitmap"

That's awesome! Let me try it as you said.