RINDNet: Edge Detection for Discontinuity in Reflectance, Illumination, Normal and Depth
Mengyang Pu, Yaping Huang, Qingji Guan and Haibin Ling
ICCV 2021 (oral)
Please refer to the supplementary material (code: p86d, ~60M) for more results.
BSDS-RIND is the first public benchmark dedicated to studying all four edge types simultaneously, namely Reflectance Edges (RE), Illumination Edges (IE), Normal Edges (NE) and Depth Edges (DE). It was created by carefully labeling images from BSDS500. The datasets can be downloaded from:
As a fundamental building block in computer vision, edges can be categorised into four types according to the discontinuity in surface-Reflectance, Illumination, surface-Normal or Depth. While great progress has been made in detecting generic or individual types of edges, it remains under-explored to comprehensively study all four edge types together. In this paper, we propose a novel neural network solution, RINDNet, to jointly detect all four types of edges. Taking into consideration the distinct attributes of each edge type and the relationships between them, RINDNet learns effective representations for each type and works in three stages. In stage I, RINDNet uses a common backbone to extract features shared by all edges. In stage II, it branches to prepare discriminative features for each edge type through the corresponding decoder. In stage III, an independent decision head for each type aggregates the features from the previous stages to predict the initial results. Additionally, an attention module learns attention maps for all types to capture the underlying relations between them, and these maps are combined with the initial results to generate the final edge detection results. For training and evaluation, we construct the first public benchmark, BSDS-RIND, with all four types of edges carefully annotated. In our experiments, RINDNet yields promising results in comparison with state-of-the-art methods.
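The three-stage design described above can be illustrated at the level of data flow. The following NumPy toy is a minimal sketch, not the released model: every operation is a simplified stand-in for the real backbone, decoders, decision heads and attention module, and all function names are our own.

```python
import numpy as np

EDGE_TYPES = ["reflectance", "illumination", "normal", "depth"]

def rindnet_sketch(image):
    """Shape-level sketch of RINDNet's three stages (toy stand-in ops)."""
    # Stage I: a shared backbone extracts features common to all edge types.
    shared = image.mean(axis=2)  # stand-in for backbone feature extraction
    # Stage II: one decoder per edge type prepares type-specific features.
    decoded = {t: shared * (i + 1) for i, t in enumerate(EDGE_TYPES)}
    # Stage III: an independent decision head predicts initial edge maps.
    initial = {t: 1.0 / (1.0 + np.exp(-f)) for t, f in decoded.items()}
    # An attention map per type is combined with the initial predictions
    # to produce the final edge maps.
    attention = {t: np.full_like(shared, 0.5) for t in EDGE_TYPES}
    return {t: initial[t] * attention[t] for t in EDGE_TYPES}

maps = rindnet_sketch(np.random.rand(8, 8, 3))
```

The point of the sketch is only the branching structure: one shared feature stage feeding four type-specific paths, whose outputs are modulated by per-type attention before the final prediction.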
Clone this repository to a local folder:
```shell
git clone https://github.com/MengyangPu/RINDNet.git
```
Download the augmented data to the local folder /data.

Run training:
```shell
python train_rindnet.py
```
or
```shell
python train_rindnet_edge.py
```
More training scripts (train*modelname*.py and train*modelname*_edge.py) are available in /train_tools.
Note: the ImageNet-pretrained VGG16 PyTorch model for BDCN can be downloaded from [vgg16.pth](https://pan.baidu.com/s/10Tgjs7FiAYWjVyVgvEM0mA) (code: ab4g). The ImageNet-pretrained VGG16 PyTorch model for HED can be downloaded from 5stage-vgg.py36pickle (code: 9po1).
Download pre-trained models for the four edge types:

Method | Model | Pre-trained Model (Google Drive) |
---|---|---|
HED | modeling/hed | run/hed, code:ef18 |
CED | code | [download]() |
RCF | modeling/rcf | run/rcf, code:ef18 |
BDCN | modeling/bdcn | run/bdcn, code:ef18 |
DexiNed | modeling/dexined | run/dexined, code:ef18 |
CASENet | modeling/casenet | run/casenet, code:ef18 |
DFF | modeling/dff | run/dff, code:ef18 |
*DeepLabv3+ | modeling/deeplab | run/deeplab, code:ef18 |
*DOOBNet | modeling/doobnet | run/doobnet, code:ef18 |
*OFNet | modeling/ofnet | run/ofnet, code:ef18 |
DeepLabv3+ | modeling/deeplab2 | run/deeplab2, code:ef18 |
DOOBNet | modeling/doobnet2 | run/doobnet2, code:ef18 |
OFNet | modeling/ofnet2 | run/ofnet2, code:ef18 |
RINDNet | modeling/rindnet | run/rindnet, code:ef18 |
Download pre-trained models for generic edges:

Method | Model | Pre-trained Model |
---|---|---|
HED | modeling/hed_edge | run_edge/hed, code:jhsr |
CED | code | [download]() |
RCF | modeling/rcf_edge | run_edge/rcf, code:jhsr |
BDCN | modeling/bdcn_edge | run_edge/bdcn, code:jhsr |
DexiNed | modeling/dexined_edge | run_edge/dexined, code:jhsr |
CASENet | modeling/casenet_edge | run_edge/casenet, code:jhsr |
DFF | modeling/dff_edge | run_edge/dff, code:jhsr |
DeepLabv3+ | modeling/deeplab_edge | run_edge/deeplab, code:jhsr |
DOOBNet | modeling/doobnet_edge | run_edge/doobnet, code:jhsr |
OFNet | modeling/ofnet_edge | run_edge/ofnet, code:jhsr |
RINDNet | modeling/rindnet_edge | run_edge/rindnet, code:jhsr |
Run evaluation:
```shell
python evaluate.py
```
or
```shell
python evaluate_edge.py
```
The .mat files of the testing set can be downloaded here.

Then compute the metrics in MATLAB:
```
cd eval
run eval.m
```
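eval.m reports the ODS, OIS and AP metrics. As a rough illustration of how ODS and OIS differ (ODS picks one best threshold for the whole dataset, OIS picks the best threshold per image), here is a hedged NumPy sketch; it assumes per-image precision/recall values at each threshold are already computed, and the function names are our own, not part of the released toolbox.

```python
import numpy as np

def f_measure(p, r, eps=1e-12):
    # Harmonic mean of precision and recall.
    return 2 * p * r / (p + r + eps)

def ods_ois(precisions, recalls):
    """precisions, recalls: arrays of shape (num_images, num_thresholds).
    ODS: one best threshold shared across the dataset.
    OIS: the best threshold selected independently for each image."""
    f = f_measure(np.asarray(precisions), np.asarray(recalls))
    ods = f.mean(axis=0).max()   # average F over images, then best threshold
    ois = f.max(axis=1).mean()   # best F per image, then average over images
    return ods, ois

# Two images, two thresholds (when p == r, F equals that value).
ods, ois = ods_ois([[0.9, 0.5], [0.4, 0.8]], [[0.9, 0.5], [0.4, 0.8]])
```

By construction OIS is never below ODS, since choosing a threshold per image can only help.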
Quantitative results on BSDS-RIND (ODS / OIS / AP F-measures for each edge type):

Method | Model | Reflectance ODS | OIS | AP | Illumination ODS | OIS | AP | Normal ODS | OIS | AP | Depth ODS | OIS | AP | Average ODS | OIS | AP |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HED | model | 0.412 | 0.466 | 0.343 | 0.256 | 0.290 | 0.167 | 0.457 | 0.505 | 0.395 | 0.644 | 0.679 | 0.667 | 0.442 | 0.485 | 0.393 |
CED | - | 0.429 | 0.473 | 0.361 | 0.228 | 0.286 | 0.118 | 0.463 | 0.501 | 0.372 | 0.626 | 0.655 | 0.620 | 0.437 | 0.479 | 0.368 |
RCF | model | 0.429 | 0.448 | 0.351 | 0.257 | 0.283 | 0.173 | 0.444 | 0.503 | 0.362 | 0.648 | 0.679 | 0.659 | 0.445 | 0.478 | 0.386 |
BDCN | model | 0.358 | 0.458 | 0.252 | 0.151 | 0.219 | 0.078 | 0.427 | 0.484 | 0.334 | 0.628 | 0.661 | 0.581 | 0.391 | 0.456 | 0.311 |
DexiNed | model | 0.402 | 0.454 | 0.315 | 0.157 | 0.199 | 0.082 | 0.444 | 0.486 | 0.364 | 0.637 | 0.673 | 0.645 | 0.410 | 0.453 | 0.352 |
CASENet | model | 0.384 | 0.439 | 0.275 | 0.230 | 0.273 | 0.119 | 0.434 | 0.477 | 0.327 | 0.621 | 0.651 | 0.574 | 0.417 | 0.460 | 0.324 |
DFF | model | 0.447 | 0.495 | 0.324 | 0.290 | 0.337 | 0.151 | 0.479 | 0.512 | 0.352 | 0.674 | 0.699 | 0.626 | 0.473 | 0.511 | 0.363 |
*DeepLabv3+ | model | 0.297 | 0.338 | 0.165 | 0.103 | 0.150 | 0.049 | 0.366 | 0.398 | 0.232 | 0.535 | 0.579 | 0.449 | 0.325 | 0.366 | 0.224 |
*DOOBNet | model | 0.431 | 0.489 | 0.370 | 0.143 | 0.210 | 0.069 | 0.442 | 0.490 | 0.339 | 0.658 | 0.689 | 0.662 | 0.419 | 0.470 | 0.360 |
*OFNet | model | 0.446 | 0.483 | 0.375 | 0.147 | 0.207 | 0.071 | 0.439 | 0.478 | 0.325 | 0.656 | 0.683 | 0.668 | 0.422 | 0.463 | 0.360 |
DeepLabv3+ | model | 0.444 | 0.487 | 0.356 | 0.241 | 0.291 | 0.148 | 0.456 | 0.495 | 0.368 | 0.644 | 0.671 | 0.617 | 0.446 | 0.486 | 0.372 |
DOOBNet | model | 0.446 | 0.503 | 0.355 | 0.228 | 0.272 | 0.132 | 0.465 | 0.499 | 0.373 | 0.661 | 0.691 | 0.643 | 0.450 | 0.491 | 0.376 |
OFNet | model | 0.437 | 0.483 | 0.351 | 0.247 | 0.277 | 0.150 | 0.468 | 0.498 | 0.382 | 0.661 | 0.687 | 0.637 | 0.453 | 0.486 | 0.380 |
RINDNet | model | 0.478 | 0.521 | 0.414 | 0.280 | 0.337 | 0.168 | 0.489 | 0.522 | 0.440 | 0.697 | 0.724 | 0.705 | 0.486 | 0.526 | 0.432 |
We have released the code and data for plotting the edge PR curves of the above edge detectors here.
If you want to compare your method with RINDNet and other methods, you can download the precomputed results here (code: ewco).
```
@InProceedings{Pu_2021ICCV_RINDNet,
  author    = {Pu, Mengyang and Huang, Yaping and Guan, Qingji and Ling, Haibin},
  title     = {RINDNet: Edge Detection for Discontinuity in Reflectance, Illumination, Normal and Depth},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2021},
  pages     = {6879-6888}
}
```