Open l99500 opened 5 years ago
You can check the code of DeepLab v3+: https://github.com/rishizek/tensorflow-deeplab-v3-plus/blob/master/utils/preprocessing.py#L115 That function returns the image re-scaled by a random factor between min_scale and max_scale.
Moreover, if you take `end_points['pool3']`, it means you have removed all the layers from after pool3 up to the last layer.
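For the random-scaling idea behind the linked `preprocessing.py`, here is a minimal NumPy sketch (the real code uses TensorFlow image ops; nearest-neighbor resizing and the function name here are my own simplifications, not the repo's API):

```python
import numpy as np

def random_rescale(image, min_scale=0.5, max_scale=2.0, rng=None):
    """Rescale an HxWxC image by a random factor in [min_scale, max_scale].

    A NumPy sketch of the idea in the linked preprocessing code;
    nearest-neighbor resizing is used here only for brevity.
    """
    rng = rng if rng is not None else np.random.default_rng()
    scale = rng.uniform(min_scale, max_scale)
    h, w = image.shape[:2]
    new_h, new_w = int(h * scale), int(w * scale)
    # Nearest-neighbor index maps back into the source image.
    rows = (np.arange(new_h) * h / new_h).astype(int)
    cols = (np.arange(new_w) * w / new_w).astype(int)
    return image[rows][:, cols], scale
```

After this step, training pipelines typically random-crop or pad the rescaled image back to a fixed size so batches still stack.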
Thank you a lot. I will take your reply seriously. If I manage to implement this, I will share it with you.
@kemangjaka I find that in the code the author upsamples by a fixed factor of two, e.g. `conv_low_up = Upsampling(conv_low, 2)`. If the size of the feature maps is not divisible by 2, there would be a shape-mismatch error during the summation. But this is just my guess; I haven't run an experiment, so I am not sure of its validity. I am currently writing my graduate thesis, so I may verify it in a few days.
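The suspected mismatch can be checked with plain arithmetic (a sketch, assuming 'SAME'-padded stride-2 pooling, which rounds sizes up; the helper names are illustrative, not from the project):

```python
import math

def pooled_size(h, stride=2):
    # 'SAME'-padded pooling with stride 2 rounds the size up.
    return math.ceil(h / stride)

def upsampled_size(h, factor=2):
    # A plain x2 upsample (as in Upsampling(x, 2)) simply doubles the size.
    return h * factor

# Even input: 64 -> 32 -> 64, the skip-connection sum matches.
# Odd input:  65 -> 33 -> 66 != 65, the elementwise sum fails.
for h in (64, 65):
    low = pooled_size(h)
    print(h, low, upsampled_size(low))
```

A common fix is to resize the upsampled tensor to the skip connection's actual spatial shape (e.g. with a shape-targeted resize) instead of using a fixed x2 factor.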
Dear GeorgeSeif, thanks first of all for sharing this project. I have found some problems with it. I trained the RefineNet model with ResNet-101 several days ago. Your project uses a fixed input size of 512 during both training and testing, but the MatConvNet implementation of RefineNet uses random scaling. I also notice that the original RefineNet code removes the last several layers of ResNet-101. Is it possible to do the same thing in TensorFlow, and do you know how to implement random scaling of the input images?