wanggrun / Adaptively-Connected-Neural-Networks

A re-implementation of our CVPR 2019 paper "Adaptively Connected Neural Networks"

Handling dynamic image sizes #2

Closed: shivamsaboo17 closed this issue 5 years ago

shivamsaboo17 commented 5 years ago

I wanted to make this work with dynamic image sizes, similar to convolution layers in most deep learning frameworks. But since the weights for each type of operation (alpha, beta and gamma in the paper) work at the pixel/neuron level, the size has to be defined before runtime. Could there be some approximation that handles this well?

wanggrun commented 5 years ago

> I wanted to make this work with dynamic image sizes, similar to convolution layers in most deep learning frameworks. But since the weights for each type of operation (alpha, beta and gamma in the paper) work at the pixel/neuron level, the size has to be defined before runtime. Could there be some approximation that handles this well?

Thanks!

(1) For the pixel-aware ACNet, the shape of alpha, beta and gamma is naturally consistent with dynamic image sizes.

(2) For the dataset-aware ACNet, you may i) down-sample the feature map to a fixed shape, ii) perform the adaptive connection, and iii) up-sample the feature map back to the original shape (see the sketch below).
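A minimal sketch of this down-sample / connect / up-sample suggestion, written as a PyTorch-style module for illustration only. It is not the repository's Tensorpack code, and the class name, parameter shapes, and the pooled approximation of the global branch are all assumptions:

```python
# Illustrative sketch only (not the repo's Tensorpack code): a dataset-aware
# adaptive connection wrapped between a down-sample and an up-sample so that
# inputs of any H x W can be handled. All names here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DatasetAwareACBlock(nn.Module):
    def __init__(self, channels, fixed_size=(14, 14)):
        super().__init__()
        self.fixed_size = fixed_size          # i) the fixed spatial shape
        h, w = fixed_size
        # Dataset-aware combination weights, learned per spatial location of the
        # fixed-size map (shapes are assumptions for illustration).
        self.alpha = nn.Parameter(torch.ones(1, 1, h, w))
        self.beta = nn.Parameter(torch.ones(1, 1, h, w))
        self.gamma = nn.Parameter(torch.ones(1, 1, h, w))
        # The three transformations combined by ACNet: self (1x1), local (3x3),
        # and a global branch (approximated here with global pooling + 1x1 conv).
        self.self_trans = nn.Conv2d(channels, channels, kernel_size=1)
        self.local_trans = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.global_trans = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x):
        orig_size = x.shape[-2:]
        # i) down-sample to the fixed shape
        x_fixed = F.adaptive_avg_pool2d(x, self.fixed_size)
        # ii) perform the adaptive connection on the fixed-size map
        g = self.global_trans(F.adaptive_avg_pool2d(x_fixed, 1))  # global term, broadcast
        y = (self.alpha * self.self_trans(x_fixed)
             + self.beta * self.local_trans(x_fixed)
             + self.gamma * g)
        # iii) up-sample back to the original shape
        return F.interpolate(y, size=orig_size, mode='bilinear', align_corners=False)


# Example: inputs of different spatial sizes pass through the same module.
block = DatasetAwareACBlock(channels=64)
for hw in [(56, 56), (73, 91)]:
    out = block(torch.randn(2, 64, *hw))
    print(out.shape)   # torch.Size([2, 64, 56, 56]), then torch.Size([2, 64, 73, 91])
```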

shivamsaboo17 commented 5 years ago

> > I wanted to make this work with dynamic image sizes, similar to convolution layers in most deep learning frameworks. But since the weights for each type of operation (alpha, beta and gamma in the paper) work at the pixel/neuron level, the size has to be defined before runtime. Could there be some approximation that handles this well?
>
> Thanks!
>
> (1) For the pixel-aware ACNet, the shape of alpha, beta and gamma is naturally consistent with dynamic image sizes.
>
> (2) For the dataset-aware ACNet, you may i) down-sample the feature map to a fixed shape, ii) perform the adaptive connection, and iii) up-sample the feature map back to the original shape.

For the dataset-aware case I tried a similar approach ... but instead of the features I interpolated the weights. Will try it the other way round.
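For reference, a minimal sketch of the weight-interpolation variant described in this comment, assuming the dataset-aware weights are stored as spatial maps trained at a fixed resolution. The shapes, names, and the 14x14 training size are illustrative assumptions, not values from this repository:

```python
# Illustrative sketch: keep the feature map at its native resolution and
# resize the learned per-location weight map (e.g. alpha) to match it at
# run time. Names and shapes are hypothetical.
import torch
import torch.nn.functional as F

def resize_weight(w, size):
    """Bilinearly interpolate a learned weight map (1, 1, H0, W0) to (1, 1, H, W)."""
    return F.interpolate(w, size=size, mode='bilinear', align_corners=False)

alpha = torch.nn.Parameter(torch.ones(1, 1, 14, 14))   # trained at a fixed 14x14
feat = torch.randn(2, 64, 73, 91)                       # arbitrary run-time size

alpha_resized = resize_weight(alpha, feat.shape[-2:])   # (1, 1, 73, 91)
weighted = alpha_resized * feat                         # broadcasts over batch and channels
print(weighted.shape)                                   # torch.Size([2, 64, 73, 91])
```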