Now we're talking! This PR refactors the ResNet code that we originally copied and pasted from the TF code base. Only one file has been modified.
The new ResNet code has the following properties:
- Only contains code that's needed in our application
- ALP weights can be imported (see the checkpoint-reading sketch below this list)
- It's easy to read in a sequential manner; the core function looks like this:
  ```python
  x = ResNet.first_conv(x)
  x = ResNet.v2_block(x, 'block1', base_depth=64, num_units=3, stride=2)
  x = ResNet.v2_block(x, 'block2', base_depth=128, num_units=4, stride=2)
  x = ResNet.v2_block(x, 'block3', base_depth=256, num_units=6, stride=2)
  x = ResNet.v2_block(x, 'block4', base_depth=512, num_units=3, stride=1)
  x = ResNet.batch_norm(x)
  return self.global_avg_pooling(x)
  ```
- New layers can be added with ease (not contained in the PR, but tested); see the sketch below
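To illustrate the last point, here is a hypothetical sketch (not part of this PR): a further stage would be a single extra call to the existing helper. The name `block5` and its parameter values are made up for illustration.

```python
# Hypothetical sketch, not part of this PR: a further stage is one extra call to
# the existing helper; 'block5' and its parameters are made up for illustration.
x = ResNet.v2_block(x, 'block4', base_depth=512, num_units=3, stride=1)
x = ResNet.v2_block(x, 'block5', base_depth=1024, num_units=3, stride=1)
x = ResNet.batch_norm(x)
return self.global_avg_pooling(x)
```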
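For the weight import, the sketch below only shows how a pretrained checkpoint can be read with standard TensorFlow APIs; the checkpoint path is a placeholder, and the mapping from ALP variable names onto our refactored model is omitted here, since that part belongs to the refactored code itself.

```python
import tensorflow as tf

# Minimal sketch with a placeholder path; reading the checkpoint is standard
# TensorFlow, the mapping onto our refactored ResNet variables is omitted here.
reader = tf.train.load_checkpoint('/path/to/alp/checkpoint')
for name, shape in reader.get_variable_to_shape_map().items():
    value = reader.get_tensor(name)  # numpy array with the pretrained weights
    # ... assign `value` to the matching variable of the refactored ResNet ...
```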
A few method comments are not complete yet.
Merging this PR resolves #48.