Spnetic-5 opened 1 year ago
Thank you @Spnetic-5, great. Yes, this is certainly in the right direction. I haven't yet reviewed the forward and backward algorithms; let me know when it's ready for review.
Also, independently of #156, we should be able to test this layer implementation on its own, without integrating it with the network. In other words, while #156 will be necessary for full integration, we can work on this implementation as a standalone layer before #156.
@milancurcic Please review the forward and backward pass implementations I've added based on my interpretation of the paper. Also, could you guide me on how we can test this layer?
Thanks! Let's review it briefly on the call today.
We can test this layer independently by passing a small, known input and comparing the result with the corresponding known output. This should be straightforward since the batchnorm operation is relatively simple (just a normalization of the data). The backward pass is the inverse operation, so as I understand it, we can pass the same expected output back through it and recover the same expected input.
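For reference, the forward operation from the paper is just per-feature normalization followed by a learned scale and shift, so with the defaults $\gamma = 1$ and $\beta = 0$ the expected outputs for a small known input can be computed by hand:

$$\mu = \frac{1}{m}\sum_{i=1}^{m} x_i, \qquad \sigma^2 = \frac{1}{m}\sum_{i=1}^{m} (x_i - \mu)^2, \qquad y_i = \gamma\,\frac{x_i - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta$$

where $m$ is the batch size and $\epsilon$ is a small constant for numerical stability.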
@Spnetic-5 I just saw your message on Discourse, no problem; we'll proceed with work on this PR as usual.
See, for example, the program that tests the forward and backward passes of the maxpool2d layer using known inputs and expected outputs:
https://github.com/modern-fortran/neural-fortran/blob/main/test/test_maxpool2d_layer.f90
We'd use the same approach to test a batchnorm layer.
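As a rough sketch of the idea (this computes the normalization directly in standard Fortran rather than through the actual nf API, whose constructor name for this layer may differ; the real test should mirror test_maxpool2d_layer.f90), something like this checks a hand-computed forward result:

```fortran
! Standalone sketch of a known-input / known-output check for batchnorm.
! It does not use the library API; it only illustrates the testing idea.
program test_batchnorm_known_values
  implicit none
  integer, parameter :: n = 4
  real, parameter :: eps = 1e-5
  real :: x(n), y(n), expected(n), mu, var

  ! Small, known input
  x = [1.0, 2.0, 3.0, 4.0]

  ! Forward pass with gamma = 1 and beta = 0: plain normalization
  mu = sum(x) / n
  var = sum((x - mu)**2) / n
  y = (x - mu) / sqrt(var + eps)

  ! Expected output computed by hand for this input
  expected = [-1.3416, -0.4472, 0.4472, 1.3416]

  if (all(abs(y - expected) < 1e-3)) then
    print *, 'batchnorm forward: PASS'
  else
    print *, 'batchnorm forward: FAIL'
  end if

end program test_batchnorm_known_values
```

A backward-pass check could follow the same pattern, comparing against hand-computed values for the same small input.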
Hello, @milancurcic. Sorry for the lack of activity over the past few days; this was my final week of internship in Canada, and I'll be returning to India on Monday.
I added a test module for the batch norm layer; however, it has some errors, and I believe I will need your assistance with it.
No worries at all, thanks for all the work. I'll review it tomorrow.
Addresses #155
@milancurcic, I've included the structure of the batch normalization layer. Could you please review it and confirm whether I'm headed in the right direction?