modern-fortran/neural-fortran: A parallel framework for deep learning
MIT License · 409 stars · 85 forks
| Issue | Title | Author | Status | Comments |
|---|---|---|---|---|
| #192 | Update for cmake use of neural-fortran | mathomp4 | closed 2 months ago | 8 |
| #191 | Compile with LFortran | milancurcic | opened 2 months ago | 0 |
| #190 | Standard conformance fixes | milancurcic | closed 2 months ago | 0 |
| #189 | Polymorphic allocatables in results of pure functions | milancurcic | closed 2 months ago | 0 |
| #188 | Remove Keras HDF5 | milancurcic | closed 2 months ago | 0 |
| #187 | Make HDF5 optional under CMake | milancurcic | closed 2 months ago | 2 |
| #186 | Make HDF5 optional | milancurcic | closed 2 months ago | 1 |
| #185 | Please test with flang | jeffhammond | closed 2 months ago | 3 |
| #184 | Move optimizer from the network level to the layer level | jvdp1 | opened 5 months ago | 0 |
| #183 | Intrinsic `pack` replaced by pointers in `get_params` and `get_gradients` | jvdp1 | closed 5 months ago | 2 |
| #182 | Proposition of API for the method `network % evaluate` | jvdp1 | closed 5 months ago | 7 |
| #181 | Replacement of a matmul + use of merge | jvdp1 | closed 6 months ago | 4 |
| #180 | Remove some unused variables and a broadcast | jvdp1 | closed 5 months ago | 0 |
| #179 | Support of a method `network % evaluate` | jvdp1 | opened 7 months ago | 5 |
| #178 | Addition of Findneural-fortran.cmake | jvdp1 | closed 7 months ago | 1 |
| #177 | Add topics `deep-learning`, `cnn` | Beliavsky | closed 7 months ago | 1 |
| #176 | Introduce a separate `network % compile()` step | milancurcic | opened 7 months ago | 0 |
| #175 | Addition of the Loss derived type and of the MSE loss function | jvdp1 | closed 7 months ago | 5 |
| #174 | More conv2d tests | milancurcic | closed 7 months ago | 0 |
| #173 | Addition of the MSE loss function | jvdp1 | closed 7 months ago | 4 |
| #172 | Add instructions using Conda | certik | closed 7 months ago | 1 |
| #171 | Implement locally connected layer (1-d) | milancurcic | opened 8 months ago | 0 |
| #170 | Implement dropout layer | milancurcic | opened 8 months ago | 0 |
| #169 | Start-to-finish example | OsAmaro | opened 9 months ago | 1 |
| #168 | Supported HDF5 version | aminiussi | closed 2 months ago | 8 |
| #167 | Test failure with ifx | aminiussi | opened 11 months ago | 6 |
| #166 | Building client code with CMake | aminiussi | closed 7 months ago | 6 |
| #165 | Scaling | castelao | opened 1 year ago | 2 |
| #162 | Implementing basic RNN | castelao | opened 1 year ago | 6 |
| #160 | Some tests and examples fail with segmentation fault in serial production build, but not in debug build | asandrock | opened 1 year ago | 1 |
| #158 | Safeguard Box-Muller normal random number generation against u=0.0 | dacarnazzola | closed 1 year ago | 10 |
| #157 | Added Batch Normalization Layer modules | Spnetic-5 | opened 1 year ago | 7 |
| #156 | Refactor `forward` and `backward` methods to allow passing a batch of data instead of one sample at a time | milancurcic | opened 1 year ago | 0 |
| #155 | Implement `batchnorm` layer | milancurcic | opened 1 year ago | 0 |
| #154 | Adagrad Optimizer Implementation | Spnetic-5 | closed 1 year ago | 1 |
| #153 | Adagrad Optimizer Implementation | Spnetic-5 | closed 1 year ago | 0 |
| #152 | Question about the decoupled weight decay in Adam | milancurcic | closed 1 year ago | 2 |
| #151 | Initializers stub | milancurcic | opened 1 year ago | 0 |
| #150 | Added Adam optimizer implementation | Spnetic-5 | closed 1 year ago | 4 |
| #149 | v0.13.0 release | milancurcic | closed 1 year ago | 5 |
| #148 | Added Momentum and Nesterov modifications | Spnetic-5 | closed 1 year ago | 13 |
| #147 | Error while building in latest main | Spnetic-5 | closed 1 year ago | 1 |
| #146 | Could it be used for a subroutine in the FEM software ABAQUS? | ZPLai | opened 1 year ago | 3 |
| #145 | CNN training on MNIST does not converge | milancurcic | opened 1 year ago | 3 |
| #144 | Added RMSProp Optimizer subroutine | Spnetic-5 | closed 1 year ago | 5 |
| #143 | Add CELU activation function | pablomazo | closed 1 year ago | 0 |
| #142 | Connect `flatten`, `conv2d`, and `maxpool2d` layers in backward pass | milancurcic | closed 1 year ago | 2 |
| #141 | Add missing update for `conv2d_layer` | milancurcic | closed 1 year ago | 1 |
| #140 | Example program that shows how to access internal layer parameters | milancurcic | closed 1 year ago | 1 |
| #139 | SGD optimizer stub | milancurcic | closed 1 year ago | 2 |