I159 / go_deep
Neural network framework in pure Go
MIT License · 1 star · 0 forks
Issues (newest first). All issues below were created by I159.
#29  Rearrange synapses initialization  (open · 6 years ago · 0 comments)
#28  Keep activated outputs/inputs in a meta space with respect to layers  (open · 6 years ago · 0 comments)
#27  Don't keep the output of a layer as a 2D matrix  (open · 6 years ago · 0 comments)
#26  Use the go_vectorize library as a vector-operations generalization  (open · 6 years ago · 0 comments)
#25  Documentation  (closed · 6 years ago · 0 comments)
#24  Documentation  (closed · 6 years ago · 0 comments)
#23  Vector operations decomposition  (closed · 6 years ago · 0 comments)
#22  Check corrections application  (closed · 6 years ago · 0 comments)
#21  Don't trust incoming parameters  (closed · 6 years ago · 0 comments)
#20  Down slope doesn't lead to a useful model  (open · 6 years ago · 1 comment)
#19  Inf activation  (closed · 6 years ago · 0 comments)
#18  Optimize sigmoid  (closed · 6 years ago · 1 comment)
#17  Input-to-hidden layers synapses are too large  (closed · 6 years ago · 0 comments)
#16  Input signal overflow and underflow  (closed · 6 years ago · 0 comments)
#15  Active first  (closed · 6 years ago · 0 comments)
#14  From input to first hidden layer synapses  (closed · 6 years ago · 0 comments)
#13  Unit tests  (open · 6 years ago · 2 comments)
#12  Separated layers  (closed · 6 years ago · 0 comments)
#11  Tendency to greater excitability of some neurons  (closed · 6 years ago · 0 comments)
#10  Epochs  (closed · 6 years ago · 0 comments)
#9   Return average cost for each batch  (closed · 6 years ago · 0 comments)
#8   Batches  (closed · 6 years ago · 0 comments)
#7   Epochs  (closed · 6 years ago · 0 comments)
#6   Ability to learn by batches of data items  (closed · 6 years ago · 0 comments)
#5   Activation function derivative  (closed · 6 years ago · 0 comments)
#4   Separate output layer and hidden layer  (closed · 6 years ago · 0 comments)
#3   Layer abstraction  (closed · 7 years ago · 0 comments)
#2   Cache return values for each layer in fit mode  (closed · 6 years ago · 1 comment)
#1   Overall weights count  (closed · 7 years ago · 0 comments)
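Several of the issues above (#16 Input signal overflow and underflow, #18 Optimize sigmoid, #19 Inf activation, and #5 Activation function derivative) revolve around evaluating the sigmoid activation safely for large-magnitude inputs. The following is only a minimal sketch of a numerically stable sigmoid and its derivative in Go; it does not reproduce go_deep's actual code, and the function names sigmoid and sigmoidPrime are hypothetical.

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid computes 1/(1+exp(-x)) in a numerically stable way.
// For large negative x, exp(-x) overflows float64 to +Inf, so the
// algebraically equivalent form exp(x)/(1+exp(x)) is used instead.
func sigmoid(x float64) float64 {
	if x >= 0 {
		return 1.0 / (1.0 + math.Exp(-x))
	}
	e := math.Exp(x)
	return e / (1.0 + e)
}

// sigmoidPrime is the sigmoid derivative expressed through the
// activation itself: s'(x) = s(x) * (1 - s(x)).
func sigmoidPrime(x float64) float64 {
	s := sigmoid(x)
	return s * (1.0 - s)
}

func main() {
	// Extreme inputs stay finite: the activation saturates at 0 or 1
	// and the derivative goes to 0 instead of producing Inf or NaN.
	for _, x := range []float64{-1000, -1, 0, 1, 1000} {
		fmt.Printf("x=%8.1f  sigmoid=%.6f  derivative=%.6f\n", x, sigmoid(x), sigmoidPrime(x))
	}
}
```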