HMUNACHI / nanodl
A Jax-based library for designing and training transformer models from scratch.
MIT License · 271 stars · 11 forks
Issues (newest first)
#30 added changes (HMUNACHI, closed 5 months ago, 0 comments)
#29 touched up Ijepa (HMUNACHI, closed 5 months ago, 0 comments)
#28 IJEPA implementation (danny-1k, closed 5 months ago, 3 comments)
#27 IJEPA Implementation (danny-1k, closed 5 months ago, 0 comments)
#26 minor bug fix -> Correct Use of dropout probability as Norm epsilon (danny-1k, closed 6 months ago, 0 comments)
#25 Norms have epsilon value set to dropout prob (danny-1k, closed 6 months ago, 0 comments)
#24 minor bug fix -> Correct Use of dropout probability as Norm epsilon (danny-1k, closed 6 months ago, 1 comment)
#23 Dev (HMUNACHI, closed 6 months ago, 0 comments)
#22 Merge pull request #21 from HMUNACHI/dev (HMUNACHI, closed 6 months ago, 0 comments)
#21 Tokenizer (HMUNACHI, closed 6 months ago, 0 comments)
#20 Syncing (HMUNACHI, closed 6 months ago, 0 comments)
#19 Dev (HMUNACHI, closed 6 months ago, 0 comments)
#18 Some patches (HMUNACHI, closed 6 months ago, 0 comments)
#17 Fixed random (HMUNACHI, closed 6 months ago, 0 comments)
#16 patches in random (HMUNACHI, closed 6 months ago, 0 comments)
#15 Jax version (afrendeiro, opened 6 months ago, 1 comment)
#14 Fix instructions to install dependencies in README.md (afrendeiro, closed 6 months ago, 1 comment)
#13 NanoDL 1.2.0.dev1 (HMUNACHI, closed 6 months ago, 0 comments)
#12 Gradient synchronization in data-parallel trainers (cgarciae, opened 6 months ago, 1 comment)
#11 Cite `jax-dataloader` (BirkhoffG, closed 6 months ago, 1 comment)
#10 Setup Read The Docs. (HMUNACHI, closed 6 months ago, 0 comments)
#9 Create custom dropout layer which again abstracts the complicated Dropout in Flax/Jax (HMUNACHI, closed 6 months ago, 1 comment)
#8 Create a custom random module which abstracts Jax's verbose process. (HMUNACHI, closed 6 months ago, 1 comment)
#7 Deleted package detected (ashishbijlani, closed 7 months ago, 1 comment)