rwth-i6 / returnn_common
Common building blocks for RETURNN configs, such as models, training concepts, etc.
7 stars, 4 forks
Issues (sorted newest first):
#257 [open] Network differs for consecutive calls of Conformer Module. (JackTemaki, 1 year ago, 4 comments)
#256 [closed] add nn.stft (JackTemaki, 1 year ago, 1 comment)
#255 [closed] make Tensor feature_dim dense only (albertz, 1 year ago, no comments)
#254 [closed] remove Tensor._replace_by (albertz, 1 year ago, no comments)
#253 [closed] remove Tensor __copy__ and __deepcopy__ (albertz, 1 year ago, no comments)
#252 [open] Move core `nn` functions to RETURNN (albertz, 1 year ago, 8 comments)
#251 [closed] Apply black, indent by 4 spaces (albertz, 1 year ago, 2 comments)
#250 [open] Allow `__init__` logic to work equally for graph-based and eager-based backends, specifically re-parameterization like weight norm (albertz, 1 year ago, 4 comments)
#249 [closed] test_cond_chunking_conformer (albertz, 1 year ago, no comments)
#248 [open] Datasets miss extern data handling and other things (albertz, 2 years ago, 5 comments)
#247 [open] Behavior version? (albertz, 2 years ago, no comments)
#246 [open] BlstmEncoder allow_pool_last, change default to True? (albertz, 2 years ago, no comments)
#245 [open] Conformer: Missing dropout after the self-attention (albertz, 2 years ago, 2 comments)
#244 [open] Weight dropout, var weight noise make checkpoint incompatible (albertz, 2 years ago, 1 comment)
#243 [closed] CTC decoding does not really work yet (albertz, 2 years ago, no comments)
#242 [closed] Register forward hook (albertz, 2 years ago, 1 comment)
#241 [open] Weight decay API maybe unintuitive (albertz, 2 years ago, 4 comments)
#240 [closed] Variational parameter noise (albertz, 2 years ago, 1 comment)
#239 [open] ctc_loss misses logits and targets spatial dim args (albertz, 2 years ago, no comments)
#238 [closed] RelPosSelfAttention _rel_shift error, learned embedding (albertz, 2 years ago, 6 comments)
#237 [closed] Easy way to define dynamic dim (albertz, 2 years ago, 1 comment)
#236 [open] Dim description/naming better, easier, more intuitive (albertz, 2 years ago, no comments)
#235 [closed] Better naming of different relative positional encoding schemes (albertz, 2 years ago, 2 comments)
#234 [closed] GenericSelfAttention, biases are inconsistent to SelfAttentionLayer (albertz, 2 years ago, 9 comments)
#233 [open] Create good Conformer baselines (albertz, 2 years ago, 13 comments)
#231 [closed] Direct corresponding helpers for RETURNN Datasets (JackTemaki, 1 year ago, 14 comments)
#230 [closed] Apply black? (albertz, 1 year ago, no comments)
#229 [open] nn.slice_nd and nn.slice should be unified (albertz, 2 years ago, 1 comment)
#228 [closed] ConformerConvSubsample configurable (mmz33, 2 years ago, no comments)
#227 [closed] Getting batch dim tags fails when using `_rel_shift` function for relative pos embeddings (mmz33, 2 years ago, 2 comments)
#226 [closed] Error when verifying output shape for relative pos emb (mmz33, 2 years ago, 3 comments)
#225 [open] Automatic check for RETURNN version (albertz, 2 years ago, no comments)
#224 [closed] `TypeError: get_func_from_code_object() got an unexpected keyword argument 'frame'` (mmz33, 2 years ago, 4 comments)
#223 [open] nn.pad should output new dim tag (albertz, 2 years ago, 3 comments)
#222 [closed] Matching Tensors still require explicit `allow_broadcast_all_sources` (JackTemaki, 2 years ago, 2 comments)
#221 [closed] SelfAttention misses Linear after attention, wrong for Conformer, Transformer (albertz, 2 years ago, 15 comments)
#220 [open] Module._version support, parameter compatibility (albertz, 2 years ago, 1 comment)
#219 [open] Conformer frontend should fix dimensions, be more standard (albertz, 2 years ago, 9 comments)
#218 [closed] Conv param init wrong? (albertz, 2 years ago, 1 comment)
#217 [closed] LSTM param init wrong (albertz, 2 years ago, 1 comment)
#216 [closed] Conformer/Transformer has same initial param value in each layer (albertz, 2 years ago, 1 comment)
#215 [closed] remove lazy init logic, Linear, Conv, Norm API changes (albertz, 2 years ago, 2 comments)
#214 [open] nn.choice default length_normalization unexpected (albertz, 2 years ago, no comments)
#213 [closed] python wrapper with preloaded TF (albertz, 2 years ago, 2 comments)
#212 [closed] Lazy init causes unexpected behavior? (albertz, 2 years ago, 7 comments)
#211 [closed] mark_as_loss: extension for LR scheduling? (albertz, 2 years ago, 1 comment)
#210 [closed] Increase runtime performance (albertz, 2 years ago, 6 comments)
#209 [closed] nn.concat does not return new dim (albertz, 2 years ago, 1 comment)
#208 [closed] Change indentation to 4 spaces? (albertz, 1 year ago, 8 comments)
#207 [closed] mark_as_loss with mandatory name arg? (albertz, 2 years ago, 2 comments)