RaleighZ / statnlp_fundamental_reading
Group for Fundamental NLP Reading and Learning
5 stars, 0 forks
Issues (sorted by newest)
#23 Key Concepts in GNN (by Cartus, opened 5 years ago, 0 comments)
#22 Variants of GNN (by Cartus, opened 5 years ago, 0 comments)
#21 Long-range dependency (LRD) evaluation between CNN, RNN, and Transformer (by Cartus, opened 5 years ago, 0 comments)
#20 RNN Variants (by Cartus, opened 5 years ago, 0 comments)
#19 Structured RNN (by Cartus, opened 5 years ago, 0 comments)
#18 Supplementary material on gradient vanishing (by RaleighZ, opened 5 years ago, 0 comments)
#17 Residual Connection (by RaleighZ, opened 5 years ago, 0 comments)
#16 What is the meaning of the GLU layer? (by xuuuluuu, opened 5 years ago, 0 comments)
#15 CNN ARCH (by RaleighZ, closed 5 years ago, 0 comments)
#14 1×1 convolutional layer (by Cartus, closed 5 years ago, 2 comments)
#13 Can sin serve as a non-linear function in a NN? (by RaleighZ, opened 5 years ago, 0 comments)
#12 Details of pooling in CNNs for NLP (by RaleighZ, opened 5 years ago, 2 comments)
#11 Other resources on CNNs (by Cartus, closed 5 years ago, 0 comments)
#10 Structured Convolution (by Cartus, opened 5 years ago, 2 comments)
#9 Dilated convolution: related papers (by Cartus, opened 5 years ago, 0 comments)
#8 What is the loss of BERT, and is it a multi-task training objective? (by nanguoshun, closed 5 years ago, 2 comments)
#7 Details of negative sampling in skip-gram (by RaleighZ, closed 5 years ago, 1 comment)
#6 Word embeddings: reading suggestions (by RaleighZ, opened 5 years ago, 0 comments)
#5 Parameters tied in the ELMo model (by nanguoshun, closed 5 years ago, 2 comments)
#4 What is the relation between homogeneous Markov chains and n-gram LMs? (by nanguoshun, opened 5 years ago, 1 comment)
#3 What is the definition of entropy rate, and what is its relation to perplexity? (by nanguoshun, opened 5 years ago, 2 comments)
#2 Why does cross entropy equal perplexity? (by nanguoshun, closed 5 years ago, 2 comments)
#1 Language (by Cartus, closed 5 years ago, 1 comment)
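As a quick grounding note on the question raised in issues #2 and #3 (this sketch is not taken from those discussions): cross entropy and perplexity are not equal; perplexity is the exponential of the per-token cross entropy. A minimal sketch in Python, where `perplexity` and `token_log_probs` are names chosen here for illustration:

```python
import math

def perplexity(token_log_probs):
    """Perplexity from per-token log-probabilities (natural log).

    Cross entropy is the average negative log-likelihood per token;
    perplexity is its exponential, so a model that assigns uniform
    probability over k outcomes has perplexity k.
    """
    cross_entropy = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(cross_entropy)

# Uniform distribution over 4 outcomes: perplexity is 4
# (up to floating-point rounding).
print(perplexity([math.log(0.25)] * 4))
```

This also answers issue #2's phrasing: cross entropy and perplexity carry the same information, but on different scales (log-space vs. linear), which is why minimizing one minimizes the other.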