ziyin-dl / word-embedding-dimensionality-selection
On the Dimensionality of Word Embedding
https://nips.cc/Conferences/2018/Schedule?showEvent=12567
MIT License · 329 stars · 44 forks
Issues
#22 · How can the PIP loss function be transferred to the graph embedding? I want to use this loss function to find the optimal dimension in the graph embedding. · opened by huabao97 3 years ago · 0 comments
#21 · word2vec CBOW · opened by ronhab 4 years ago · 1 comment
#20 · fixed Win shell dependency · closed by ziyin-dl 4 years ago · 0 comments
#19 · Error when executing in cmd · opened by bohyunshin 4 years ago · 4 comments
#18 · core dumped error · opened by shenxuhui 5 years ago · 6 comments
#17 · Noise estimation in signal_matrix.py · opened by JamesTuna 5 years ago · 13 comments
#16 · Spectral Estimation · closed by asadullah797 5 years ago · 3 comments
#15 · Will this work on fasttext embeddings? · opened by Priyansh2 5 years ago · 3 comments
#14 · code problem · closed by SunYanCN 5 years ago · 1 comment
#13 · remove line break in the tokenizer · closed by ziyin-dl 5 years ago · 0 comments
#12 · about corpus format · opened by OYE93 5 years ago · 4 comments
#11 · does it use too much memory? · opened by ahhygx 5 years ago · 12 comments
#10 · Create LICENSE · closed by ziyin-dl 5 years ago · 0 comments
#9 · refactored config files · closed by ziyin-dl 5 years ago · 0 comments
#8 · use min_count to control word frequency threshold · closed by ziyin-dl 5 years ago · 0 comments
#7 · Tokenizer code ignores vocabulary size parameter from the config file · closed by shudima 5 years ago · 1 comment
#6 · Use vocabulary size config parameter · closed by shudima 5 years ago · 0 comments
#5 · python2&3 compatibility · closed by ziyin-dl 5 years ago · 0 comments
#4 · Add Arxiv link in readme to paper · closed by impredicative 5 years ago · 1 comment
#3 · Stop using Python 2 · closed by impredicative 5 years ago · 1 comment
#2 · removed unused imports and added requirements.txt · closed by ziyin-dl 6 years ago · 0 comments
#1 · added implementation for ppmi lsa · closed by ziyin-dl 6 years ago · 0 comments
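Several of the issues above (#22, #17, #16) turn on the PIP (Pairwise Inner Product) loss that the repository's paper, "On the Dimensionality of Word Embedding", uses to select a dimension. A minimal sketch of the definition from the paper, PIP(E) = E Eᵀ with the loss taken as the Frobenius norm of the difference of two PIP matrices; the function names and the toy data here are illustrative, not the repository's API:

```python
import numpy as np

def pip_matrix(E):
    """Pairwise Inner Product matrix: PIP(E) = E @ E.T."""
    return E @ E.T

def pip_loss(E1, E2):
    """PIP loss: Frobenius norm of the difference of the two PIP matrices.
    Invariant to unitary rotations of either embedding, which is why it is
    a natural dissimilarity between embeddings of different dimensions."""
    return np.linalg.norm(pip_matrix(E1) - pip_matrix(E2), ord="fro")

# Toy example: a rank-2 embedding vs. its rank-1 (dimension-1) truncation.
rng = np.random.default_rng(0)
U, s, _ = np.linalg.svd(rng.standard_normal((6, 2)), full_matrices=False)
E_full = U * s              # full embedding, dimension 2
E_trunc = U[:, :1] * s[:1]  # truncated to dimension 1
print(pip_loss(E_full, E_trunc))  # contribution of the dropped singular value
```

Dimension selection in the paper then amounts to picking the dimension whose (noise-corrected) truncated embedding minimizes the PIP loss against an oracle embedding; the sketch above only shows the loss itself.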