-
I am getting this error while running `train_word_embeddings.py`.
![screenshot 2019-02-03 at 8 30 06 pm](https://user-images.githubusercontent.com/25135893/52178288-8ba62080-27f2-11e9-8226-6b77bb84…
-
I'm not familiar with Torch; this doesn't seem to be "SG-Retrofit"?
-
RuntimeError: Error(s) in loading state_dict for BertModel:
size mismatch for bert.embeddings.word_embeddings.weight: copying a param with shape torch.Size([21128, 128]) from checkpoint, the shape i…
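A size mismatch like this usually means the checkpoint was produced with a different vocabulary size than the model being instantiated. A minimal sketch of one workaround, assuming the shapes reported in the error (the config values, head/layer counts, and file path here are assumptions, not values from the original issue):

```python
import torch
from transformers import BertConfig, BertModel

# 21128 is the vocabulary size of the Chinese BERT checkpoints; build the
# model with a config matching the checkpoint instead of the default.
# hidden_size/heads/layers are assumed small values consistent with the
# [21128, 128] embedding shape in the error.
config = BertConfig(vocab_size=21128, hidden_size=128,
                    num_attention_heads=2, num_hidden_layers=2)
model = BertModel(config)

state_dict = torch.load("checkpoint.bin", map_location="cpu")  # illustrative path
# The checkpoint keys carry a "bert." prefix that a bare BertModel lacks.
state_dict = {k.removeprefix("bert."): v for k, v in state_dict.items()}
model.load_state_dict(state_dict, strict=False)
```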
-
I ran into this error when trying to run the code (I only changed the GloVe vectors to 840B.300d but kept the filename as 6B.300d).
Does anybody know how to fix this?
File "scripts/run_model/run_bimpm.py", line 267, in …
-
We currently have a ranking feature:
(skip[12]?0:(*vit)->cosine_rank);
This is based on the top 20 semantic nearest neighbours returned from word2vec word embeddings and a further c…
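As a sketch of how such a feature can be computed (using gensim here; the model path and the 1-based rank with 0 for "not in the top 20" are assumptions, not the project's actual code):

```python
from gensim.models import KeyedVectors

kv = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)  # illustrative path

def cosine_rank(query, candidate, topn=20):
    # Rank of `candidate` among the query's top-20 cosine neighbours,
    # or 0 if it does not appear among them.
    neighbours = [w for w, _ in kv.most_similar(query, topn=topn)]
    return neighbours.index(candidate) + 1 if candidate in neighbours else 0
```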
-
## 0. Paper
@inproceedings{neelakantan-etal-2014-efficient,
title = "Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space",
author = "Neelakantan, Arvind and…
-
Hi, I wonder if it is possible to modify the code to get embeddings at the word level instead of at the sentence level.
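If the model is a standard transformer encoder, one way is to read the per-token hidden states before they are pooled into a sentence vector. A minimal sketch, assuming a Hugging Face model underneath (the model name is illustrative):

```python
import torch
from transformers import AutoModel, AutoTokenizer

name = "bert-base-uncased"  # illustrative model
tok = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

inputs = tok("a sentence to embed", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)

# One vector per (subword) token; repeated tokens collide in this dict,
# so the mapping is purely illustrative.
tokens = tok.convert_ids_to_tokens(inputs["input_ids"][0])
word_vectors = dict(zip(tokens, hidden[0]))
```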
-
When a chain model is named anything else, it works fine, but if I use a chain model as the main representation, it produces an error.
```python
representation_model = {
"Main": […
-
### Your current environment
The output of `python collect_env.py`
```text
Collecting environment information...
WARNING 10-29 04:15:30 _custom_ops.py:19] Failed to import from vllm._C with …
```
-
Issue:
We currently depend on vocabularies, like GloVe embeddings, that are:
1. Weirdly biased (although when you backprop to the embeddings, their initial bias is not very relevant anymore; see the sketch after this list),
2.…
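To make the parenthetical in point 1 concrete, here is a minimal sketch (PyTorch assumed; the pretrained matrix is a stand-in) of loading pretrained vectors into a trainable embedding layer, so training overwrites the initial values:

```python
import torch
import torch.nn as nn

pretrained = torch.randn(10000, 300)  # stand-in for a loaded GloVe matrix

# freeze=False leaves the weights trainable, so gradients flow into the
# embeddings and the pretrained initialization is gradually overwritten.
emb = nn.Embedding.from_pretrained(pretrained, freeze=False)
```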