-
Based on the BERT documentation (https://github.com/google-research/bert#using-bert-to-extract-fixed-feature-vectors-like-elmo), we can extract the contextualized token embeddings of each hidden layer sepa…
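A minimal sketch of that per-layer extraction, assuming the Hugging Face `transformers` port of BERT (an assumption; the linked repo itself ships a TensorFlow `extract_features.py` script instead):
```
import torch
from transformers import AutoModel, AutoTokenizer

# Load BERT and ask it to return every hidden layer, not just the last one.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

inputs = tokenizer("The quick brown fox jumps over the lazy dog.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# hidden_states is a tuple: the embedding-layer output plus one tensor per
# encoder layer, each of shape (batch, seq_len, 768) for bert-base.
hidden_states = outputs.hidden_states
print(len(hidden_states), hidden_states[-1].shape, hidden_states[-2].shape)
```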
-
Hi Matthew,
Thanks a bunch for the documentation on embedding sentences programmatically. It saved me a lot of time! I made a few modifications so that I can use KnowBert to predict the mis…
-
Is there a tutorial or any sample code showing how to use `LanguageModel`?
I am trying to write a basic GRU-based `LanguageModel` and am running into a huge inconvenience.
```
gru = GRU(config['embedding_…
```
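Since the snippet above is cut off, here is a generic sketch of a GRU-based language model in plain PyTorch; the `config` keys and the exact `LanguageModel` interface from the question are not shown, so the names below are assumptions.
```
import torch
import torch.nn as nn

class GRULanguageModel(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.gru = nn.GRU(embedding_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, hidden=None):
        embedded = self.embedding(token_ids)         # (batch, seq, emb)
        output, hidden = self.gru(embedded, hidden)  # (batch, seq, hid)
        return self.proj(output), hidden             # (batch, seq, vocab)

model = GRULanguageModel(vocab_size=10_000, embedding_dim=128, hidden_dim=256)
tokens = torch.randint(0, 10_000, (2, 16))           # dummy batch of token ids
logits, _ = model(tokens)
# Next-token objective: predict token t+1 from the prefix up to t.
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, 10_000), tokens[:, 1:].reshape(-1)
)
print(loss.item())
```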
-
Hi @lucidrains
I'm currently testing the `generate` function of the `TrainingWrapper` class.
When I use DeepSpeed and try to generate a sequence, it gives me the following error:
```
AttributeE…
```
-
From the following code, I'm not sure whether the GloVe embedding will be updated during training or simply stay as it is.
```
from flair.embeddings import WordEmbeddings, FlairEmbeddings, StackedEmbeddings…
```
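One quick way to check this on your own install is to inspect `requires_grad` on the stacked embeddings' parameters; a sketch is below (whether a fine-tuning flag exists at all depends on the Flair version, and in most versions the classic word embeddings are static lookups by default).
```
from flair.data import Sentence
from flair.embeddings import StackedEmbeddings, WordEmbeddings

glove = WordEmbeddings("glove")
stacked = StackedEmbeddings([glove])

# If nothing is registered as a parameter, the GloVe vectors are plain
# lookups and will not be updated by the trainer.
params = list(stacked.named_parameters())
if not params:
    print("No trainable parameters registered; the vectors are static.")
for name, param in params:
    print(name, "requires_grad =", param.requires_grad)

# Embedding a sentence only attaches vectors; it does not change the above.
sentence = Sentence("The grass is green .")
stacked.embed(sentence)
```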
-
When trying your example
```
from danlp.models.ner_taggers import load_ner_tagger_with_flair
from flair.data import Sentence
# Load the NER tagger using the DaNLP wrapper
flair_model = load_…
```
-
Is it possible to train an anchored LM for few-shot parsing?
-
I am trying to use your code on a new dataset that has a different set of entities. Since the entities differ from those in the CoNLL-2003 and OntoNotes datasets, I need to train the FastText e…
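In case it helps, here is a sketch of training new FastText vectors with gensim (an assumption on my side; the repo may instead expect the original `fasttext` CLI or a specific `.vec`/`.bin` format).
```
# Assumes gensim >= 4; `corpus` stands in for your tokenized domain sentences.
from gensim.models import FastText

corpus = [
    ["patient", "was", "given", "aspirin"],
    ["the", "device", "overheated", "after", "boot"],
]  # replace with an iterable over your dataset's tokenized sentences

model = FastText(vector_size=300, window=5, min_count=1, sg=1)
model.build_vocab(corpus_iterable=corpus)
model.train(corpus_iterable=corpus, total_examples=len(corpus), epochs=10)

# Save in word2vec text format, which many NER pipelines can load.
model.wv.save_word2vec_format("fasttext.300d.vec")
```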
-
In your code there are multiple references to ELMo, indicating that you experimented with contextual embeddings.
Can you share any of your results using ELMo embeddings?
I am currently getting f1…
-
For example, the BERT embeddings are 768-dimensional while the GloVe embeddings are 300-dimensional.
I'd like to ask how you aligned the two.
Thanks!
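A common way to align embeddings of different sizes is a learned linear projection into a shared dimensionality; the sketch below shows that idea only, not necessarily how this repository does it.
```
import torch
import torch.nn as nn

SHARED_DIM = 300

project_bert = nn.Linear(768, SHARED_DIM)   # map BERT's 768 dims down to 300
# GloVe is already 300-dimensional, so it can pass through unchanged
# (or get its own projection if both sides should be trainable).

bert_vecs = torch.randn(2, 10, 768)   # dummy (batch, seq, dim) BERT output
glove_vecs = torch.randn(2, 10, 300)  # dummy GloVe lookups for the same tokens

aligned_bert = project_bert(bert_vecs)
combined = torch.cat([aligned_bert, glove_vecs], dim=-1)  # or sum / average
print(combined.shape)  # torch.Size([2, 10, 600])
```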