-
Hi guys,
First of all: awesome work with FARM and Haystack. I am currently exploring the possibilities in a private project - thanks for actually providing this toolset!
My situation is the fol…
-
Hi, we can use GloVe embeddings when building the vocab, using
something like:
```
MIN_FREQ = 2
TEXT.build_vocab(train_data,
                 min_freq = MIN_FREQ,
                 vectors = "glo…
```
antgr updated 4 years ago
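To illustrate what the `min_freq` argument does during vocab building, here is a toy pure-Python sketch (not torchtext's actual implementation; the `build_vocab` helper and the `specials` tokens below are hypothetical stand-ins): tokens seen fewer than `min_freq` times are dropped from the vocabulary and will later map to `<unk>`.

```python
from collections import Counter

def build_vocab(tokenized_texts, min_freq=2, specials=("<unk>", "<pad>")):
    """Toy sketch of min_freq filtering: tokens appearing fewer than
    min_freq times are excluded and will fall back to <unk>."""
    counts = Counter(tok for text in tokenized_texts for tok in text)
    itos = list(specials) + sorted(t for t, c in counts.items() if c >= min_freq)
    return {tok: i for i, tok in enumerate(itos)}

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"], ["a", "cat"]]
vocab = build_vocab(corpus, min_freq=2)
# "dog" and "a" each occur once, so they are excluded from the vocab
```

With pretrained vectors such as GloVe, the same filtering applies first; only the surviving tokens get vector rows.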
-
Looks like there's an off-by-one bug in the Softmax layer of LanguageModel.
It doesn't manifest unless you try to use the trained softmax layer for predicting next words.
### Cause
The bug seems t…
-
While reading through your fascinating paper, I noticed that you all do a huge amount of work initializing the input features. For example, you noted that "For the field, venue, and institute nodes, w…
-
I'm trying to use the pre-trained NER model, and I would like to know which embeddings are used in this default NER model?
-
Hi,
What is the significance of the reproject_words argument in DocumentRNNEmbeddings, and is there some recommended usage?
What effect will this reprojection have in the case of contextual embe…
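As I understand it, reproject_words inserts one trainable linear map that is applied to every word embedding before the document RNN consumes them. A minimal NumPy sketch of that idea (the dimensions here are toy values, and this is my reading of the behavior, not Flair's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
embedding_dim = 4   # per-word embedding size (toy value)
reproject_dim = 3   # reprojection output size (hypothetical)

# One shared trainable linear layer: W and b would be learned during training.
W = rng.standard_normal((embedding_dim, reproject_dim))
b = np.zeros(reproject_dim)

word_embeddings = rng.standard_normal((5, embedding_dim))  # a 5-word sentence
reprojected = word_embeddings @ W + b  # shape (5, reproject_dim), fed to the RNN
```

The point of such a layer is to let the model adapt frozen pretrained embeddings to the task before the RNN sees them.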
-
Is it possible to use BERT pre-trained embeddings to train Stanza NER?
-
When running the conditional sample and trying to generate a title, I get the following error:
`tensorflow.python.framework.errors_impl.InvalidArgumentError: assertion failed: [] [Condition x
gompa updated 4 years ago
-
Hello,
I tried to train a language model for Arabic using Flair, but it doesn't seem to work as expected. I used the Leipzig Corpora Collection as my training corpus. It contains 1M sentences in Arab…
-
@TalSchuster I'm wondering how you generated the pretrained model.
I have the following questions:
Were you using https://github.com/allenai/bilm-tf with the default arguments?
How long did you tr…