-
Hello,
I'm training a retrieval model on a large dataset (>800,000 interactions) with many unique items (>300,000), using precomputed embeddings and contextual data as input.
Because of that la…
-
Hi authors,
I noticed that
> **Global Entity Disambiguation with Pretrained Contextualized Embeddings of Words and Entities**
> https://arxiv.org/abs/1909.00426
is also your work, and that one i…
-
```
> library(text)
This is text (version 0.9.16).
Text is new and still rapidly improving.
Newer versions may have improved functions and updated defaults to reflect current understandings of the st…
```
-
Dear Team, is it possible to train your aspect-based sentiment models to detect aspects for different industries and more aspect classes? Currently, the model detects only a single aspect from ce…
-
https://arxiv.org/pdf/1810.04805.pdf
Reviewing the old to learn the new...
-
## I believe we need graphics
One of the main goals of the website is to express high-level ideas clearly so that people can understand the content easily (https://machinetranslate.org/style). I bel…
-
Hello,
I am using Flair to produce contextual embeddings, which are fed into a Bi-LSTM-CRF model for a sequence labeling task. Are the parameters of Flair fixed during training, or are they upd…
-
Hi,
I am reading the paper "Dissecting Contextual Word Embeddings: Architecture and Representation" by the Allen Institute for AI, where the authors test a 4-layer ELMo variant. But I cannot find the w…
-
Hi,
Thank you for your amazing work. I have a question about using ProtBert-BFD embeddings. As far as I understand, the embeddings themselves are not publicly available, right? And the only way to u…
-
Hi @kwang2049 and @nreimers
I want to pretrain a sentence transformer using TSDAE.
From my understanding I can either use a language model checkpoint (like `bert-base-uncased` [as it is do…