-
My understanding is that after the batch, all dynamic embeddings are cleared from GPU memory. This makes sense, as they are specific to each sentence and there is no obvious reason to keep them in memor…
-
Regarding the storage of embeddings, my understanding is the following:
- gpu: dynamic embeddings (Flair LM) are deleted after each batch, static embeddings are kept on GPU
- cpu: dynamic embeddings (Fla…
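A minimal sketch of the three modes as I understand them, in plain Python rather than the actual Flair internals (the `EmbeddingStore` class and its methods are hypothetical, purely for illustration):

```python
class EmbeddingStore:
    """Hypothetical sketch of the embeddings_storage_mode behaviors."""

    def __init__(self, mode):
        assert mode in ("gpu", "cpu", "none")
        self.mode = mode
        self.cache = {}  # sentence -> {embedding_name: (device, vector)}

    def after_batch(self, sentence, dynamic, static):
        """Decide what survives the batch, given dicts of name -> vector."""
        if self.mode == "none":
            # nothing is kept; all embeddings are recomputed next epoch
            self.cache.pop(sentence, None)
        elif self.mode == "cpu":
            # everything is kept, but moved off the GPU
            merged = {**dynamic, **static}
            self.cache[sentence] = {n: ("cpu", v) for n, v in merged.items()}
        else:  # "gpu"
            # dynamic embeddings are dropped, static ones stay on the GPU
            self.cache[sentence] = {n: ("gpu", v) for n, v in static.items()}
```

In Flair itself this is controlled by the `embeddings_storage_mode` argument to `ModelTrainer.train`.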
-
`token.get_embedding()` does not necessarily return the embeddings in a deterministic order across different environments.
This is hugely problematic when trying to transfer a model to a different …
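One workaround, sketched here with a hypothetical helper rather than Flair's actual API, is to concatenate the per-embedding vectors in a fixed, name-sorted order instead of relying on insertion order:

```python
def get_embedding_sorted(named_vectors):
    """Concatenate per-embedding vectors in name-sorted order.

    `named_vectors` maps an embedding name to its vector (a list of
    floats). Sorting by name makes the concatenation order reproducible
    across environments, unlike whatever order the dict was filled in.
    """
    out = []
    for name in sorted(named_vectors):
        out.extend(named_vectors[name])
    return out
```

With a fixed order, a model trained in one environment sees features in the same positions when loaded in another.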
-
Hi Alan,
Feel free to deprioritize this, but inference is currently slow on CPUs. In a [separate ticket](https://github.com/zalandoresearch/flair/issues/7), you implemented batching to improve th…
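The core idea of that batching change, as a sketch (this chunking helper is my own, not Flair code), is to feed the model fixed-size groups of sentences instead of one sentence at a time:

```python
def batched(sentences, batch_size):
    """Yield successive fixed-size chunks of `sentences`, so prediction
    can run once per batch instead of once per sentence."""
    for i in range(0, len(sentences), batch_size):
        yield sentences[i:i + batch_size]
```

Each chunk would then go through a single forward pass, which amortizes per-call overhead on CPU as well as GPU.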
-
I am training an NER model on chemistry documents. The model does not learn any patterns.
I format my data in this way:
![image](https://user-images.githubusercontent.com/24786001/65394819-55f18000-d…
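For reference (I cannot see the screenshot here), Flair's `ColumnCorpus` expects one token per line with whitespace-separated columns and a blank line between sentences. A sketch of a parser for that layout, with made-up chemistry tags purely for illustration:

```python
def parse_column_data(text):
    """Parse two-column (token, tag) data with blank-line sentence breaks."""
    sentences, current = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            # blank line ends the current sentence
            if current:
                sentences.append(current)
                current = []
        else:
            token, tag = line.split()
            current.append((token, tag))
    if current:
        sentences.append(current)
    return sentences

# Hypothetical BIO-tagged sample in the expected layout
SAMPLE = """sodium B-CHEM
chloride I-CHEM
dissolves O

heat O
it O
"""
```

If the model learns nothing at all, it is worth checking that the file really parses into multiple sentences with the tags in the expected column, rather than one giant sentence or tags read as tokens.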
-
https://github.com/zalandoresearch/flair/blob/3c93339ff69b5d827822228f17ee075e880db195/flair/embeddings.py#L1532
I think the following is correct:
```python
self.__embedding_length = self…
-
**Describe the bug**
`embeddings_storage_mode` parameter has no effect on GPU memory usage or training speed during training of `SequenceTagger`.
**To Reproduce**
I train `SequenceTagger` using p…
-
I am trying to use Flair and `BertEmbeddings` on an offline machine. I have read up on this, downloaded the necessary resources, and copied them onto the machine.
When I try to use `BertEmbedding…
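For an offline setup, `BertEmbeddings` can be pointed at a local directory instead of a model name. A small sanity-check sketch (the helper is my own, and the exact file names below are an assumption based on what BERT checkpoints of that era typically shipped with; your library version may differ):

```python
import os

# Assumed file list for a local BERT checkpoint directory
REQUIRED = ("vocab.txt", "bert_config.json", "pytorch_model.bin")

def missing_bert_files(model_dir):
    """Return the required checkpoint files absent from model_dir."""
    return [f for f in REQUIRED
            if not os.path.isfile(os.path.join(model_dir, f))]
```

If the returned list is empty, constructing the embedding from that path (e.g. `BertEmbeddings('/path/to/dir')`) should not need network access.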
-
**Describe the bug**
The implementation of OneHotEmbeddings for fields other than text has two issues:
* the value used in the counter is the `Token` instance itself, not the value of the token
…
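A minimal illustration of the first issue (the `Token` class below is a stand-in, not Flair's): counting the token objects themselves gives every instance a count of 1, whereas counting their field values aggregates correctly.

```python
from collections import Counter

class Token:
    """Stand-in for a token carrying a field value (e.g. a POS tag)."""
    def __init__(self, value):
        self.value = value

tokens = [Token("NN"), Token("NN"), Token("VB")]

# Buggy: each Token instance is a distinct dict key, so every count is 1
# and the vocabulary fills up with unique objects.
by_instance = Counter(tokens)

# Fixed: count the field's value instead of the object.
by_value = Counter(t.value for t in tokens)
```

With the buggy version, any frequency threshold on the counter is meaningless, since no count can ever exceed 1.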
-
Hi!
Does Flair have, or has anyone heard of, embeddings based on syntactic dependencies? What I mean is that the embedding would look not at the previous word but at the head word.
Check two of th…
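As a sketch of the idea (not an existing Flair feature): given a dependency parse, the context fed to the embedding would be each token's head rather than its linear predecessor.

```python
def head_contexts(tokens, heads):
    """Pair each token with its syntactic head instead of the previous word.

    `heads[i]` is the index of token i's head, or -1 for the root.
    Returns (token, context) pairs, using "<ROOT>" as the root's context.
    """
    return [
        (tok, "<ROOT>" if h < 0 else tokens[h])
        for tok, h in zip(tokens, heads)
    ]
```

For "the dog barks" with "the" attached to "dog" and "dog" to "barks", the contexts become head words rather than neighbors.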