-
When I run scrip_eval.py with the fine-tuned model generated by script_training.sh, I get this error:
```
Traceback (most recent call last):
File "/Volumes/Files/Documents/PycharmProj…
```
-
With the decoupling of encoders and decoders, we have added a `Linear` encoder, which seems to just embed the inputs and pass them along. We should also add a `SelfAttention` encoder, which encodes th…
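A minimal sketch of what such a `SelfAttention` encoder might compute, in contrast to the pass-through `Linear` encoder. Everything here is an illustrative assumption (single head, NumPy, no masking or projection back to `d_model`), not the project's actual API:

```python
import numpy as np

def self_attention_encode(x, w_q, w_k, w_v):
    """Single-head self-attention over already-embedded inputs.

    x: (seq_len, d_model) embeddings (what a Linear encoder would
    simply pass along); output mixes information across positions.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # contextualized outputs

# Toy usage with random weights (hypothetical shapes).
rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(5, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention_encode(x, w_q, w_k, w_v)
```

Each output row is a weighted mixture of all value vectors, so, unlike the `Linear` encoder, every position's encoding depends on the whole sequence.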
-
Right now our only approach is a fairly simple neural net with word embeddings. Performance might improve if we apply a transformer-based architecture, contextualized embeddings, model fine-tunin…
-
Hello, thank you for your work and the code. I am trying to understand how the scFoundation embeddings were used within the GEARS framework. In the paper, you mention:
> (...) In our method, we obt…
-
Beautiful work! I hope to see similar work for other types of embeddings, like contextual word embeddings.
Will this work with fastText? If not, which files do I have to edit? Also, can you shed so…
-
## 0. Paper
paper: [arxiv](https://arxiv.org/abs/1906.02715)
## 1. What is it?
They analyze contextualized word representations from BERT.
## 2. What is amazing compared to previous works?…
-
### Project Name
LLM-based cyber security news summarizer and chatbot
### Description
This app is a cybersecurity-focused news summarizer and chatbot designed for Security Operations Centers (SOCs).…
-
Hi,
I finally managed to use `get_sequence_output` to get word embeddings, after dealing with nondeterministic embeddings caused by dropout, random seeds, etc.
However, `get_sequence_output()` doesn't seem to …
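The dropout/seed issue mentioned above can be illustrated with a toy NumPy embedding layer (not the BERT codebase itself): if dropout stays active at inference, repeated calls disagree unless the random seed is pinned, while disabling dropout makes the output deterministic. The `embed` function and its shapes are hypothetical:

```python
import numpy as np

def embed(x, w, dropout_p=0.0, rng=None):
    """Toy embedding layer: linear projection plus inverted dropout.

    dropout_p=0.0 models inference with dropout disabled; any
    dropout_p > 0 models a graph built in training mode.
    """
    h = x @ w
    if dropout_p > 0.0:
        mask = (rng.random(h.shape) >= dropout_p) / (1.0 - dropout_p)
        h = h * mask
    return h

x = np.ones((1, 4))
w = np.arange(16.0).reshape(4, 4)

# With dropout disabled, repeated calls agree exactly.
a = embed(x, w)
b = embed(x, w)

# With dropout still active, determinism requires pinning the seed.
c = embed(x, w, dropout_p=0.5, rng=np.random.default_rng(0))
d = embed(x, w, dropout_p=0.5, rng=np.random.default_rng(0))
```

In the BERT reference implementation the analogous switch is building the model in inference mode (so dropout probabilities are zero) rather than relying on seeding alone.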
-
Hello. Thank you for publishing your research. While reading your SeqMix paper, I had trouble understanding the method you used to build your {word, embedding} table (as discussed in Section 3.2 and A…
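Since the paper's exact construction is truncated here, the following is only one common way to build a {word, embedding} table from contextual vectors: average each token's contextual embeddings over all its occurrences. All names are hypothetical, and the toy encoder merely stands in for a real contextual model such as BERT:

```python
import numpy as np
from collections import defaultdict

def build_embedding_table(sentences, contextual_embed):
    """Average each token's contextual vectors across its occurrences.

    sentences: list of token lists.
    contextual_embed: fn(tokens) -> (len(tokens), d) array.
    Returns {token: mean embedding}. This is an assumed construction,
    not necessarily the one used in SeqMix.
    """
    sums, counts = {}, defaultdict(int)
    for tokens in sentences:
        vecs = contextual_embed(tokens)
        for tok, vec in zip(tokens, vecs):
            sums[tok] = sums.get(tok, 0.0) + vec
            counts[tok] += 1
    return {tok: sums[tok] / counts[tok] for tok in sums}

def toy_contextual_embed(tokens):
    # Hypothetical stand-in for a real encoder: the same token gets a
    # different vector depending on its position in the sentence.
    return np.array([[float(ord(t[0])), float(i)] for i, t in enumerate(tokens)])

# "a" and "b" each occur twice, at different positions; the table
# stores the mean of each token's two contextual vectors.
table = build_embedding_table([["a", "b"], ["b", "a"]], toy_contextual_embed)
```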
-
- [x] Hu, R., Li, S., & Liang, S. (2019). Diachronic Sense Modeling with Deep Contextualized Word Embeddings: An Ecological View, 3899–3908. https://doi.org/10.18653/v1/p19-1379
- **Barbara 10/11**
…