-
Hi,
I'm wondering how you computed the BLEU score reported in your paper. Did you take one generated distractor as the hypothesis and the three actual distractors as gold references?
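If it helps pin down the question: a one-hypothesis, multi-reference BLEU computation might look like the sketch below with NLTK. The example strings are made up, and the paper's actual evaluation script may of course differ.
```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Hypothetical strings, for illustration only.
generated = "the moon orbits the sun".split()
gold_distractors = [
    "the moon circles the sun".split(),
    "the earth orbits the moon".split(),
    "the sun rises in the west".split(),
]

# sentence_bleu accepts a list of references for a single hypothesis,
# so the hypothesis can be scored against all three gold distractors at once.
smooth = SmoothingFunction().method1
score = sentence_bleu(gold_distractors, generated, smoothing_function=smooth)
print(f"BLEU: {score:.4f}")
```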
-
Hi,
Thank you very much for your implementation. It's been very helpful to me.
Regarding your example script to generate questions and answers (Google Colab), I would like you to clarify some do…
-
Hi,
I trained a baseline CNN on the four provided Pathfinder datasets (32x32, 64x64, 128x128, 256x256). It achieved good results on 32x32, 64x64, and 256x256, but performed no better than random guessing on 128x…
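For reference, the baseline I mean is roughly the following shape (a sketch only; the layer sizes and head are my own assumptions, not necessarily the model trained here):
```python
import torch
import torch.nn as nn

# Minimal CNN baseline for binary path/no-path classification on
# grayscale Pathfinder images. Architecture details are assumptions.
class PathfinderCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # resolution-independent pooling
        )
        self.classifier = nn.Linear(128, 2)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = PathfinderCNN()
dummy = torch.randn(8, 1, 128, 128)  # batch of 128x128 grayscale images
print(model(dummy).shape)  # torch.Size([8, 2])
```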
-
Hi, I'm following your wonderful work on distractor generation. May I know how you preprocessed the RACE dataset for the BART model? In the instructions in the README file, you mentioned the `race_tra…
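In case it clarifies what I'm asking: one plausible way to flatten a RACE example into seq2seq pairs for BART-based distractor generation is sketched below. The field names and the `</s>` separator are my guesses, not necessarily the format this repo uses.
```python
# Hypothetical flattening of a RACE example into (source, target) pairs
# for a seq2seq distractor generator. Field names are assumptions.
def to_seq2seq_pairs(example):
    source = " </s> ".join([
        example["question"],
        example["answer_text"],
        example["article"],
    ])
    # One training pair per gold distractor.
    return [(source, d) for d in example["distractors"]]

sample = {
    "article": "The moon orbits the earth ...",
    "question": "What does the moon orbit?",
    "answer_text": "the earth",
    "distractors": ["the sun", "Mars", "itself"],
}
for src, tgt in to_seq2seq_pairs(sample):
    print(src[:60], "->", tgt)
```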
-
@voidful Hi there,
I've tried to generate multiple distractors using the pretrained models posted on Hugging Face, but I'm still only able to get a single distractor.
Here is my code:
```python
fr…
```
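One approach that might work, if the goal is several candidates from a single seq2seq checkpoint, is asking `generate()` for multiple beam hypotheses, roughly as below. The model name and the input format are assumptions on my part; the model card should give the exact format.
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed checkpoint name and input layout; adjust to the model card.
model_name = "voidful/bart-distractor-generation"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "The moon orbits the earth. </s> What does the moon orbit? </s> the earth"
inputs = tokenizer(text, return_tensors="pt")

outputs = model.generate(
    **inputs,
    num_beams=5,
    num_return_sequences=3,  # one returned sequence per desired distractor
    max_length=32,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```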
-
I'm trying to generate the BOP-like LineMOD dataset. Using the config_lm_upright.yaml configuration file, it seems that BlenderProc requires the tless dataset to be present:
```
FileNotFoundError: […
```
-
Hello,
I am opening an issue here since I cannot figure out why DOPE is not learning on my side. I'll provide here as much information as possible about my system and the steps I took to launch the …
-
- Gitea version (or commit ref): 1.14.x
- Git version: 2.31.1
- Operating system: FreeBSD 13
- Gitea built using ports collection (www/gitea)
- Gitea started by the startup script provided by www/…
-
Hi,
Not sure if this is a bug or perhaps a mistake in my implementation, but I'm comparing the results of using gpt2-large on the 'Write With Transformers' Text Generation example - https://transformer.huggi…
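For what it's worth, this is roughly how I'm generating locally. The sampling settings are my guesses at what the demo uses, and since sampling is stochastic the outputs will differ run to run unless the seed and settings match exactly.
```python
from transformers import pipeline, set_seed

# Fix the seed so repeated local runs are comparable with each other.
set_seed(42)
generator = pipeline("text-generation", model="gpt2-large")

prompt = "The quick brown fox"  # hypothetical prompt, for illustration
results = generator(
    prompt,
    max_new_tokens=40,
    do_sample=True,  # the demo appears to sample rather than greedy-decode
    top_k=50,
    top_p=0.95,
)
print(results[0]["generated_text"])
```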
-
Hello, I'm looking at projects to see what can be done on this topic. Do you have the trained distractor generation model available? It would be interesting for trying a demo, or a Colab notebook with…