-
This work is very impressive, but I have some questions about your code.
In lines 158-171 of src/dataset/dataset.py:
Can these lines of code be transformed into the formula below?
![image](https…
-
Hi,
Thank you for uploading your code and this awesome work to GitHub.
I have downloaded the ACE'05 dataset and would like to generate the code representation for it. Following your suggestions, I ran the…
-
Hey guys, great work! Thank you for publishing the paper. Very impressed with your results, especially for 250M and 780M models - they look super cool!
I've got several questions:
1. Am I right,…
-
Hi,
Congrats on the great work.
I have a question about the evaluation on the OOD benchmark. Are you including the "else" label in the evaluation?
I recently noticed that UniversalNER removes it, but…
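To make the question concrete, here is a minimal toy scorer of my own (not your evaluation code; the span format and the "else" label name are just assumptions) showing how much filtering "else" out of both gold and predictions can change micro F1:

```python
# Toy micro-F1 over (label, start, end) spans; NOT the repository's scorer.
def micro_f1(gold, pred):
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)  # exact-match true positives
    if tp == 0:
        return 0.0
    p, r = tp / len(pred), tp / len(gold)
    return 2 * p * r / (p + r)

gold = [("PER", 0, 1), ("else", 2, 3)]
pred = [("PER", 0, 1), ("else", 4, 5)]  # the "else" span disagrees

print(micro_f1(gold, pred))  # 0.5 when "else" counts

# UniversalNER-style: drop "else" from both sides before scoring
keep = lambda spans: [s for s in spans if s[0] != "else"]
print(micro_f1(keep(gold), keep(pred)))  # 1.0 once "else" is removed
```

So whether "else" is kept or dropped directly changes the reported numbers, which is why I'd like to know which protocol you followed.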
-
Hi,
Does the Span F1 score in your evaluation script take the span indices into account, similar to https://github.com/chakki-works/seqeval?
That is, 'spaceX' vs. ('spaceX', 0, 1).
If not, how should I compare the CoN…
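To illustrate the difference I mean, here is a small self-contained sketch (my own toy scorer, not the repository's; the example spans are made up) contrasting label-only matching with index-aware matching, as in seqeval's strict evaluation:

```python
# Toy span-level F1; an item counts as correct only on an exact match.
def span_f1(gold, pred):
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)
    if tp == 0:
        return 0.0
    p, r = tp / len(pred), tp / len(gold)
    return 2 * p * r / (p + r)

# Index-aware spans: (label, start_token, end_token)
gold = [("ORG", 0, 1), ("PER", 5, 6)]
pred = [("ORG", 2, 3), ("PER", 5, 6)]  # right ORG label, wrong offsets

print(span_f1(gold, pred))  # 0.5: offsets are part of the match
print(span_f1([g[0] for g in gold], [p[0] for p in pred]))  # 1.0: labels alone agree
```

Depending on which of the two behaviors your script implements, the same predictions can score very differently, so I'd like to match it before comparing numbers.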
-
It's mentioned in the documentation but not in the source code, if I've understood correctly.
-
If I want to change the base model to something else for fine-tuning, what should I be aware of and modify? I see the codebase has Flash Attention modeling code for Llama 2. I'm just curious if it wo…
-
The link to the ACE05 preprocessing code (Preprocessing script) seems to be broken. Could you provide it again, please?
-
**Describe the task**
1. Model: I was testing GoLLIE-7B with the `create custom task.ipynb` notebook
2. Task: create custom task
**Describe the bug**
I set `use_flash_attention=False` in
```Python
mo…
-
#1 Respect
#6 What's Going On
#12 Superstition
#26 A Case of You
#36 Seven Nation Army
#43 My Girl
#44 Billie Jean
#50 The Tracks of My Tears
#55 Like A Prayer
#78 Reach Out (I’ll Be There)
#84 Let's …