-
#### Issue description
Missing demo and industry papers for EMNLP 2022.
See https://preview.aclanthology.org/emnlp-22-ingestion/events/emnlp-2022/.
![image](https://user-images.githubusercont…
-
Hi! Thanks for your work. I have a small question.
For a multilingual language model, why is it guaranteed that words with the same meaning in different languages have similar representations in t…
-
# Creating an Evaluation Harness for code LLMs
We are working on an Evaluation Harness that covers a wide array of coding tasks and programming languages. We'd appreciate your input!
### Existing lis…
-
The video linked to this paper is not the correct one. https://aclanthology.org/2021.findings-emnlp.64.mp4 appears to actually be paper 2021.findings-emnlp.210.
-
We should add a very clear watermark on Anthology previews to prevent this kind of mistake.
@cdmalon this PR is still marked as a draft and, to quote @xinru1414:
> We are, in fact, still waiting on…
-
Thank you for publishing this tool. It has been really helpful to me!
I am also interested in the QA system mentioned in your paper, "Beyond NED: Fast and Effective Search Space Reduction for Compl…
-
Hello, I read your paper titled "Code Generation From Flowcharts with Texts: A Benchmark Dataset and An Approach".
Could you provide the implementation (e.g., PyTorch or TensorFlow) of your m…
-
Hi! The paper "Mask and Reason: Pre-Training Knowledge Graph Transformers for Complex Logical Queries", which proposes 'Kgtransformer', was published at KDD'22.
It seems that Triple2seq imp…
-
## Environment info
- `adapter-transformers` version: 3.1.0
- Platform: Linux-3.10.0-1062.12.1.el7.x86_64-x86_64-with-glibc2.27
- Python version: 3.9.12
- Huggingface_hub version: 0.7.0
- PyTo…