-
## When
- 11.15
## Who
- 권세중 (sejung.kwon@navercorp.com)
## What
- Title: AlphaTuning: Quantization-Aware Parameter-Efficient Adaptation of Large-Scale Pre-Trained Language Models
- Links: ht…
-
# URL
- https://arxiv.org/abs/2405.05904
# Affiliations
- Zorik Gekhman, N/A
- Gal Yona, N/A
- Roee Aharoni, N/A
- Matan Eyal, N/A
- Amir Feder, N/A
- Roi Reichart, N/A
- Jonathan Herzig…
-
When will the source code be released?
-
Any time estimate as to when you'll be able to release the code?
-
Hi,
I am facing the following error when trying to run the model. I am using the standard encodings.
I could not understand why the computation is failing for the encoding IOEBS. Is there a way t…
-
# URL
- https://arxiv.org/abs/2012.14913
# Affiliations
- Mor Geva, N/A
- Roei Schuster, N/A
- Jonathan Berant, N/A
- Omer Levy, N/A
# Abstract
- Feed-forward layers constitute two-thirds of…
-
[Paper link](http://www.aclweb.org/anthology/D15-1042)
-
# URL
- https://arxiv.org/abs/2310.15916
# Affiliations
- Roee Hendel, N/A
- Mor Geva, N/A
- Amir Globerson, N/A
# Abstract
- In-context learning (ICL) in Large Language Models (LLMs) has emer…
-
The Chinese Association for Artificial Intelligence (CAAI) has published a recommended-venue list. Would the author have the time, and the inclination, to put together a CAAI version as well? The CCF AI recommendation list feels heavily skewed toward CV (CVPR, ICCV, TPAMI, IJCV, TIP, TOG, and TVCG are all A-tier CV venues), which is quite unfriendly to many NLPers (ACL is the only A-tier conference, and there are no A-tier journals). It is great that CAAI promoted EMNLP and TASLP to A-tier.