-
Hi,
thanks for making the work available and for the explanations.
From the paper I understand that a training instance is a dialogue session, made up of several dialogue turns concatenated and…
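If that reading is right, a session-level instance is just the turns joined end to end with a separator token. A minimal sketch of that concatenation (the `<|endoftext|>` separator is an assumption based on GPT-2-style tokenizers, not something confirmed by the truncated post):

```python
# Hedged sketch: one training instance = one dialogue session,
# i.e. all turns of the session concatenated with a separator token.
EOS = "<|endoftext|>"  # assumed GPT-2-style end-of-turn separator

def session_to_instance(turns):
    """Join dialogue turns into a single flat training string."""
    return "".join(turn + EOS for turn in turns)

turns = ["Does money buy happiness?", "Depends how much money you spend on it."]
print(session_to_instance(turns))
# → Does money buy happiness?<|endoftext|>Depends how much money you spend on it.<|endoftext|>
```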
-
I have the following TorchServe handler and Dockerfile, but I’m getting a “prediction failed” error:
```python
from ts.torch_handler.base_handler import BaseHandler
from transformers import AutoModelWithLMHead, Auto…
```
-
- https://arxiv.org/abs/2104.08006
- 2021
Pre-training techniques are now widespread in the field of natural language processing.
ProphetNet is a pre-training-based natural language generation method that has shown strong performance on English text summarization and question generation tasks.
In this paper, ProphetNet is extended to other domains and languages, and the ProphetNet family of pre…
e4exp updated
3 years ago
-
I tried running the large model ([in a colab notebook](https://colab.research.google.com/drive/1Lw_kndsrorpSkD2hNy9mShyGYaXg3QX4?usp=sharing)) using the approach described in the [model card](https://…
-
### Feature request
Currently the output that you get from a pipeline seems to depend on the input type. While intuitively that makes sense for distinct primitive types, a difference also seems imple…
-
-
### System Info
```shell
Debian 11 on CPU, Python3.10
optimum : 1.13.1
onnx : 1.14.1
onnxruntime : 1.15.1
```
### Who can help?
@philschmid, @michaelbenayoun, @JingyaHuang, @echarlaix
### Info…
gidzr updated
10 months ago
-
# Model Overview
DialoGPT is a SOTA large-scale pretrained dialogue response generation model for multi-turn conversations. The [human evaluation results](https://github.com/dreasysnail/Dialogpt_dev#h…
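For reference, a minimal multi-turn generation sketch with `transformers` (model name and generation settings follow the usual DialoGPT model-card pattern; treat the specifics as a sketch, not the card's exact snippet):

```python
# Hedged sketch: multi-turn chat with DialoGPT via Hugging Face transformers.
def build_history(turns, eos_token="<|endoftext|>"):
    """Concatenate dialogue turns, each terminated by the EOS token,
    which is how DialoGPT expects multi-turn context."""
    return "".join(turn + eos_token for turn in turns)

def chat_once(user_turns, model_name="microsoft/DialoGPT-medium"):
    # Requires `transformers` and `torch`; downloads weights on first use.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    history_ids = tokenizer.encode(
        build_history(user_turns, tokenizer.eos_token), return_tensors="pt"
    )
    reply_ids = model.generate(
        history_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id
    )
    # Decode only the newly generated tokens, i.e. the model's reply.
    return tokenizer.decode(
        reply_ids[:, history_ids.shape[-1]:][0], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(chat_once(["Does money buy happiness?"]))
```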
t4r7k updated
6 months ago
-
```shell
psoni@blr2-lnxwk-071:~/DialoGPT$ sudo python3 interact.py --model_name_or_path ./models/medium --load_checkpoint ./models/medium/medium_ft.pkl --top_k 0
Found existing ./models folder, skip creating…
```
-
Looking for minimal-size, RLHF pre-trained models tagged 'trl' and 'text-generation-inference':
- https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- https://openlm.ai/chatbot-are…