ymcui / Chinese-LLaMA-Alpaca-2

Chinese LLaMA-2 & Alpaca-2 LLMs (phase-2 project) with 64K long-context models
Apache License 2.0

How can I output generation scores (logits)? #541

Closed Sishxo closed 7 months ago

Sishxo commented 8 months ago

Check before submitting issues

Type of Issue

Model inference

Base Model

Chinese-Alpaca-2 (7B/13B)

Operating System

Linux

Describe your issue in detail

This is my generation config:

generation_config = GenerationConfig(
    temperature=0.1,
    top_k=20,
    top_p=0.9,
    do_sample=True,
    num_beams=1,
    repetition_penalty=1.0,
    max_new_tokens=400,
    output_scores=True,
)

However, it did not output the logits as expected.
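
For reference, a minimal sketch of how per-step scores are typically retrieved with transformers' generate() (this is not the repo's inference script): in addition to output_scores=True, return_dict_in_generate=True generally has to be set as well, otherwise generate() returns only the token ids and the scores are discarded. The model path, prompt, and variable names below are placeholders.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

model_path = "path/to/chinese-alpaca-2-7b"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float16, device_map="auto"
)

generation_config = GenerationConfig(
    temperature=0.1,
    top_k=20,
    top_p=0.9,
    do_sample=True,
    num_beams=1,
    repetition_penalty=1.0,
    max_new_tokens=400,
    output_scores=True,
    return_dict_in_generate=True,  # without this, generate() returns only token ids
)

inputs = tokenizer("你好，请介绍一下你自己。", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, generation_config=generation_config)

# outputs.sequences: generated token ids
# outputs.scores: tuple with one entry per generated token,
# each of shape (batch_size, vocab_size)
print(len(outputs.scores), outputs.scores[0].shape)

Note that outputs.scores holds the logits after the sampling processors (temperature, top-k, top-p) have been applied; recent transformers versions also accept output_logits=True in the generation config if the unprocessed logits are needed.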

Dependencies (must be provided for code-related issues)

None

Execution logs or screenshots

None

github-actions[bot] commented 8 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

github-actions[bot] commented 7 months ago

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.