hpcaitech / EnergonAI

Large-scale model inference.
Apache License 2.0

Not compatible with the latest version of transformers? (4.26.1) #192

Open skiingpacman opened 1 year ago

skiingpacman commented 1 year ago

When using transformers version 4.26.1, this import breaks:

```python
from transformers.generation_logits_process import TopKLogitsWarper, TopPLogitsWarper, TemperatureLogitsWarper, LogitsProcessorList
```

I think it needs to be changed to:

```python
from transformers.generation.logits_process import TopKLogitsWarper, TopPLogitsWarper, TemperatureLogitsWarper, LogitsProcessorList
```

(a dot instead of an underscore after `generation`).

Either editing the import as above or rolling back to transformers 4.24.0 resolves the error.
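A version-tolerant shim is another option, so the same code works before and after the module was moved. This is a sketch based on the two paths reported above; it assumes only the module path changed between releases and that transformers is installed:

```python
# Import the generation logits processors across transformers versions.
# transformers >= 4.26 moved them to transformers.generation.logits_process;
# older releases (e.g. 4.24) expose the flat transformers.generation_logits_process.
try:
    from transformers.generation.logits_process import (  # transformers >= 4.26
        TopKLogitsWarper,
        TopPLogitsWarper,
        TemperatureLogitsWarper,
        LogitsProcessorList,
    )
except ImportError:  # fall back to the pre-4.26 module path
    from transformers.generation_logits_process import (
        TopKLogitsWarper,
        TopPLogitsWarper,
        TemperatureLogitsWarper,
        LogitsProcessorList,
    )
```

With the shim in place, the warpers can be assembled as usual, e.g. `LogitsProcessorList([TemperatureLogitsWarper(0.7), TopKLogitsWarper(50), TopPLogitsWarper(0.9)])`.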

(I also hit other errors that prevent the OPT inference example from running, but those are likely unrelated.)

Yiran-Zhu commented 1 year ago

It could be a version compatibility issue. The problem was solved when I replaced the import in the source file with:

```python
from transformers import TopKLogitsWarper, TopPLogitsWarper, TemperatureLogitsWarper, LogitsProcessorList
```

binmakeswell commented 1 year ago

> It could be a version compatibility issue. The problem was solved when I replaced the import in the source file with `from transformers import TopKLogitsWarper, TopPLogitsWarper, TemperatureLogitsWarper, LogitsProcessorList`

Hi @Yiran-Zhu Thank you very much for your contribution!