LowinLi / transformers-stream-generator
This is a text generation method that returns a generator, streaming out each token in real time during inference, based on HuggingFace Transformers.
MIT License, 96 stars, 14 forks
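The streaming behavior described above can be sketched in plain Python: an autoregressive loop that yields one token id at a time as it is decoded, instead of returning the finished sequence. This is a conceptual sketch only; the names here (`stream_generate`, the `fake_next_token` stub) are illustrative and are not the library's actual API.

```python
from typing import Callable, Iterator, List


def stream_generate(
    prompt_ids: List[int],
    next_token: Callable[[List[int]], int],
    eos_token_id: int,
    max_new_tokens: int = 16,
) -> Iterator[int]:
    """Yield one token id at a time, stopping at EOS or the length cap.

    This mirrors the streaming idea: the caller consumes tokens as they
    are produced, rather than waiting for generation to finish.
    """
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        tok = next_token(ids)
        ids.append(tok)
        if tok == eos_token_id:
            return
        yield tok  # emitted immediately, before generation finishes


# Stub "model" for demonstration: counts up from the last token,
# emitting EOS (id 99) once the next token would exceed 4.
def fake_next_token(ids: List[int]) -> int:
    nxt = ids[-1] + 1
    return 99 if nxt > 4 else nxt


if __name__ == "__main__":
    for tok in stream_generate([0], fake_next_token, eos_token_id=99):
        print(tok)  # prints 1, 2, 3, 4, one per step
```

A consumer can print or forward each token as it arrives, which is what makes token-by-token display in chat UIs possible.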
Issues (newest first)
#13 cannot import name 'SampleOutput' from 'transformers.generation.utils' (dijkstra-mose, closed 6 months ago, 1 comment)
#12 dict (ziyue246, opened 8 months ago, 0 comments)
#11 ERROR: Could not build wheels for transformers-stream-generator, which is required to install pyproject.toml-based projects (Lizhecheng02, opened 1 year ago, 0 comments)
#10 sample_stream has errors when eos_token_id is a list with more than one element (llltttppp, opened 1 year ago, 0 comments)
#9 Repair is_xxx_mode with do_stream in NewGenerationMixin.generate (AIxyz, closed 6 months ago, 0 comments)
#8 In the generate function, is_greedy_gen_mode and is_sample_gen_stream_mode are both True (AIxyz, opened 1 year ago, 1 comment)
#7 Token yielding problem (xdevfaheem, opened 1 year ago, 1 comment)
#6 Garbled Chinese output (lucasjinreal, opened 1 year ago, 5 comments)
#5 transformers_stream_generator/main.py:139: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation) (lucasjinreal, opened 1 year ago, 1 comment)
#4 Does it support BLOOM? (kevinuserdd, opened 1 year ago, 1 comment)
#3 The generator doesn't work if num_beams is not equal to 1 (zoubaihan, opened 1 year ago, 3 comments)
#2 Spacing problem (circuluspibo, opened 1 year ago, 4 comments)
#1 do_sample=False (vicwer, opened 1 year ago, 2 comments)