jaketae / storyteller

Multimodal AI Story Teller, built with Stable Diffusion, GPT, and neural text-to-speech
MIT License

storyteller failed, ? #17

Closed bk111 closed 1 year ago

bk111 commented 1 year ago

C:\Users\dengz\Downloads\story>storyteller
[nltk_data] Downloading package punkt to
[nltk_data]     C:\Users\dengz\AppData\Roaming\nltk_data...
[nltk_data]   Unzipping tokenizers\punkt.zip.
Downloading: 100%|█████████████████████████████████████████████████████████████████████| 665/665 [00:00<00:00, 168kB/s]
Downloading:  72%|████████████████████████████████████████████████▎                    | 395M/548M [12:20<04:47, 533kB/s]
Traceback (most recent call last):
  File "C:\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Python310\Scripts\storyteller.exe\__main__.py", line 7, in <module>
  File "C:\Python310\lib\site-packages\storyteller\cli.py", line 39, in main
    story_teller = StoryTeller(config)
  File "C:\Python310\lib\site-packages\storyteller\utils.py", line 22, in wrapper_func
    func(*args, **kwargs)
  File "C:\Python310\lib\site-packages\storyteller\utils.py", line 36, in wrapper_func
    func(*args, **kwargs)
  File "C:\Python310\lib\site-packages\storyteller\model.py", line 28, in __init__
    self.writer = pipeline(
  File "C:\Python310\lib\site-packages\transformers\pipelines\__init__.py", line 724, in pipeline
    framework, model = infer_framework_load_model(
  File "C:\Python310\lib\site-packages\transformers\pipelines\base.py", line 266, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model gpt2 with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt2.modeling_gpt2.GPT2LMHeadModel'>).
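The download bar above stops at 72% (395M/548M), which suggests the gpt2 weight file in the Hugging Face cache is truncated, so `from_pretrained` later fails to load it. A minimal sketch for clearing a stale cache so the next run re-downloads from scratch (assumes the default cache location and no `HF_HOME`/`TRANSFORMERS_CACHE` override):

```python
import shutil
from pathlib import Path

# Default Hugging Face cache root; adjust if HF_HOME or
# TRANSFORMERS_CACHE is set in the environment.
cache_root = Path.home() / ".cache" / "huggingface"

if cache_root.exists():
    # Remove the whole cache, including any partial/corrupt downloads.
    shutil.rmtree(cache_root)
    print(f"Removed {cache_root}")
else:
    print("No Hugging Face cache found; nothing to remove.")
```

Deleting the cache is safe in the sense that every model will simply be fetched again on next use.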

bk111 commented 1 year ago

Rebooted the PC, then ran it again:

C:\Users\dengz\Downloads\story>storyteller
Traceback (most recent call last):
  File "C:\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Python310\Scripts\storyteller.exe\__main__.py", line 7, in <module>
  File "C:\Python310\lib\site-packages\storyteller\cli.py", line 39, in main
    story_teller = StoryTeller(config)
  File "C:\Python310\lib\site-packages\storyteller\utils.py", line 22, in wrapper_func
    func(*args, **kwargs)
  File "C:\Python310\lib\site-packages\storyteller\utils.py", line 36, in wrapper_func
    func(*args, **kwargs)
  File "C:\Python310\lib\site-packages\storyteller\model.py", line 28, in __init__
    self.writer = pipeline(
  File "C:\Python310\lib\site-packages\transformers\pipelines\__init__.py", line 724, in pipeline
    framework, model = infer_framework_load_model(
  File "C:\Python310\lib\site-packages\transformers\pipelines\base.py", line 266, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model gpt2 with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt2.modeling_gpt2.GPT2LMHeadModel'>).

jaketae commented 1 year ago

Hello @bk111, thanks for opening this issue, and apologies for the late reply.

Based on the logs, the error is coming from Hugging Face transformers, not StoryTeller. Can you make sure that there is enough space on your disk, and that the GPT-2 model has been fully downloaded?
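One way to act on that advice: check free space on the drive that holds the cache before retrying (a sketch, assuming the cache lives under the user's home directory, which is the default; gpt2 needs roughly 550 MB):

```python
import os
import shutil

# Free space on the drive holding the home directory, where the
# Hugging Face cache (~/.cache/huggingface) lives by default.
home = os.path.expanduser("~")
total, used, free = shutil.disk_usage(home)
print(f"Free space: {free / 1e9:.1f} GB of {total / 1e9:.1f} GB")
```

If there is plenty of space, a clean re-download can be forced with `AutoModelForCausalLM.from_pretrained("gpt2", force_download=True)`, which ignores any partially downloaded file in the cache.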

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.