Closed bluenevus closed 3 months ago
Same here!
Same here, both from Docker and when built from source.
In app.py, try updating this:

```python
def build_models(model_type, config, enable_optimization=False):
    # ...
    # Drop the unsupported 'force_huggingface' key before calling from_pretrained
    model_kwargs = {
        k: v
        for k, v in config.model.items()
        if k not in ("type", "from_pretrained", "force_huggingface")
    }
    stdit = STDiT3.from_pretrained(HF_STDIT_MAP[model_type], **model_kwargs)
    # ...
```
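More generally, the pattern above (filtering a config dict down to the keys a constructor actually accepts) can be sketched like this. This is an illustrative standalone example, not Open-Sora code; `filter_kwargs` and `DemoModel` are hypothetical names:

```python
import inspect

def filter_kwargs(cls, kwargs, extra_excluded=()):
    """Keep only the keyword arguments that cls.__init__ declares."""
    accepted = set(inspect.signature(cls.__init__).parameters) - {"self"}
    return {
        k: v
        for k, v in kwargs.items()
        if k in accepted and k not in extra_excluded
    }

class DemoModel:
    # Stand-in for a model class that rejects unknown keywords
    def __init__(self, hidden_size, depth=4):
        self.hidden_size = hidden_size
        self.depth = depth

config = {"type": "demo", "hidden_size": 1152, "depth": 28, "force_huggingface": True}
# 'type' and 'force_huggingface' are silently dropped; the rest pass through
model = DemoModel(**filter_kwargs(DemoModel, config))
print(model.hidden_size, model.depth)  # -> 1152 28
```

The hard-coded exclusion tuple in the app.py fix does the same job without introspection, which is fine when you know exactly which stray keys the checkpoint config carries.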
I'm running on a Docker instance and getting this error. I took these steps to install:

```shell
git clone https://github.com/hpcaitech/Open-Sora.git
cd Open-Sora
docker build -t opensora .
docker run -ti --gpus '"device=2,3"' -v .:/workspace/Open-Sora opensora
export OPENAI_API_KEY=myopenapi key
```
I tried this first:

```shell
python gradio/app.py
```

Then I rebuilt the container and tried this:

```shell
python gradio/app.py --enable-optimization
```

I'm getting these errors:
```
Traceback (most recent call last):
  File "/workspace/Open-Sora/gradio/app.py", line 195, in <module>
    vae, text_encoder, stdit, scheduler = build_models(
  File "/workspace/Open-Sora/gradio/app.py", line 104, in build_models
    stdit = STDiT3.from_pretrained(HF_STDIT_MAP[model_type], **model_kwargs)
  File "/opt/conda/envs/pytorch/lib/python3.9/site-packages/transformers/modeling_utils.py", line 3462, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: __init__() got an unexpected keyword argument 'force_huggingface'
```
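For context, this is the standard Python failure mode when a keyword reaches a constructor that doesn't declare it: `from_pretrained` forwards the checkpoint config's extra keys into `cls(...)`, and `__init__` rejects the one it doesn't know. A minimal reproduction with a hypothetical stand-in class (exact wording of the message varies slightly across Python versions):

```python
class Model:
    # Stand-in: like STDiT3's __init__, this does not accept 'force_huggingface'
    def __init__(self, hidden_size):
        self.hidden_size = hidden_size

try:
    # Forwarding an unknown keyword, as from_pretrained does with stray config keys
    Model(hidden_size=1152, force_huggingface=True)
except TypeError as exc:
    message = str(exc)
    print(message)
```

Filtering the keyword out before the call, as in the fix above, is what makes the error go away.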