Alpha-VLLM / Lumina-T2X

Lumina-T2X is a unified framework for Text to Any Modality Generation

\Lumina-Next-T2I\config.json' is not a valid JSON file. #74

Open danieldietzel opened 3 months ago

danieldietzel commented 3 months ago

I tried following the instructions here:

https://huggingface.co/Alpha-VLLM/Lumina-Next-T2I

I installed the repo via the Hugging Face CLI and via GitHub; in both cases I get this error:

[rank0]: OSError: It looks like the config file at '........\Lumina-T2X\Lumina-Next-T2I\config.json' is not a valid JSON file.

Using this command:

lumina_next infer -c "lumina_next_t2i/configs/infer/settings.yaml" "a snowman" "./outputs"

In my settings.yaml I've tried the local path, the Hugging Face path, and the repo download path.
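One way to narrow this down is to load the file with Python's json module directly; json.load reports the exact line and column where parsing fails. A minimal check (the path below is just a placeholder for wherever your config.json actually lives):

```python
import json

# Placeholder path -- point this at the config.json named in the error message.
cfg_path = r"Lumina-T2X\Lumina-Next-T2I\config.json"

with open(cfg_path, "r", encoding="utf-8") as f:
    cfg = json.load(f)  # raises json.JSONDecodeError with line/column info if the file is malformed

print(list(cfg.keys()))
```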

PommesPeter commented 3 months ago

Hi @danieldietzel, could you provide your YAML config? We will help you check its correctness.

danieldietzel commented 3 months ago

> Hi @danieldietzel, could you provide your YAML config? We will help you check its correctness.

Hi Pommes,

I ran through the steps again and realized the second item in this config is supposed to be the LLM checkpoint, not the Lumina model.

https://github.com/Alpha-VLLM/Lumina-T2X/blob/main/lumina_next_t2i/configs/infer/settings.yaml
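(For anyone hitting the same thing: the linked settings.yaml pairs two checkpoints, the Lumina/NextDiT model and a separate LLM text encoder. The sketch below is only illustrative; the key names are assumptions, so check the linked file for the real layout.)

```yaml
# Illustrative only -- key names here are guesses, not copied from the repo's settings.yaml.
model:
  ckpt: /path/to/Lumina-Next-T2I     # first item: the Lumina (NextDiT) checkpoint directory
  ckpt_lm: /path/to/llm-checkpoint   # second item: the LLM text-encoder checkpoint, not the Lumina model
```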

Now my YAML is:

But I get this:

TypeError: NextDiT.forward_with_cfg() got an unexpected keyword argument 'ntk_factor'

[rank0]: AttributeError: 'NoneType' object has no attribute 'float'

I am on Windows, by the way, if that helps. I had to change every dist.init_process_group("nccl") call to dist.init_process_group("gloo") to get this far; not sure if that breaks things.
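For what it's worth, a portable way to avoid hand-editing every call is to choose the backend per platform. This is a minimal sketch using the standard torch.distributed API, not code from the Lumina repo:

```python
import platform

import torch
import torch.distributed as dist

# NCCL only works on Linux with CUDA; fall back to Gloo on Windows or CPU-only machines.
backend = "nccl" if torch.cuda.is_available() and platform.system() != "Windows" else "gloo"

# Assumes the usual env vars (MASTER_ADDR, MASTER_PORT, RANK, WORLD_SIZE) are set by the launcher.
dist.init_process_group(backend=backend)
```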

PommesPeter commented 3 months ago

This may not impact performance, but we have not tested whether it runs correctly on the Gloo backend. You could try running the mini version of Lumina-Next-T2I at https://github.com/Alpha-VLLM/Lumina-T2X/tree/main/lumina_next_t2i_mini