Open danieldietzel opened 3 months ago
Hi @danieldietzel , could you provide your yaml config? We will help you check its correctness.
Hi Pommes,
I ran through the steps again and realized the second item in this config is supposed to be the LLM checkpoint, not the Lumina model:
https://github.com/Alpha-VLLM/Lumina-T2X/blob/main/lumina_next_t2i/configs/infer/settings.yaml
Now my YAML is:
settings:
  model:
    ckpt: 'C:\models\lumina'
    ckpt_lm: 'C:\models\gemma'
    token: ""

  transport:
    path_type: "Linear"      # option: ["Linear", "GVP", "VP"]
    prediction: "velocity"   # option: ["velocity", "score", "noise"]
    loss_weight: "velocity"  # option: [None, "velocity", "likelihood"]
    sample_eps: 0.1
    train_eps: 0.2

  ode:
    atol: 1e-6         # Absolute tolerance
    rtol: 1e-3         # Relative tolerance
    reverse: false     # option: true or false
    likelihood: false  # option: true or false

  infer:
    resolution: "1024x1024"  # option: ["1024x1024", "512x2048", "2048x512", "(Extrapolation) 1664x1664", "(Extrapolation) 1024x2048", "(Extrapolation) 2048x1024"]
    num_sampling_steps: 60   # range: 1-1000
    cfg_scale: 4.            # range: 1-20
    solver: "euler"          # option: ["euler", "dopri5", "dopri8"]
    t_shift: 4               # range: 1-20 (int only)
    ntk_scaling: true        # option: true or false
    proportional_attn: true  # option: true or false
    seed: 0                  # range: any number
But I get this:
TypeError: NextDiT.forward_with_cfg() got an unexpected keyword argument 'ntk_factor'
[rank0]: AttributeError: 'NoneType' object has no attribute 'float'
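For what it's worth, the TypeError looks like a version mismatch: the sampling code passes an `ntk_factor` keyword that the checked-out `NextDiT.forward_with_cfg` does not accept. A generic, hedged workaround sketch (`call_with_supported_kwargs` is my own hypothetical helper, not part of Lumina) that drops keywords a function's signature does not support:

```python
import inspect

def call_with_supported_kwargs(fn, *args, **kwargs):
    """Call fn, silently dropping kwargs its signature does not accept.

    If fn takes **kwargs itself, pass everything through unchanged;
    otherwise keep only the keyword arguments it explicitly declares
    (e.g. drop 'ntk_factor' on an older NextDiT checkout).
    """
    params = inspect.signature(fn).parameters
    accepts_var_kw = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )
    if not accepts_var_kw:
        kwargs = {k: v for k, v in kwargs.items() if k in params}
    return fn(*args, **kwargs)
```

Updating both the repo and the checkpoint to matching versions is the cleaner fix; this shim only papers over the symptom.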
I am on Windows, by the way, if that helps. I had to change every dist.init_process_group("nccl") call to dist.init_process_group("gloo") to get this far; not sure if that breaks things.
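If it helps, the swap above can be made automatic instead of hand-edited. A minimal sketch, assuming NCCL is only available on Linux with NVIDIA GPUs while Gloo works everywhere (`pick_backend` is a hypothetical helper, not part of Lumina):

```python
import platform

def pick_backend() -> str:
    """Return the torch.distributed backend name for this platform.

    NCCL only ships for Linux, so fall back to Gloo elsewhere
    (including Windows).
    """
    return "nccl" if platform.system() == "Linux" else "gloo"

# The result would then replace the hard-coded string, e.g.:
# dist.init_process_group(pick_backend())
```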
This may not impact performance, but we have not tested whether it runs correctly on the Gloo backend. You could try running the mini version of Lumina-Next-T2I at https://github.com/Alpha-VLLM/Lumina-T2X/tree/main/lumina_next_t2i_mini
I tried following the instructions here:
https://huggingface.co/Alpha-VLLM/Lumina-Next-T2I
And installed the repo via the Hugging Face CLI and via GitHub; in both cases I get this error:
[rank0]: OSError: It looks like the config file at '........\Lumina-T2X\Lumina-Next-T2I\config.json' is not a valid JSON file.
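That OSError is often a symptom of the download rather than the config format itself: a "config.json" that fails to parse is frequently an HTML error page or a Git LFS pointer stub in place of the real file. A quick, hedged sketch to see what is actually in the file (`check_config` is my own helper; point it at whatever path your error message shows):

```python
import json
from pathlib import Path

def check_config(path: str) -> bool:
    """Return True if `path` holds valid JSON; print a diagnostic if not.

    The first few bytes of an invalid file usually give the problem
    away (e.g. '<html' or 'version https://git-lfs...').
    """
    text = Path(path).read_text(encoding="utf-8", errors="replace")
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError as err:
        print(f"invalid JSON at line {err.lineno}, column {err.colno}")
        print("file starts with:", repr(text[:60]))
        return False
```

If the file turns out to be an LFS pointer, re-downloading with `git lfs pull` (or the Hugging Face CLI with LFS support) should fetch the real config.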
Using this command:
lumina_next infer -c "lumina_next_t2i/configs/infer/settings.yaml" "a snowman" "./outputs"
In my settings.yaml I've tried the local path, the Hugging Face path, and the repo download path.