tyshiwo1 / DiM-DiffusionMamba

The official implementation of DiM: Diffusion Mamba for Efficient High-Resolution Image Synthesis

FATAL Flags parsing error: flag --config=None: Flag --config must have a value other than None. #7

Open Enternal-w opened 3 months ago

Enternal-w commented 3 months ago

Dear author, I have a problem that I can't solve:

FATAL Flags parsing error: flag --config=None: Flag --config must have a value other than None. Pass --helpshort or --helpfull to see help on flags.

tyshiwo1 commented 3 months ago

Which commands did you use, training or evaluation? Please be as detailed as possible.

Enternal-w commented 3 months ago

training

tyshiwo1 commented 3 months ago

You executed accelerate launch --multi_gpu --num_processes 8 --mixed_precision fp16 ./train.py --config=configs/cifar10_S_DiM.py and then get this error?

tyshiwo1 commented 3 months ago

Do you have any other error messages surrounding this FATAL Flags parsing error: flag --config=None: Flag --config must have a value other than None. Pass --helpshort or --helpfull to see help on flags.?
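For context, this kind of error usually means the script was launched without the required `--config` flag (for example, by running `python train.py` with no arguments instead of the full `accelerate launch ...` command). A minimal stdlib sketch of the same failure mode, using `argparse` as an analogue (the repo itself appears to use absl-style flags, so the exact message differs):

```python
# Sketch of a required --config flag, analogous to the one train.py declares.
# argparse is used here as a stand-in; names and paths come from the thread.
import argparse

def parse(argv):
    parser = argparse.ArgumentParser(prog="train.py")
    parser.add_argument("--config", required=True,
                        help="path to a config file, e.g. configs/cifar10_S_DiM.py")
    return parser.parse_args(argv)

# Passing the flag succeeds:
args = parse(["--config=configs/cifar10_S_DiM.py"])
print(args.config)  # configs/cifar10_S_DiM.py

# Launching without it fails, which is the analogue of the
# "Flag --config must have a value other than None" error:
try:
    parse([])
except SystemExit:
    print("parsing failed: --config was not provided")
```

So the first thing to check is that the full command line, including `--config=configs/cifar10_S_DiM.py`, actually reaches the script.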

Enternal-w commented 3 months ago

Thank you very much for your reply. As a newcomer, I don't quite understand the `accelerate launch ...` part. Where should this command be entered? Is your code running on Linux? I use PyCharm on Windows.

tyshiwo1 commented 3 months ago

OK, I see.

`accelerate` is a library developed by Hugging Face. You just need to install it with `pip install accelerate` (I have included it in `environment.yaml`).

Then, in a bash shell on your Linux machine, activate the installed conda environment with `conda activate mamba-attn`, and then directly type in the `accelerate launch ...` command.
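Concretely, the terminal session would look something like this (environment name and config path taken from earlier in this thread):

```shell
# Activate the conda environment shipped with the repo
conda activate mamba-attn

# Launch training through accelerate; the flags before ./train.py configure
# accelerate itself, and --config is forwarded to the training script
accelerate launch --multi_gpu --num_processes 8 --mixed_precision fp16 \
    ./train.py --config=configs/cifar10_S_DiM.py
```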

tyshiwo1 commented 3 months ago

Here are all the steps you need to follow; I have also included them in my README:

```shell
# Create the env:
conda env create -f environment.yaml

# If you want to update the env `mamba` with the contents of `~/mamba_attn/environment.yaml`:
conda env update --name mamba --file ~/mamba_attn/environment.yaml --prune

# Switch to the correct environment
conda activate mamba-attn
conda install chardet

# Compile Mamba. This step may take a lot of time; please be patient.
# You need to successfully install causal-conv1d first.
CAUSAL_CONV1D_FORCE_BUILD=TRUE pip install --user -e .
# If compilation fails, you can copy the files in './build/' from another
# server that has compiled successfully; --user may be necessary.

# Optional: if you only have 8 A100s to train the Huge model with a batch size
# of 768, I recommend installing deepspeed to reduce the required GPU memory:
pip install deepspeed
```

Enternal-w commented 3 months ago

Ok, thank you very much

tyshiwo1 commented 3 months ago

However, I haven't tested it on Windows. You can post your issues here, and I will try my best to fix them.