LambdaLabsML / examples

Deep Learning Examples
MIT License

ValueError: Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers. SD fine tuning #57

Open pimentoliver opened 1 year ago

pimentoliver commented 1 year ago

Hello, I'm following the SD fine tuning tutorial. I ran with the Pokemon dataset and all was well, so I formatted my own dataset, edited the .yaml, forked the repo and am having this issue with my code when starting the first training epoch:

'ValueError: Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers.'

Full traceback:

```
Traceback (most recent call last):
  File "/content/stable-diffusion/main.py", line 905, in <module>
    trainer.fit(model, data)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/trainer/trainer.py", line 553, in fit
    self._run(model)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/trainer/trainer.py", line 918, in _run
    self._dispatch()
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/trainer/trainer.py", line 986, in _dispatch
    self.accelerator.start_training(self)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/accelerators/accelerator.py", line 92, in start_training
    self.training_type_plugin.start_training(trainer)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/plugins/training_type/training_type_plugin.py", line 161, in start_training
    self._results = trainer.run_stage()
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/trainer/trainer.py", line 996, in run_stage
    return self._run_train()
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/trainer/trainer.py", line 1045, in _run_train
    self.fit_loop.run()
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/base.py", line 111, in run
    self.advance(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/fit_loop.py", line 200, in advance
    epoch_output = self.epoch_loop.run(train_dataloader)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/base.py", line 111, in run
    self.advance(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/epoch/training_epoch_loop.py", line 130, in advance
    batch_output = self.batch_loop.run(batch, self.iteration_count, self._dataloader_idx)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/batch/training_batch_loop.py", line 101, in run
    super().run(batch, batch_idx, dataloader_idx)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/base.py", line 111, in run
    self.advance(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/batch/training_batch_loop.py", line 148, in advance
    result = self._run_optimization(batch_idx, split_batch, opt_idx, optimizer)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/batch/training_batch_loop.py", line 202, in _run_optimization
    self._optimizer_step(optimizer, opt_idx, batch_idx, closure)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/batch/training_batch_loop.py", line 396, in _optimizer_step
    model_ref.optimizer_step(
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/core/lightning.py", line 1618, in optimizer_step
    optimizer.step(closure=optimizer_closure)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/core/optimizer.py", line 209, in step
    self.__optimizer_step(*args, closure=closure, profiler_name=profiler_name, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/core/optimizer.py", line 129, in __optimizer_step
    trainer.accelerator.optimizer_step(optimizer, self._optimizer_idx, lambda_closure=closure, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/accelerators/accelerator.py", line 296, in optimizer_step
    self.run_optimizer_step(optimizer, opt_idx, lambda_closure, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/accelerators/accelerator.py", line 303, in run_optimizer_step
    self.training_type_plugin.optimizer_step(optimizer, lambda_closure=lambda_closure, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/plugins/training_type/training_type_plugin.py", line 226, in optimizer_step
    optimizer.step(closure=lambda_closure, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/torch/optim/lr_scheduler.py", line 65, in wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/torch/optim/optimizer.py", line 113, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/torch/optim/adamw.py", line 119, in step
    loss = closure()
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/batch/training_batch_loop.py", line 236, in _training_step_and_backward_closure
    result = self.training_step_and_backward(split_batch, batch_idx, opt_idx, optimizer, hiddens)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/batch/training_batch_loop.py", line 537, in training_step_and_backward
    result = self._training_step(split_batch, batch_idx, opt_idx, hiddens)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/loops/batch/training_batch_loop.py", line 307, in _training_step
    training_step_output = self.trainer.accelerator.training_step(step_kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/accelerators/accelerator.py", line 193, in training_step
    return self.training_type_plugin.training_step(*step_kwargs.values())
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/plugins/training_type/ddp.py", line 383, in training_step
    return self.model(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/torch/nn/parallel/distributed.py", line 1008, in forward
    output = self._run_ddp_forward(*inputs, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/torch/nn/parallel/distributed.py", line 969, in _run_ddp_forward
    return module_to_run(*inputs[0], **kwargs[0])
  File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/pytorch_lightning/overrides/base.py", line 82, in forward
    output = self.module.training_step(*inputs, **kwargs)
  File "/content/stable-diffusion/ldm/models/diffusion/ddpm.py", line 406, in training_step
    loss, loss_dict = self.shared_step(batch)
  File "/content/stable-diffusion/ldm/models/diffusion/ddpm.py", line 872, in shared_step
    x, c = self.get_input(batch, self.first_stage_key)
  File "/usr/local/lib/python3.9/dist-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/content/stable-diffusion/ldm/models/diffusion/ddpm.py", line 742, in get_input
    c = self.get_learned_conditioning(xc)
  File "/content/stable-diffusion/ldm/models/diffusion/ddpm.py", line 619, in get_learned_conditioning
    c = self.cond_stage_model.encode(c)
  File "/content/stable-diffusion/ldm/modules/encoders/modules.py", line 280, in encode
    return self(text)
  File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/content/stable-diffusion/ldm/modules/encoders/modules.py", line 271, in forward
    batch_encoding = self.tokenizer(text, truncation=True, max_length=self.max_length, return_length=True,
  File "/usr/local/lib/python3.9/dist-packages/transformers/tokenization_utils_base.py", line 2484, in __call__
    encodings = self._call_one(text=text, text_pair=text_pair, **all_kwargs)
  File "/usr/local/lib/python3.9/dist-packages/transformers/tokenization_utils_base.py", line 2570, in _call_one
    return self.batch_encode_plus(
  File "/usr/local/lib/python3.9/dist-packages/transformers/tokenization_utils_base.py", line 2761, in batch_encode_plus
    return self._batch_encode_plus(
  File "/usr/local/lib/python3.9/dist-packages/transformers/tokenization_utils.py", line 733, in _batch_encode_plus
    first_ids = get_input_ids(ids)
  File "/usr/local/lib/python3.9/dist-packages/transformers/tokenization_utils.py", line 713, in get_input_ids
    raise ValueError(
ValueError: Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers.
```

For full context, here is my dataset, formatted to match the structure of the Pokemon dataset: https://huggingface.co/datasets/pimentooliver/fungi_futures

And here is my modified script:

```shell
!(python main.py \
    -t \
    --base /content/stable-diffusion/configs/stable-diffusion/rewrite_yaml.yaml \
    --gpus "$gpu_list" \
    --scale_lr False \
    --num_nodes 1 \
    --check_val_every_n_epoch 10 \
    --finetune_from "$ckpt_path" \
    data.params.batch_size="$BATCH_SIZE" \
    lightning.trainer.accumulate_grad_batches="$ACCUMULATE_BATCHES" \
    data.params.validation.params.n_gpus="$N_GPUS" \
)
```

Any advice much appreciated, thank you.
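For what it's worth, this ValueError is raised by the Hugging Face tokenizer's input check: every caption handed to it must be a string (or a list/tuple of strings or of integer ids), so a single `None`, NaN, or nested value in the dataset's caption column will crash the step. A quick sanity pass over the rows usually pinpoints the bad entries. A minimal sketch, assuming the caption column is named "text" and the rows are plain dicts (adapt to however your dataset loads):

```python
def find_bad_captions(rows, text_key="text"):
    """Return (index, value) pairs whose caption the tokenizer would
    reject: anything that is not a str, nor a non-empty list/tuple
    of all-str or all-int elements."""
    def is_valid(value):
        if isinstance(value, str):
            return True
        if isinstance(value, (list, tuple)) and value:
            return (all(isinstance(x, str) for x in value)
                    or all(isinstance(x, int) for x in value))
        return False

    return [(i, row.get(text_key)) for i, row in enumerate(rows)
            if not is_valid(row.get(text_key))]

# Example: the None caption in row 1 is exactly the kind of entry
# that triggers the ValueError mid-training.
rows = [{"text": "a photo of a mushroom"},
        {"text": None},
        {"text": ["token", "list"]}]
print(find_bad_captions(rows))  # → [(1, None)]
```

Running this over the full training split before launching the fine-tune catches the problem in seconds instead of at the first optimizer step.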

lvsi-qi commented 1 year ago

Hello, how did you generate your parquet file? After I upload my parquet file, the image column shows up as strings. How can I convert it to image format? (screenshot attached)

pimentoliver commented 1 year ago

Hello, I've just seen this and don't have access to my laptop right now, but I'll send you my code when I can. I'm a bit forgetful, so if I don't respond, feel free to ask again. Alternatively, see the Colab notebooks in my repo /fungal-futures; the code to refer to should be in PT1_DatasetFormatting.


lvsi-qi commented 1 year ago

Hello, could I add your personal contact information? I have some questions to ask you. I would thank you very much.