----------0----------
2023-04-28 00:02:48,257 - INFO - Start from pretrain model: model_pretrain/maxgan_pretrain_16K_5L.pth
2023-04-28 00:02:48,695 - INFO - Starting new training run.
----------0----------
Traceback (most recent call last):
  File "C:\ai\lora-svc\lora-svc-16k\svc_trainer.py", line 44, in <module>
    train(0, args, args.checkpoint_path, hp, hp_str)
  File "C:\ai\lora-svc\lora-svc-16k\utils\train.py", line 107, in train
    trainloader = create_dataloader(hp, True)
  File "C:\ai\lora-svc\lora-svc-16k\utils\dataloader.py", line 35, in create_dataloader
    return DataLoader(dataset=dataset, batch_size=hp.train.batch_size, shuffle=True,
  File "C:\ai\lora-svc\venv\lib\site-packages\torch\utils\data\dataloader.py", line 344, in __init__
    sampler = RandomSampler(dataset, generator=generator)  # type: ignore[arg-type]
  File "C:\ai\lora-svc\venv\lib\site-packages\torch\utils\data\sampler.py", line 107, in __init__
    raise ValueError("num_samples should be a positive integer "
ValueError: num_samples should be a positive integer value, but got num_samples=0
Batch size per GPU : 16
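The traceback bottoms out in PyTorch's RandomSampler: with shuffle=True the DataLoader builds a RandomSampler, and RandomSampler rejects a dataset whose __len__ is 0. In other words, create_dataloader found no training samples, which usually means the preprocessing step did not produce any files for the dataset to pick up. The sketch below is not the project's code; the EmptyDataset class, the batch size, and the error message are placeholders used only to reproduce the same ValueError and show a clearer guard before the loader is constructed.

```python
from torch.utils.data import DataLoader, Dataset


class EmptyDataset(Dataset):
    """Stands in for a dataset whose file scan found nothing (hypothetical)."""

    def __len__(self):
        return 0

    def __getitem__(self, idx):
        raise IndexError(idx)


dataset = EmptyDataset()

# Guarding here gives a clearer message than the RandomSampler traceback above.
if len(dataset) == 0:
    raise RuntimeError(
        "Dataset is empty -- make sure preprocessing produced training files "
        "before running svc_trainer.py"
    )

# With shuffle=True and len(dataset) == 0, this line would raise the same
# "num_samples should be a positive integer value, but got num_samples=0".
trainloader = DataLoader(dataset=dataset, batch_size=16, shuffle=True)
```

Checking the dataset length (or the file list it is built from) before constructing the DataLoader turns the opaque sampler error into a message that points at the actual problem: an empty training set.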