EveryVoiceTTS / EveryVoice

The EveryVoice TTS Toolkit - Text To Speech for your language
https://docs.everyvoice.ca

TypeError: __init__() got an unexpected keyword argument 'sub_dir_callable' #200

Closed SamuelLarkin closed 11 months ago

SamuelLarkin commented 11 months ago

I trained a tiny system and am now trying to use it to test an invalid multispeaker ID, and I'm getting the following error.

Code: 8908facc8

Command

everyvoice \
  synthesize \
    text-to-wav \
      logs_and_checkpoints/FeaturePredictionExperiment/base/checkpoints/last.ckpt \
      -a cpu \
      -d 1 \
      --vocoder-path ../generator_universal.pth.tar \
      --filelist error.psv

Notes

The sub_dir_callable key comes from the config files:

training:
  logger:
    name: AlignerExperiment
    save_dir: ../logs_and_checkpoints
    sub_dir_callable: everyvoice.utils.get_current_time
    version: base
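For context, the crash happens because every key of that logger config block gets splatted as keyword arguments into a logger constructor that has no sub_dir_callable parameter. A minimal sketch of the failure mode, using a hypothetical FakeLogger class standing in for Lightning's TensorBoardLogger:

```python
# Stand-in for lightning's TensorBoardLogger: like the real class,
# it only accepts the keyword arguments it declares.
class FakeLogger:
    def __init__(self, name, save_dir, version):
        self.name = name
        self.save_dir = save_dir
        self.version = version


config = {
    "name": "AlignerExperiment",
    "save_dir": "../logs_and_checkpoints",
    "sub_dir_callable": "everyvoice.utils.get_current_time",
    "version": "base",
}

try:
    FakeLogger(**config)  # the extra key is rejected
except TypeError as e:
    print(e)  # message names the unexpected keyword 'sub_dir_callable'

# Dropping the extra key before splatting succeeds:
FakeLogger(**{k: v for k, v in config.items() if k != "sub_dir_callable"})
```

The same pattern explains why the traceback below ends inside SummaryWriter(log_dir=..., **self._kwargs): the unexpected key survives all the way down to a constructor that refuses it.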

error.psv

Note that LJ002-0112 has speaker 44, while the model was only trained with speakers {0, 1, 2, 3}.

basename|text|Printing, in the only sense with which we are at present concerned, differs from most if not from all the arts and crafts represented in the Exhibition|language|speaker
LJ002-0112|The frequency and extent of processes against debtors seventy or eighty years ago will appear almost incredible|The frequency and extent of processes against debtors seventy or eighty years ago will appear almost incredible|str|44
LJ002-0079|Its name and its situation were the same as those of the old place of carrying out the terrible sentence inflicted on accused persons who stood mute.|Its name and its situation were the same as those of the old place of carrying out the terrible sentence inflicted on accused persons who stood mute.|git|0
LJ001-0047|Of Jenson it must be said that he carried the development of Roman type as far as it can go:|Of Jenson it must be said that he carried the development of Roman type as far as it can go:|str|3

Log

/home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/torch/cuda/__init__.py:546: UserWarning: Can't initialize NVML
  warnings.warn("Can't initialize NVML")
2023-12-13 11:06:40.323 | INFO     | everyvoice.model.feature_prediction.FastSpeech2_lightning.fs2.cli:synthesize:404 - Loading checkpoint from logs_and_checkpoints/FeaturePredictionExperiment/base/checkpoints/last.ckpt
2023-12-13 11:06:42.289 | INFO     | everyvoice.model.feature_prediction.FastSpeech2_lightning.fs2.cli:__init__:514 - Saving output to synthesis_output/synthesized_spec
/home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/lightning_fabric/plugins/environments/slurm.py:191: The `srun` command is available on your system but is not used. HINT: If your intention is to run Lightning on SLURM, prepend your python command with `srun` like so: srun python /home/sam037/.conda/envs/EveryVoice.sl/bin/everyvoic ...
GPU available: False, used: False
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
/home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/lightning_fabric/plugins/environments/slurm.py:191: The `srun` command is available on your system but is not used. HINT: If your intention is to run Lightning on SLURM, prepend your python command with `srun` like so: srun python /home/sam037/.conda/envs/EveryVoice.sl/bin/everyvoic ...
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /fs/hestia_Hnrc/ict/sam037/git/EveryVoice/everyvoice/model/feature_prediction/FastSpeech2_lightn │
│ ing/fs2/cli.py:646 in synthesize                                                                 │
│                                                                                                  │
│   643 │   │   │   │   )                                                                          │
│   644 │   │   │   ],                                                                             │
│   645 │   │   )                                                                                  │
│ ❱ 646 │   │   trainer.predict(model, data)                                                       │
│   647                                                                                            │
│   648                                                                                            │
│   649 if __name__ == "__main__":                                                                 │
│                                                                                                  │
│ /home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/pytorch_lightning/trainer/tra │
│ iner.py:864 in predict                                                                           │
│                                                                                                  │
│    861 │   │   self.state.fn = TrainerFn.PREDICTING                                              │
│    862 │   │   self.state.status = TrainerStatus.RUNNING                                         │
│    863 │   │   self.predicting = True                                                            │
│ ❱  864 │   │   return call._call_and_handle_interrupt(                                           │
│    865 │   │   │   self, self._predict_impl, model, dataloaders, datamodule, return_predictions  │
│    866 │   │   )                                                                                 │
│    867                                                                                           │
│                                                                                                  │
│ /home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/pytorch_lightning/trainer/cal │
│ l.py:44 in _call_and_handle_interrupt                                                            │
│                                                                                                  │
│    41 │   try:                                                                                   │
│    42 │   │   if trainer.strategy.launcher is not None:                                          │
│    43 │   │   │   return trainer.strategy.launcher.launch(trainer_fn, *args, trainer=trainer,    │
│ ❱  44 │   │   return trainer_fn(*args, **kwargs)                                                 │
│    45 │                                                                                          │
│    46 │   except _TunerExitException:                                                            │
│    47 │   │   _call_teardown_hook(trainer)                                                       │
│                                                                                                  │
│ /home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/pytorch_lightning/trainer/tra │
│ iner.py:903 in _predict_impl                                                                     │
│                                                                                                  │
│    900 │   │   ckpt_path = self._checkpoint_connector._select_ckpt_path(                         │
│    901 │   │   │   self.state.fn, ckpt_path, model_provided=model_provided, model_connected=sel  │
│    902 │   │   )                                                                                 │
│ ❱  903 │   │   results = self._run(model, ckpt_path=ckpt_path)                                   │
│    904 │   │                                                                                     │
│    905 │   │   assert self.state.stopped                                                         │
│    906 │   │   self.predicting = False                                                           │
│                                                                                                  │
│ /home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/pytorch_lightning/trainer/tra │
│ iner.py:950 in _run                                                                              │
│                                                                                                  │
│    947 │   │   self.strategy.setup_environment()                                                 │
│    948 │   │   self.__setup_profiler()                                                           │
│    949 │   │                                                                                     │
│ ❱  950 │   │   call._call_setup_hook(self)  # allow user to setup lightning_module in accelerat  │
│    951 │   │                                                                                     │
│    952 │   │   # check if we should delay restoring checkpoint till later                        │
│    953 │   │   if not self.strategy.restore_checkpoint_after_setup:                              │
│                                                                                                  │
│ /home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/pytorch_lightning/trainer/cal │
│ l.py:86 in _call_setup_hook                                                                      │
│                                                                                                  │
│    83 │                                                                                          │
│    84 │   # Trigger lazy creation of experiment in loggers so loggers have their metadata avai   │
│    85 │   for logger in trainer.loggers:                                                         │
│ ❱  86 │   │   if hasattr(logger, "experiment"):                                                  │
│    87 │   │   │   _ = logger.experiment                                                          │
│    88 │                                                                                          │
│    89 │   trainer.strategy.barrier("pre_setup")                                                  │
│                                                                                                  │
│ /home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/lightning_fabric/loggers/logg │
│ er.py:118 in experiment                                                                          │
│                                                                                                  │
│   115 │   │   """                                                                                │
│   116 │   │   if rank_zero_only.rank > 0:                                                        │
│   117 │   │   │   return _DummyExperiment()                                                      │
│ ❱ 118 │   │   return fn(self)                                                                    │
│   119 │                                                                                          │
│   120 │   return experiment                                                                      │
│   121                                                                                            │
│                                                                                                  │
│ /home/sam037/.conda/envs/EveryVoice.sl/lib/python3.9/site-packages/lightning_fabric/loggers/tens │
│ orboard.py:191 in experiment                                                                     │
│                                                                                                  │
│   188 │   │   else:                                                                              │
│   189 │   │   │   from tensorboardX import SummaryWriter  # type: ignore[no-redef]               │
│   190 │   │                                                                                      │
│ ❱ 191 │   │   self._experiment = SummaryWriter(log_dir=self.log_dir, **self._kwargs)             │
│   192 │   │   return self._experiment                                                            │
│   193 │                                                                                          │
│   194 │   @rank_zero_only                                                                        │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
TypeError: __init__() got an unexpected keyword argument 'sub_dir_callable'
roedoejet commented 11 months ago

ah yes, we need to change this line to:

tensorboard_logger = TensorBoardLogger(
    **(model.config.training.logger.model_dump(exclude={"sub_dir_callable"}))
)
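For reference, pydantic v2's model_dump(exclude={...}) omits the named fields from the dumped dict, so the remaining keys can be splatted into TensorBoardLogger safely. A short sketch with a hypothetical LoggerConfig model (not EveryVoice's actual config class):

```python
from pydantic import BaseModel


# Hypothetical stand-in for the logger section of the training config.
class LoggerConfig(BaseModel):
    name: str = "AlignerExperiment"
    save_dir: str = "../logs_and_checkpoints"
    sub_dir_callable: str = "everyvoice.utils.get_current_time"
    version: str = "base"


# exclude= drops the offending field before the dict is splatted.
kwargs = LoggerConfig().model_dump(exclude={"sub_dir_callable"})
assert "sub_dir_callable" not in kwargs
print(sorted(kwargs))  # only the keys TensorBoardLogger accepts remain
```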