liyunlongaaa closed this issue 1 year ago.
Note that the "Extract embeddings" step outputs both "Fail" and "Success"... maybe that is the key.
Yeah ... check the embedding extracting log ...
But the .sh script seems to use if/else, so why are both printed together?
Oh, I see!
Let me look into it.
Sorry, why does extracting the eval embeddings succeed in the first training stage (without LM) but fail in the LM stage? Is there any difference between them?
It is unrelated to LM. Could you paste the failure log?
Thank you for your help. To verify that the whole pipeline runs end to end, I re-ran the script, training for only 6 epochs in the first stage and only 2 epochs in the LM stage. This time the eval embedding extraction failed in both stages, and I am at a loss.
Preparing datasets ...
Prepare wav.scp for each dataset ...
Prepare train data including CN-Celeb_wav/dev and CN-Celeb2_wav ...
Prepare data for testing ...
Prepare data for enroll ...
Prepare evalution trials ...
Success !!! Now data preparation is done !!!
Covert train and test data to raw...
Start training ...
[ INFO : 2022-12-03 01:00:31,481 ] - exp_dir is: exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6
[ INFO : 2022-12-03 01:00:31,481 ] - <== Passed Arguments ==>
[ INFO : 2022-12-03 01:00:31,481 ] - {'data_type': 'raw',
[ INFO : 2022-12-03 01:00:31,481 ] - 'dataloader_args': {'batch_size': 256,
[ INFO : 2022-12-03 01:00:31,481 ] - 'drop_last': True,
[ INFO : 2022-12-03 01:00:31,481 ] - 'num_workers': 16,
[ INFO : 2022-12-03 01:00:31,481 ] - 'pin_memory': False,
[ INFO : 2022-12-03 01:00:31,481 ] - 'prefetch_factor': 8},
[ INFO : 2022-12-03 01:00:31,481 ] - 'dataset_args': {'aug_prob': 0.6,
[ INFO : 2022-12-03 01:00:31,481 ] - 'fbank_args': {'dither': 1.0,
[ INFO : 2022-12-03 01:00:31,481 ] - 'frame_length': 25,
[ INFO : 2022-12-03 01:00:31,481 ] - 'frame_shift': 10,
[ INFO : 2022-12-03 01:00:31,481 ] - 'num_mel_bins': 80},
[ INFO : 2022-12-03 01:00:31,482 ] - 'num_frms': 200,
[ INFO : 2022-12-03 01:00:31,482 ] - 'resample_rate': 16000,
[ INFO : 2022-12-03 01:00:31,482 ] - 'shuffle': True,
[ INFO : 2022-12-03 01:00:31,482 ] - 'shuffle_args': {'shuffle_size': 2500},
[ INFO : 2022-12-03 01:00:31,482 ] - 'spec_aug': False,
[ INFO : 2022-12-03 01:00:31,482 ] - 'spec_aug_args': {'max_f': 8,
[ INFO : 2022-12-03 01:00:31,482 ] - 'max_t': 10,
[ INFO : 2022-12-03 01:00:31,482 ] - 'num_f_mask': 1,
[ INFO : 2022-12-03 01:00:31,482 ] - 'num_t_mask': 1,
[ INFO : 2022-12-03 01:00:31,482 ] - 'prob': 0.6},
[ INFO : 2022-12-03 01:00:31,482 ] - 'speed_perturb': True},
[ INFO : 2022-12-03 01:00:31,482 ] - 'enable_amp': False,
[ INFO : 2022-12-03 01:00:31,482 ] - 'exp_dir': 'exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6',
[ INFO : 2022-12-03 01:00:31,482 ] - 'gpus': [0],
[ INFO : 2022-12-03 01:00:31,482 ] - 'log_batch_interval': 100,
[ INFO : 2022-12-03 01:00:31,482 ] - 'loss': 'CrossEntropyLoss',
[ INFO : 2022-12-03 01:00:31,482 ] - 'loss_args': {},
[ INFO : 2022-12-03 01:00:31,482 ] - 'margin_scheduler': 'MarginScheduler',
[ INFO : 2022-12-03 01:00:31,482 ] - 'margin_update': {'final_margin': 0.2,
[ INFO : 2022-12-03 01:00:31,482 ] - 'fix_start_epoch': 40,
[ INFO : 2022-12-03 01:00:31,482 ] - 'increase_start_epoch': 20,
[ INFO : 2022-12-03 01:00:31,482 ] - 'increase_type': 'exp',
[ INFO : 2022-12-03 01:00:31,482 ] - 'initial_margin': 0.0,
[ INFO : 2022-12-03 01:00:31,482 ] - 'update_margin': True},
[ INFO : 2022-12-03 01:00:31,482 ] - 'model': 'ResNet34',
[ INFO : 2022-12-03 01:00:31,482 ] - 'model_args': {'embed_dim': 256,
[ INFO : 2022-12-03 01:00:31,482 ] - 'feat_dim': 80,
[ INFO : 2022-12-03 01:00:31,482 ] - 'pooling_func': 'TSTP',
[ INFO : 2022-12-03 01:00:31,482 ] - 'two_emb_layer': False},
[ INFO : 2022-12-03 01:00:31,482 ] - 'model_init': None,
[ INFO : 2022-12-03 01:00:31,482 ] - 'noise_data': '/home/yoos/Documents/data/musan/lmdb',
[ INFO : 2022-12-03 01:00:31,482 ] - 'num_avg': 2,
[ INFO : 2022-12-03 01:00:31,482 ] - 'num_epochs': 6,
[ INFO : 2022-12-03 01:00:31,482 ] - 'optimizer': 'SGD',
[ INFO : 2022-12-03 01:00:31,482 ] - 'optimizer_args': {'momentum': 0.9, 'nesterov': True, 'weight_decay': 0.0001},
[ INFO : 2022-12-03 01:00:31,482 ] - 'projection_args': {'easy_margin': False,
[ INFO : 2022-12-03 01:00:31,482 ] - 'project_type': 'arc_margin',
[ INFO : 2022-12-03 01:00:31,482 ] - 'scale': 32.0},
[ INFO : 2022-12-03 01:00:31,482 ] - 'reverb_data': '/home/yoos/Documents/data/rirs/lmdb',
[ INFO : 2022-12-03 01:00:31,482 ] - 'save_epoch_interval': 5,
[ INFO : 2022-12-03 01:00:31,482 ] - 'scheduler': 'ExponentialDecrease',
[ INFO : 2022-12-03 01:00:31,482 ] - 'scheduler_args': {'final_lr': 5e-05,
[ INFO : 2022-12-03 01:00:31,482 ] - 'initial_lr': 0.1,
[ INFO : 2022-12-03 01:00:31,482 ] - 'warm_from_zero': True,
[ INFO : 2022-12-03 01:00:31,482 ] - 'warm_up_epoch': 6},
[ INFO : 2022-12-03 01:00:31,482 ] - 'seed': 42,
[ INFO : 2022-12-03 01:00:31,482 ] - 'train_data': '/home/yoos/Documents/data/cnceleb_train/raw.list',
[ INFO : 2022-12-03 01:00:31,482 ] - 'train_label': '/home/yoos/Documents/data/cnceleb_train/utt2spk'}
[ INFO : 2022-12-03 01:00:31,969 ] - <== Data statistics ==>
[ INFO : 2022-12-03 01:00:31,969 ] - train data num: 519590, spk num: 2793
[ INFO : 2022-12-03 01:00:32,046 ] - <== Dataloaders ==>
[ INFO : 2022-12-03 01:00:32,046 ] - train dataloaders created
[ INFO : 2022-12-03 01:00:32,046 ] - loader size: 2029
[ INFO : 2022-12-03 01:00:32,046 ] - <== Model ==>
[ INFO : 2022-12-03 01:00:32,122 ] - speaker_model size: 6634336
[ INFO : 2022-12-03 01:00:32,122 ] - Train model from scratch ...
[ INFO : 2022-12-03 01:00:32,134 ] - ResNet(
[ INFO : 2022-12-03 01:00:32,134 ] - (conv1): Conv2d(1, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (layer1): Sequential(
[ INFO : 2022-12-03 01:00:32,134 ] - (0): BasicBlock(
[ INFO : 2022-12-03 01:00:32,134 ] - (conv1): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (conv2): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn2): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,134 ] - )
[ INFO : 2022-12-03 01:00:32,134 ] - (1): BasicBlock(
[ INFO : 2022-12-03 01:00:32,134 ] - (conv1): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (conv2): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn2): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,134 ] - )
[ INFO : 2022-12-03 01:00:32,134 ] - (2): BasicBlock(
[ INFO : 2022-12-03 01:00:32,134 ] - (conv1): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (conv2): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn2): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,134 ] - )
[ INFO : 2022-12-03 01:00:32,134 ] - )
[ INFO : 2022-12-03 01:00:32,134 ] - (layer2): Sequential(
[ INFO : 2022-12-03 01:00:32,134 ] - (0): BasicBlock(
[ INFO : 2022-12-03 01:00:32,134 ] - (conv1): Conv2d(32, 64, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (shortcut): Sequential(
[ INFO : 2022-12-03 01:00:32,134 ] - (0): Conv2d(32, 64, kernel_size=(1, 1), stride=(2, 2), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - )
[ INFO : 2022-12-03 01:00:32,134 ] - )
[ INFO : 2022-12-03 01:00:32,134 ] - (1): BasicBlock(
[ INFO : 2022-12-03 01:00:32,134 ] - (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,134 ] - )
[ INFO : 2022-12-03 01:00:32,134 ] - (2): BasicBlock(
[ INFO : 2022-12-03 01:00:32,134 ] - (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,134 ] - (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,134 ] - (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (3): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (layer3): Sequential(
[ INFO : 2022-12-03 01:00:32,135 ] - (0): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential(
[ INFO : 2022-12-03 01:00:32,135 ] - (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (1): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (2): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (3): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (4): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (5): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,135 ] - (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,135 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - )
[ INFO : 2022-12-03 01:00:32,135 ] - (layer4): Sequential(
[ INFO : 2022-12-03 01:00:32,135 ] - (0): BasicBlock(
[ INFO : 2022-12-03 01:00:32,135 ] - (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,136 ] - (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,136 ] - (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,136 ] - (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,136 ] - (shortcut): Sequential(
[ INFO : 2022-12-03 01:00:32,136 ] - (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
[ INFO : 2022-12-03 01:00:32,136 ] - (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,136 ] - )
[ INFO : 2022-12-03 01:00:32,136 ] - )
[ INFO : 2022-12-03 01:00:32,136 ] - (1): BasicBlock(
[ INFO : 2022-12-03 01:00:32,136 ] - (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,136 ] - (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,136 ] - (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,136 ] - (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,136 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,136 ] - )
[ INFO : 2022-12-03 01:00:32,136 ] - (2): BasicBlock(
[ INFO : 2022-12-03 01:00:32,136 ] - (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,136 ] - (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,136 ] - (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
[ INFO : 2022-12-03 01:00:32,136 ] - (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2022-12-03 01:00:32,136 ] - (shortcut): Sequential()
[ INFO : 2022-12-03 01:00:32,136 ] - )
[ INFO : 2022-12-03 01:00:32,136 ] - )
[ INFO : 2022-12-03 01:00:32,136 ] - (pool): TSTP()
[ INFO : 2022-12-03 01:00:32,136 ] - (seg_1): Linear(in_features=5120, out_features=256, bias=True)
[ INFO : 2022-12-03 01:00:32,136 ] - (seg_bn_1): Identity()
[ INFO : 2022-12-03 01:00:32,136 ] - (seg_2): Identity()
[ INFO : 2022-12-03 01:00:32,136 ] - (projection): ArcMarginProduct(
[ INFO : 2022-12-03 01:00:32,136 ] - in_features=256, out_features=8379, scale=32.0,
[ INFO : 2022-12-03 01:00:32,136 ] - margin=0.0, easy_margin=False
[ INFO : 2022-12-03 01:00:32,136 ] - )
[ INFO : 2022-12-03 01:00:32,136 ] - )
[ INFO : 2022-12-03 01:00:32,410 ] - start_epoch: 1
[ INFO : 2022-12-03 01:00:32,420 ] - <== Loss ==>
[ INFO : 2022-12-03 01:00:32,420 ] - loss criterion is: CrossEntropyLoss
[ INFO : 2022-12-03 01:00:32,420 ] - <== Optimizer ==>
[ INFO : 2022-12-03 01:00:32,420 ] - optimizer is: SGD
[ INFO : 2022-12-03 01:00:32,420 ] - <== Scheduler ==>
[ INFO : 2022-12-03 01:00:32,420 ] - scheduler is: ExponentialDecrease
[ INFO : 2022-12-03 01:00:32,421 ] - <== MarginScheduler ==>
[ INFO : 2022-12-03 01:00:32,422 ] - <========== Training process ==========>
[ INFO : 2022-12-03 01:00:32,422 ] - +----------+----------+----------+----------+----------+----------+
[ INFO : 2022-12-03 01:00:32,423 ] - | Epoch| Batch| Lr| Margin| Loss| Acc|
[ INFO : 2022-12-03 01:00:32,423 ] - +----------+----------+----------+----------+----------+----------+
[ INFO : 2022-12-03 01:00:46,074 ] - Reducer buckets have been rebuilt in this iteration.
[ INFO : 2022-12-03 01:01:37,963 ] - | 1| 100| 0.0030579| 0| 9.5142| 0.11719|
[ INFO : 2022-12-03 01:02:30,460 ] - | 1| 200| 0.0057746| 0| 9.3304| 0.15234|
[ INFO : 2022-12-03 01:03:21,603 ] - | 1| 300| 0.0081512| 0| 9.294| 0.1849|
[ INFO : 2022-12-03 01:04:12,906 ] - | 1| 400| 0.010219| 0| 9.2092| 0.26953|
[ INFO : 2022-12-03 01:05:04,002 ] - | 1| 500| 0.012007| 0| 9.079| 0.43672|
[ INFO : 2022-12-03 01:05:55,343 ] - | 1| 600| 0.01354| 0| 8.9016| 0.74935|
[ INFO : 2022-12-03 01:06:46,647 ] - | 1| 700| 0.014845| 0| 8.6881| 1.1568|
[ INFO : 2022-12-03 01:07:37,732 ] - | 1| 800| 0.015941| 0| 8.4551| 1.8032|
[ INFO : 2022-12-03 01:08:29,148 ] - | 1| 900| 0.016851| 0| 8.216| 2.6393|
[ INFO : 2022-12-03 01:09:20,491 ] - | 1| 1000| 0.017592| 0| 7.9755| 3.6691|
[ INFO : 2022-12-03 01:10:11,564 ] - | 1| 1100| 0.018181| 0| 7.7428| 4.8168|
[ INFO : 2022-12-03 01:11:02,888 ] - | 1| 1200| 0.018635| 0| 7.5176| 6.0501|
[ INFO : 2022-12-03 01:11:54,175 ] - | 1| 1300| 0.018967| 0| 7.3034| 7.3468|
[ INFO : 2022-12-03 01:12:45,351 ] - | 1| 1400| 0.019191| 0| 7.0991| 8.7079|
[ INFO : 2022-12-03 01:13:36,718 ] - | 1| 1500| 0.019318| 0| 6.9121| 10.009|
[ INFO : 2022-12-03 01:14:27,865 ] - | 1| 1600| 0.01936| 0| 6.7324| 11.345|
[ INFO : 2022-12-03 01:15:19,169 ] - | 1| 1700| 0.019325| 0| 6.5631| 12.673|
[ INFO : 2022-12-03 01:16:10,515 ] - | 1| 1800| 0.019224| 0| 6.404| 13.945|
[ INFO : 2022-12-03 01:17:01,597 ] - | 1| 1900| 0.019065| 0| 6.2542| 15.198|
[ INFO : 2022-12-03 01:17:52,082 ] - | 1| 2000| 0.018854| 0| 6.1146| 16.4|
[ INFO : 2022-12-03 01:18:00,444 ] - | 1| 2016| 0.018816| 0| 6.0926| 16.595|
[ INFO : 2022-12-03 01:19:02,188 ] - | 2| 100| 0.018518| 0| 3.2682| 41.699|
[ INFO : 2022-12-03 01:19:54,676 ] - | 2| 200| 0.018214| 0| 3.2327| 42.465|
[ INFO : 2022-12-03 01:20:45,859 ] - | 2| 300| 0.01788| 0| 3.1874| 43.059|
[ INFO : 2022-12-03 01:21:37,244 ] - | 2| 400| 0.017519| 0| 3.137| 43.855|
[ INFO : 2022-12-03 01:22:28,315 ] - | 2| 500| 0.017137| 0| 3.0917| 44.545|
[ INFO : 2022-12-03 01:23:19,624 ] - | 2| 600| 0.016736| 0| 3.0502| 45.173|
[ INFO : 2022-12-03 01:24:11,010 ] - | 2| 700| 0.016322| 0| 3.0144| 45.731|
[ INFO : 2022-12-03 01:25:02,108 ] - | 2| 800| 0.015896| 0| 2.9739| 46.36|
[ INFO : 2022-12-03 01:25:53,435 ] - | 2| 900| 0.015462| 0| 2.9347| 46.975|
[ INFO : 2022-12-03 01:26:44,819 ] - | 2| 1000| 0.015022| 0| 2.9013| 47.508|
[ INFO : 2022-12-03 01:27:35,949 ] - | 2| 1100| 0.014579| 0| 2.8674| 48.044|
[ INFO : 2022-12-03 01:28:27,290 ] - | 2| 1200| 0.014134| 0| 2.8337| 48.616|
[ INFO : 2022-12-03 01:29:18,582 ] - | 2| 1300| 0.01369| 0| 2.8027| 49.127|
[ INFO : 2022-12-03 01:30:09,646 ] - | 2| 1400| 0.013248| 0| 2.772| 49.648|
[ INFO : 2022-12-03 01:31:00,963 ] - | 2| 1500| 0.012809| 0| 2.7433| 50.107|
[ INFO : 2022-12-03 01:31:52,083 ] - | 2| 1600| 0.012375| 0| 2.7157| 50.564|
[ INFO : 2022-12-03 01:32:43,329 ] - | 2| 1700| 0.011946| 0| 2.6884| 51.007|
[ INFO : 2022-12-03 01:33:34,604 ] - | 2| 1800| 0.011524| 0| 2.6608| 51.464|
[ INFO : 2022-12-03 01:34:25,705 ] - | 2| 1900| 0.01111| 0| 2.6345| 51.904|
[ INFO : 2022-12-03 01:35:16,208 ] - | 2| 2000| 0.010703| 0| 2.6112| 52.283|
[ INFO : 2022-12-03 01:35:24,566 ] - | 2| 2016| 0.010639| 0| 2.6075| 52.341|
[ INFO : 2022-12-03 01:36:27,487 ] - | 3| 100| 0.010191| 0| 2.0719| 60.973|
[ INFO : 2022-12-03 01:37:19,817 ] - | 3| 200| 0.0098045| 0| 2.0588| 61.096|
[ INFO : 2022-12-03 01:38:10,974 ] - | 3| 300| 0.0094275| 0| 2.0516| 61.328|
[ INFO : 2022-12-03 01:39:02,306 ] - | 3| 400| 0.0090602| 0| 2.0358| 61.663|
[ INFO : 2022-12-03 01:39:53,405 ] - | 3| 500| 0.0087027| 0| 2.0186| 61.976|
[ INFO : 2022-12-03 01:40:44,760 ] - | 3| 600| 0.0083554| 0| 2.0117| 62.118|
[ INFO : 2022-12-03 01:41:36,076 ] - | 3| 700| 0.0080183| 0| 1.9992| 62.345|
[ INFO : 2022-12-03 01:42:27,158 ] - | 3| 800| 0.0076913| 0| 1.9914| 62.479|
[ INFO : 2022-12-03 01:43:18,454 ] - | 3| 900| 0.0073745| 0| 1.9811| 62.674|
[ INFO : 2022-12-03 01:44:09,839 ] - | 3| 1000| 0.0070679| 0| 1.9697| 62.87|
[ INFO : 2022-12-03 01:45:01,019 ] - | 3| 1100| 0.0067715| 0| 1.9588| 63.08|
[ INFO : 2022-12-03 01:45:52,399 ] - | 3| 1200| 0.006485| 0| 1.9517| 63.211|
[ INFO : 2022-12-03 01:46:43,692 ] - | 3| 1300| 0.0062083| 0| 1.9404| 63.41|
[ INFO : 2022-12-03 01:47:34,875 ] - | 3| 1400| 0.0059415| 0| 1.9295| 63.613|
[ INFO : 2022-12-03 01:48:26,163 ] - | 3| 1500| 0.0056841| 0| 1.9191| 63.799|
[ INFO : 2022-12-03 01:49:17,371 ] - | 3| 1600| 0.0054362| 0| 1.9113| 63.962|
[ INFO : 2022-12-03 01:50:08,602 ] - | 3| 1700| 0.0051974| 0| 1.9019| 64.141|
[ INFO : 2022-12-03 01:50:59,880 ] - | 3| 1800| 0.0049677| 0| 1.8936| 64.307|
[ INFO : 2022-12-03 01:51:50,884 ] - | 3| 1900| 0.0047467| 0| 1.8869| 64.438|
[ INFO : 2022-12-03 01:52:41,368 ] - | 3| 2000| 0.0045342| 0| 1.8785| 64.612|
[ INFO : 2022-12-03 01:52:49,736 ] - | 3| 2016| 0.004501| 0| 1.8763| 64.647|
[ INFO : 2022-12-03 01:53:51,993 ] - | 4| 100| 0.0042725| 0| 1.66| 68.27|
[ INFO : 2022-12-03 01:54:44,457 ] - | 4| 200| 0.0040787| 0| 1.6715| 68.031|
[ INFO : 2022-12-03 01:55:35,629 ] - | 4| 300| 0.0038928| 0| 1.6694| 68.164|
[ INFO : 2022-12-03 01:56:26,931 ] - | 4| 400| 0.0037145| 0| 1.661| 68.354|
[ INFO : 2022-12-03 01:57:18,045 ] - | 4| 500| 0.0035435| 0| 1.656| 68.443|
[ INFO : 2022-12-03 01:58:09,385 ] - | 4| 600| 0.0033795| 0| 1.6529| 68.595|
[ INFO : 2022-12-03 01:59:00,758 ] - | 4| 700| 0.0032225| 0| 1.6503| 68.662|
[ INFO : 2022-12-03 01:59:51,819 ] - | 4| 800| 0.003072| 0| 1.6495| 68.682|
[ INFO : 2022-12-03 02:00:43,139 ] - | 4| 900| 0.002928| 0| 1.6468| 68.754|
[ INFO : 2022-12-03 02:01:34,505 ] - | 4| 1000| 0.0027902| 0| 1.6412| 68.873|
[ INFO : 2022-12-03 02:02:25,597 ] - | 4| 1100| 0.0026583| 0| 1.6359| 68.963|
[ INFO : 2022-12-03 02:03:17,058 ] - | 4| 1200| 0.0025321| 0| 1.6317| 69.055|
[ INFO : 2022-12-03 02:04:08,400 ] - | 4| 1300| 0.0024115| 0| 1.629| 69.129|
[ INFO : 2022-12-03 02:04:59,484 ] - | 4| 1400| 0.0022962| 0| 1.6249| 69.208|
[ INFO : 2022-12-03 02:05:50,800 ] - | 4| 1500| 0.0021861| 0| 1.6228| 69.271|
[ INFO : 2022-12-03 02:06:41,913 ] - | 4| 1600| 0.0020808| 0| 1.618| 69.359|
[ INFO : 2022-12-03 02:07:33,242 ] - | 4| 1700| 0.0019803| 0| 1.6143| 69.44|
[ INFO : 2022-12-03 02:08:24,553 ] - | 4| 1800| 0.0018844| 0| 1.6126| 69.504|
[ INFO : 2022-12-03 02:09:15,598 ] - | 4| 1900| 0.0017927| 0| 1.6088| 69.577|
[ INFO : 2022-12-03 02:10:06,053 ] - | 4| 2000| 0.0017053| 0| 1.6046| 69.666|
[ INFO : 2022-12-03 02:10:14,415 ] - | 4| 2016| 0.0016917| 0| 1.6042| 69.674|
[ INFO : 2022-12-03 02:11:16,613 ] - | 5| 100| 0.0015985| 0| 1.5455| 70.715|
[ INFO : 2022-12-03 02:12:09,077 ] - | 5| 200| 0.00152| 0| 1.5492| 70.646|
[ INFO : 2022-12-03 02:13:00,238 ] - | 5| 300| 0.0014452| 0| 1.5444| 70.784|
[ INFO : 2022-12-03 02:13:51,580 ] - | 5| 400| 0.0013738| 0| 1.536| 70.99|
[ INFO : 2022-12-03 02:14:42,665 ] - | 5| 500| 0.0013058| 0| 1.533| 71.066|
[ INFO : 2022-12-03 02:15:34,001 ] - | 5| 600| 0.001241| 0| 1.5275| 71.176|
[ INFO : 2022-12-03 02:16:25,308 ] - | 5| 700| 0.0011793| 0| 1.5256| 71.219|
[ INFO : 2022-12-03 02:17:16,473 ] - | 5| 800| 0.0011205| 0| 1.5217| 71.229|
[ INFO : 2022-12-03 02:18:07,716 ] - | 5| 900| 0.0010645| 0| 1.5174| 71.333|
[ INFO : 2022-12-03 02:18:59,053 ] - | 5| 1000| 0.0010111| 0| 1.5115| 71.433|
[ INFO : 2022-12-03 02:19:50,090 ] - | 5| 1100|0.00096037| 0| 1.5098| 71.462|
[ INFO : 2022-12-03 02:20:41,329 ] - | 5| 1200|0.00091203| 0| 1.5061| 71.548|
[ INFO : 2022-12-03 02:21:32,595 ] - | 5| 1300|0.00086603| 0| 1.5033| 71.611|
[ INFO : 2022-12-03 02:22:23,702 ] - | 5| 1400|0.00082225| 0| 1.5001| 71.646|
[ INFO : 2022-12-03 02:23:15,028 ] - | 5| 1500| 0.0007806| 0| 1.4983| 71.666|
[ INFO : 2022-12-03 02:24:06,203 ] - | 5| 1600|0.00074098| 0| 1.4958| 71.714|
[ INFO : 2022-12-03 02:24:57,441 ] - | 5| 1700| 0.0007033| 0| 1.4946| 71.72|
[ INFO : 2022-12-03 02:25:48,705 ] - | 5| 1800|0.00066746| 0| 1.4937| 71.718|
[ INFO : 2022-12-03 02:26:39,789 ] - | 5| 1900|0.00063339| 0| 1.4915| 71.762|
[ INFO : 2022-12-03 02:27:30,228 ] - | 5| 2000|0.00060099| 0| 1.4908| 71.767|
[ INFO : 2022-12-03 02:27:38,574 ] - | 5| 2016|0.00059596| 0| 1.4909| 71.766|
[ INFO : 2022-12-03 02:28:40,379 ] - | 6| 100|0.00056156| 0| 1.4484| 72.684|
[ INFO : 2022-12-03 02:29:32,842 ] - | 6| 200|0.00053272| 0| 1.4583| 72.447|
[ INFO : 2022-12-03 02:30:23,985 ] - | 6| 300|0.00050531| 0| 1.4556| 72.465|
[ INFO : 2022-12-03 02:31:15,241 ] - | 6| 400|0.00047927| 0| 1.4564| 72.448|
[ INFO : 2022-12-03 02:32:06,305 ] - | 6| 500|0.00045454| 0| 1.4566| 72.499|
[ INFO : 2022-12-03 02:32:57,650 ] - | 6| 600|0.00043104| 0| 1.4595| 72.471|
[ INFO : 2022-12-03 02:33:48,963 ] - | 6| 700|0.00040872| 0| 1.4595| 72.474|
[ INFO : 2022-12-03 02:34:40,021 ] - | 6| 800|0.00038752| 0| 1.4566| 72.477|
[ INFO : 2022-12-03 02:35:31,300 ] - | 6| 900|0.00036739| 0| 1.4509| 72.57|
[ INFO : 2022-12-03 02:36:22,623 ] - | 6| 1000|0.00034828| 0| 1.4483| 72.57|
[ INFO : 2022-12-03 02:37:13,663 ] - | 6| 1100|0.00033013| 0| 1.4503| 72.549|
[ INFO : 2022-12-03 02:38:04,983 ] - | 6| 1200|0.00031291| 0| 1.449| 72.59|
[ INFO : 2022-12-03 02:38:56,273 ] - | 6| 1300|0.00029656| 0| 1.4494| 72.586|
[ INFO : 2022-12-03 02:39:47,406 ] - | 6| 1400|0.00028105| 0| 1.4498| 72.564|
[ INFO : 2022-12-03 02:40:38,807 ] - | 6| 1500|0.00026632| 0| 1.4479| 72.603|
[ INFO : 2022-12-03 02:41:29,931 ] - | 6| 1600|0.00025235| 0| 1.4492| 72.579|
[ INFO : 2022-12-03 02:42:21,223 ] - | 6| 1700| 0.0002391| 0| 1.4483| 72.625|
[ INFO : 2022-12-03 02:43:12,508 ] - | 6| 1800|0.00022652| 0| 1.4471| 72.641|
[ INFO : 2022-12-03 02:44:03,550 ] - | 6| 1900|0.00021459| 0| 1.4464| 72.655|
[ INFO : 2022-12-03 02:44:54,014 ] - | 6| 2000|0.00020328| 0| 1.4468| 72.662|
[ INFO : 2022-12-03 02:45:02,359 ] - | 6| 2016|0.00020152| 0| 1.4464| 72.674|
[ INFO : 2022-12-03 02:45:02,428 ] - +----------+----------+----------+----------+----------+----------+
Namespace(dst_model='exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/models/avg_model.pt', src_path='exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/models', num=2, min_epoch=0, max_epoch=65536)
['exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/models/model_5.pt', 'exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/models/model_6.pt']
Processing exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/models/model_5.pt
Processing exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/models/model_6.pt
Saving to exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/models/avg_model.pt
Extract embeddings ...
extract_embedding from /home/yoos/Documents/data/cnceleb_train/raw.list, wavs_num: 519590
extract_embedding from /home/yoos/Documents/data/eval/raw.list, wavs_num: 18772
Fail eval
Success cnceleb_train
Embedding dir is (exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/embeddings).
mean vector of enroll
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 196/196 [00:00<00:00, 70239.54it/s]
Score ...
apply cosine scoring ...
CNC-Eval-Concat.lst
Calculate mean statistics from exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6/embeddings/cnceleb_train/xvector.scp.
scoring trial CNC-Eval-Concat.lst: 0%| | 4884/3484292 [00:00<02:59, 19340.65it/s]
Traceback (most recent call last):
File "/home/yoos/Documents/code/wespeaker/examples/cnceleb/v2/wespeaker/bin/score.py", line 96, in
Sorry, why does extracting the eval embeddings succeed in the first training stage (without LM) but fail in the LM stage? Is there any difference between them?
Hi, the log shows that the program did not successfully extract the embeddings of the eval dataset. You should check 'exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch3-LM/embeddings/eval/log/**.log'. I guess it is an OOM error, because num_frms is 600 in the LM stage.
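The OOM guess is consistent with how the input size scales: the fbank input grows linearly with num_frms, so going from 200 frames (first stage) to 600 frames (LM stage) triples the per-batch input, and activation memory grows roughly in proportion. A back-of-envelope sketch using the training-config numbers from the log above (batch size 256, 80 mel bins); this is purely illustrative, not a measured memory figure:

```shell
#!/bin/sh
# Back-of-envelope: fbank input bytes per batch = batch * frames * mel_bins * 4 (float32).
# Illustrative only -- real GPU usage is dominated by activations, not the raw input,
# and the extraction batch size may differ from the training one used here.
stage1=$(( 256 * 200 * 80 * 4 ))   # first stage: num_frms=200
lm=$(( 256 * 600 * 80 * 4 ))       # LM stage:    num_frms=600
echo "LM-stage input is $(( lm / stage1 ))x the first-stage input per batch"
```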
d00922-interview-05-003.
Wow, thank you!!! You are awesome!
Traceback (most recent call last):
File "/home/yoos/Documents/code/wespeaker/examples/cnceleb/v2/wespeaker/bin/extract.py", line 93, in <module>
fire.Fire(extract)
File "/home/yoos/miniconda3/envs/wespeaker/lib/python3.9/site-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/home/yoos/miniconda3/envs/wespeaker/lib/python3.9/site-packages/fire/core.py", line 466, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/home/yoos/miniconda3/envs/wespeaker/lib/python3.9/site-packages/fire/core.py", line 681, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/home/yoos/Documents/code/wespeaker/examples/cnceleb/v2/wespeaker/bin/extract.py", line 46, in extract
model.to(device).eval()
File "/home/yoos/miniconda3/envs/wespeaker/lib/python3.9/site-packages/torch/nn/modules/module.py", line 899, in to
return self._apply(convert)
File "/home/yoos/miniconda3/envs/wespeaker/lib/python3.9/site-packages/torch/nn/modules/module.py", line 570, in _apply
module._apply(fn)
File "/home/yoos/miniconda3/envs/wespeaker/lib/python3.9/site-packages/torch/nn/modules/module.py", line 593, in _apply
param_applied = fn(param)
File "/home/yoos/miniconda3/envs/wespeaker/lib/python3.9/site-packages/torch/nn/modules/module.py", line 897, in convert
return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
RuntimeError: CUDA error: out of memory
CUDA kernel errors might be asynchronously reported at some other API call,so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1.
How can I modify the configuration to avoid this? My GPU is a 3090 with 24 GB of memory.
Currently, the embeddings of cnceleb_train have been extracted successfully. You can modify the code in 'extract_cnc.sh' to extract only the eval embeddings.
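In case it helps, a hypothetical sketch of what restricting the per-dataset loop in extract_cnc.sh could look like; the loop variable, paths, and echo line here are illustrative assumptions, not the actual script contents:

```shell
#!/bin/sh
# Hypothetical sketch of a per-dataset extraction loop (names are assumptions).
exp_dir=exp/ResNet34-TSTP-emb256-fbank80-num_frms200-aug0.6-spTrue-saFalse-ArcMargin-SGD-epoch6

# A loop of the form
#   for dset in cnceleb_train eval; do ...
# would extract both; to re-run only the failed dataset, keep just eval:
for dset in eval; do
  echo "extracting embeddings for ${dset} into ${exp_dir}/embeddings/${dset}"
  # python wespeaker/bin/extract.py ...   # real extract command omitted
done
```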
Wow, thank you! That works. But it is a bit cumbersome; if I want both extractions to succeed in a single run, what should I do?
Maybe you can set batch_size_array=(8 1).
It doesn't work... Why do these two extraction processes occupy GPU memory at the same time?
Oh, I removed the '&' in the script, and it works! Thank you very much!
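That explains the simultaneous GPU usage: a trailing '&' launches each extraction as a background job, so both processes are alive and holding GPU memory at once until a `wait`; without it, the shell blocks on each command, and the first process frees its memory before the second starts. A minimal sketch of the difference, with `sleep` standing in for the extract command (the commented commands are illustrative, not the real script):

```shell
#!/bin/sh
# With '&' (concurrent -- both hold GPU memory at the same time):
#   python extract.py --data cnceleb_train ... &
#   python extract.py --data eval ... &
#   wait
#
# Without '&' (sequential -- first job exits and frees memory first):
#   python extract.py --data cnceleb_train ...
#   python extract.py --data eval ...

# Demonstration: two backgrounded 1-second jobs overlap instead of adding up.
start=$(date +%s)
sleep 1 &   # job 1 runs in the background
sleep 1 &   # job 2 starts immediately, overlapping job 1
wait        # block until both finish
elapsed=$(( $(date +%s) - start ))
echo "two overlapping 1s jobs finished in ${elapsed}s"
```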
Hi, sorry to bother you again. The cosine scoring step seems to hit an error, but I don't know how to solve it. The files are generated by the .sh script, so there should be no file mismatch...