Using a model trained on task 2 level 1 with linear filtering, but evaluating it on task 2 level 2 without linear filtering, gave a CER of 0.472 (just below the CER of the recordings themselves, which is 0.474).
Training for 50 epochs on aligned data with extra synthesized t1l1 data gave a CER of 0.483, and 0.444 when the linear filter was applied before evaluation.
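To be concrete, "evaluated with the linear filter first" means the estimated FIR filter is applied to the recording before it goes through the model. A minimal sketch of that step (the `model.transcribe` call and variable names are placeholders, not our actual decoding code):

```python
import numpy as np
from scipy.signal import lfilter

def decode_with_linear_filter(model, audio, filter_taps):
    """Apply the estimated FIR filter to the recording before decoding.
    `model.transcribe` stands in for whatever decode call we use."""
    filtered = lfilter(filter_taps, [1.0], audio)  # FIR filtering of the waveform
    return model.transcribe(filtered)
```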
Training on all of the task 2 data with IR and extra synthesized data per level for 50 epochs gives a CER of 0.506 on task 2 level 2.
Training on all of the task 2 data with IR for 50 epochs gives a CER of 0.504 on task 2 level 2.
Training on IR-filtered data from task 2 level 2 gives a CER of 0.867 when evaluating with the linear filter and 0.845 without it. It seems to overfit the training data quite a lot.
Using a simple regularization with the cvxpy package, I get a mean CER of 0.23, compared to 0.4436 without regularization. The model is a bit slower than we are used to, with a mean real-time factor of about 0.58.
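By "simple regularization" I mean adding an L2 (ridge) penalty on the filter taps when estimating the linear filter by least squares. A minimal sketch of the idea with cvxpy (the tap count, lambda, and the Toeplitz construction are illustrative, not our exact setup):

```python
import numpy as np
import cvxpy as cp
from scipy.linalg import toeplitz

def estimate_filter(recorded, clean, n_taps=128, lam=1e-2):
    """Estimate FIR taps mapping `recorded` towards `clean`, with a ridge
    penalty on the taps. n_taps and lam are illustrative defaults."""
    # Convolution matrix of the recorded signal (Toeplitz structure).
    first_row = np.zeros(n_taps)
    first_row[0] = recorded[0]
    X = toeplitz(recorded, first_row)            # shape: (len(recorded), n_taps)

    h = cp.Variable(n_taps)
    residual = X @ h - clean[: len(recorded)]
    objective = cp.Minimize(cp.sum_squares(residual) + lam * cp.sum_squares(h))
    cp.Problem(objective).solve()
    return h.value
```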
DCCRN finetuned for 30 epochs on aligned task_2_level_2 data had a CER of 0.557.
DCCRN finetuned for 30 epochs on all of the task 2 data had a mean CER of 0.557 on task_2_level_2.
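For anyone comparing against these numbers: the mean CER above is the per-utterance character error rate averaged over the evaluation split. A quick way to compute it (our exact text normalization may differ) is with jiwer:

```python
import jiwer

def mean_cer(references, hypotheses):
    """Average per-utterance character error rate.
    Text normalization (casing, punctuation) is left out here."""
    scores = [jiwer.cer(ref, hyp) for ref, hyp in zip(references, hypotheses)]
    return sum(scores) / len(scores)
```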