HawkAaron / E2E-ASR

PyTorch Implementations for End-to-End Automatic Speech Recognition

Why we use vocab_size = 62? #15

Closed PeiyanFlying closed 4 years ago

PeiyanFlying commented 4 years ago

Why do we use vocab_size = 62 instead of vocab_size = 10000, as in the RNN-Transducer paper? Is there some reason related to the speech data processing? Many thanks~
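For context on where such a small number could come from: character- or phoneme-level ASR models use vocabularies of only a few dozen symbols, whereas the 10,000-entry vocabulary in the RNN-Transducer paper is word-level. As a purely illustrative sketch (not this repo's actual preprocessing), a character set of letters plus digits happens to contain exactly 62 symbols:

```python
import string

# Hypothetical character-level vocabulary: 52 ASCII letters + 10 digits.
# (This is only an illustration of why character/phoneme-level vocabularies
# are small, not a claim about how E2E-ASR builds its label set.)
chars = list(string.ascii_letters + string.digits)
char2idx = {c: i for i, c in enumerate(chars)}

print(len(char2idx))  # 62 symbols, versus ~10,000 for a word-level vocabulary
```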