IanPhilips closed this issue 3 years ago
@IanPhilips The ord map file maintains a mapping between each exact character and an integer label, which I think is not quite the same as the lexicon you mentioned here. For English only the 26 letters of the alphabet are labeled, so the search space cannot be too large :)
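For illustration, here is a minimal sketch of what such a character-to-integer label map could look like. The file name, alphabet, and JSON layout are assumptions for the example, not the repo's actual ord map format.

```python
import json

# Hypothetical character-level label map: each character (not each word)
# gets its own integer class id. Layout is illustrative only.
char_to_label = {ch: idx for idx, ch in enumerate("abcdefghijklmnopqrstuvwxyz0123456789")}
label_to_char = {idx: ch for ch, idx in char_to_label.items()}

with open("char_map_example.json", "w") as f:
    json.dump(char_to_label, f, indent=2)

# Encoding a ground-truth string for training is just a per-character lookup:
print([char_to_label[c] for c in "abc123"])  # -> [0, 1, 2, 27, 28, 29]
```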
Yes, I was thinking more about the word mappings rather than the character mappings. Your neural net learns to map certain strings of letters from the recurrent layers to certain words at the transcription layer. If I were to use the pretrained model to identify something like a serial number on a product (a random string of letters and numbers), it would not do as well because it's looking for words in the lexicon, right?
@IanPhilips It will do just as well. The ord map only contains single characters, not words. Besides, I have used it for verification code recognition, where the codes were composed of random numbers :)
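To make the character-level point concrete, here is a hedged sketch of greedy (best-path) CTC decoding over a character alphabet. Because decoding emits one character at a time and there is no word lookup, the output can be any string over the alphabet, including serial numbers or verification codes. The alphabet and toy logits are made up for the example and are not taken from this repo.

```python
import numpy as np

alphabet = "abcdefghijklmnopqrstuvwxyz0123456789"
blank_id = len(alphabet)  # CTC blank as the last class

def greedy_ctc_decode(logits):
    """logits: (time_steps, num_classes) array of per-frame class scores."""
    best_path = np.argmax(logits, axis=1)
    decoded = []
    prev = blank_id
    for label in best_path:
        # Collapse repeated labels and drop blanks, per the CTC decoding rule.
        if label != prev and label != blank_id:
            decoded.append(alphabet[label])
        prev = label
    return "".join(decoded)

# Toy input whose best path spells a serial-number-like string.
T, C = 8, len(alphabet) + 1
logits = np.full((T, C), -5.0)
for t, ch in zip(range(4), "a7x9"):
    logits[t, alphabet.index(ch)] = 5.0
logits[4:, blank_id] = 5.0  # pad the remaining frames with blanks

print(greedy_ctc_decode(logits))  # -> "a7x9"
```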
Cool, I'll mess around with that then, thanks!
Hi! Thank you for the wonderful repo.
I'm curious: in your model's case it looks like supplying a lexicon at runtime would not work, since the CRNN learns to map sequences to lexicon labels during training. Are you familiar with any methods that would allow the lexicon to be specified dynamically at each forward pass of the neural network? I'd like to change the lexicon available to the network every time I run it. I can't specify the entire lexicon, as the search space would be too large.
thanks!
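As the replies above explain, the network itself is lexicon-free, so one common approach (roughly in the spirit of the lexicon-based transcription discussed in the original CRNN paper) is to apply the lexicon as a post-processing step after the forward pass: snap the raw prediction to the closest lexicon word by edit distance. Because the lexicon is just an argument to that step, it can change on every call. This is an illustrative sketch, not part of this repo.

```python
def edit_distance(a: str, b: str) -> int:
    """Plain Levenshtein distance via dynamic programming."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # deletion
                                     dp[j - 1] + 1,      # insertion
                                     prev + (ca != cb))  # substitution
    return dp[-1]

def constrain_to_lexicon(prediction, lexicon):
    """Snap a lexicon-free prediction to the closest word in a dynamic lexicon."""
    return min(lexicon, key=lambda word: edit_distance(prediction, word))

# The lexicon is a plain argument, so it can differ on every run:
print(constrain_to_lexicon("he1lo", ["hello", "help", "yellow"]))  # -> "hello"
```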