Open NgoJunHaoJason opened 3 years ago
Why is it that the images are normalised this way: https://github.com/sirius-ai/LPRNet_Pytorch/blob/7c976664b3f3879efabeaff59c7a117e49d5f29e/data/load_data.py#L63-L64
I thought the usual way of normalising is to subtract 127.5 and then divide by 127.5, so that the pixel values lie in the range [-1, 1]. By subtracting 127.5 and dividing by 128, the pixel values instead lie in the range [-0.99609375, 0.99609375].
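A minimal sketch of the two variants (assuming uint8 pixel values in [0, 255], as in the repo's preprocessing), showing the resulting ranges:

```python
import numpy as np

# Example pixels covering the extremes of the uint8 range.
img = np.array([0, 127, 128, 255], dtype=np.float32)

# Variant used in load_data.py: subtract 127.5, multiply by 0.0078125 (= 1/128).
repo_norm = (img - 127.5) * 0.0078125

# The more common variant: subtract 127.5, divide by 127.5, mapping to [-1, 1].
usual_norm = (img - 127.5) / 127.5

print(repo_norm.min(), repo_norm.max())    # -0.99609375 0.99609375
print(usual_norm.min(), usual_norm.max())  # -1.0 1.0
```

The difference between the two is a constant scale factor of 127.5/128 ≈ 0.996, so it only slightly shrinks the input range; in practice the first convolution's weights can absorb it.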