akmalkadi opened this issue 3 years ago
I have added updates to the question. I hope someone here is reading this.
Update3: I scaled the images from 1900x17 to 1900x27 and training now starts without error messages. Is there a reason why heights below 27 pixels don't work? Is there anything I can do other than scaling to solve this issue? And if scaling is the right fix for me, what is the justification for this step?
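For what it's worth, I tried to reproduce the 27-pixel threshold from the downsampling arithmetic of the inception_v3_base stem alone. The sketch below is pure Python; the kernel/stride/padding values are my reading of the tf.slim inception_v3 definition (please verify against your checkout), not anything from attention_ocr itself:

```python
def layer_out(h, k, s, padding):
    """Output size along one axis for a conv/pool layer,
    using TensorFlow's 'SAME'/'VALID' arithmetic."""
    if padding == "SAME":
        return -(-h // s)          # ceil(h / s)
    return (h - k) // s + 1        # 'VALID': needs h >= k

# Stem of inception_v3_base up to MaxPool_5a_3x3 -- the layers that
# shrink the height before the Mixed_5* blocks. (kernel, stride, padding)
# values assumed from the tf.slim inception_v3 source.
STEM = [
    (3, 2, "VALID"),  # Conv2d_1a_3x3
    (3, 1, "VALID"),  # Conv2d_2a_3x3
    (3, 1, "SAME"),   # Conv2d_2b_3x3
    (3, 2, "VALID"),  # MaxPool_3a_3x3
    (1, 1, "SAME"),   # Conv2d_3b_1x1
    (3, 1, "VALID"),  # Conv2d_4a_3x3
    (3, 2, "VALID"),  # MaxPool_5a_3x3
]

def height_after_stem(h):
    """Trace an input height through the stem; <1 means a
    'VALID' layer ran out of pixels (the shape error)."""
    for k, s, pad in STEM:
        h = layer_out(h, k, s, pad)
        if h < 1:
            return h
    return h

print(height_after_stem(17))  # 0 -> fails, matching my error
print(height_after_stem(27))  # 1 -> smallest height that survives
```

Under these assumptions, every height from 1 to 26 collapses to zero rows somewhere in the stem, and 27 is the smallest input that reaches the Mixed_5* blocks with at least one row left, which would explain what I am seeing.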
thanks
Greetings,
I have an issue with training, but first I would like to ask for clarification about the following: "Having stored our cropped images of equal sizes in a different directory…"
Do you mean by "equal sizes" that all images should have the same width and height (squares)? Or is it okay for them to be rectangles, as long as they all have the same dimensions?
What confuses me is that all the images used in the example are squares (200x200). Also, after building my dataset with equal-size rectangles (1900x17), I got the following error during the training step:
Please help me to solve this issue. Thank you
Update1:
The error comes from this line in model.py:
net, _ = inception.inception_v3_base(images, final_endpoint=mparams.final_endpoint)
I tried to debug the images object and got:
Output:
Update2: I just realized that my issue occurs because the image height in number_plates.py is less than 26. With 26 and above I don't get the same error. In my case the height is 17, and it will mostly be less than 26. Any idea how to change this limit?
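One alternative I am considering instead of changing anything inside the network: pad the images up to the minimum height rather than scaling them, which keeps the glyphs' aspect ratio intact. A minimal NumPy sketch (27 is the smallest height that started training for me; the pad_height helper and the edge-replication padding mode are my own choices, not part of attention_ocr):

```python
import numpy as np

def pad_height(img, target=27):
    """Pad an HxWxC image along the height axis up to `target` rows.

    Images already at least `target` rows tall are returned unchanged.
    """
    h = img.shape[0]
    if h >= target:
        return img
    top = (target - h) // 2
    bottom = target - h - top
    # Replicate the border rows so no artificial black bands appear.
    return np.pad(img, ((top, bottom), (0, 0), (0, 0)), mode="edge")

plate = np.zeros((17, 1900, 3), dtype=np.uint8)
print(pad_height(plate).shape)  # (27, 1900, 3)
```

This would run once while preparing the cropped images, before they are written to the dataset directory.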