jakeret / tf_unet

Generic U-Net Tensorflow implementation for image segmentation
GNU General Public License v3.0
1.9k stars 748 forks

How to run this project with jpg files? #248

Closed Ekinkit closed 5 years ago

Ekinkit commented 5 years ago

I am new to machine learning, so sorry for such a basic question. I have read the files in this GitHub repo and found that many people use .tif files as their input. Here is my problem: how can I use image_gen.py to create my own train/val dataset from .jpg/.png images? Do I need to convert them to another format? Can someone give me some ideas to start training?

jakeret commented 5 years ago

Have a look at the documentation of the ImageDataProvider.
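In case the docs alone don't make it concrete: tf_unet's `ImageDataProvider` pairs each data image with its label by filename suffix. Below is a minimal standalone sketch of that pairing convention, using only Pillow and NumPy (the function name `load_pairs` and the default suffixes are my own assumptions, not tf_unet's API):

```python
import glob
import numpy as np
from PIL import Image

def load_pairs(search_path, data_suffix=".jpg", mask_suffix="_mask.jpg"):
    """Collect (image, label) pairs the way a suffix-based data provider
    typically does: every file matching search_path is treated as a data
    image unless its name contains the mask suffix; the matching label is
    found by swapping the data suffix for the mask suffix."""
    all_files = glob.glob(search_path)
    data_files = [f for f in all_files if mask_suffix not in f]
    pairs = []
    for img_path in data_files:
        label_path = img_path.replace(data_suffix, mask_suffix)
        img = np.asarray(Image.open(img_path), dtype=np.float32)
        label = np.asarray(Image.open(label_path), dtype=bool)
        pairs.append((img, label))
    return pairs
```

With this layout, `load_pairs("data/train/*.jpg")` picks up `img1.jpg` / `img1_mask.jpg` pairs, which mirrors how tf_unet expects a search path plus `data_suffix` / `mask_suffix` arguments.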

brandries commented 5 years ago

I have found it easiest to do a quick conversion from whatever format your images are in to .tif; in the process you can also preprocess your images. However, as noted in the documentation, you can specify any format as long as it is readable by Pillow.

Ekinkit commented 5 years ago

> I have found it easiest to do a quick conversion from whatever format your images are in to tif. You can in the process also then preprocess your images. However, as in the documentation, you can specify your format as long as it is readable by Pillow.

Thank you so much!

soroushr commented 5 years ago

@Ekinkit I don't know if you were successful with jpg, but here is my experience: my training images were .jpg and my labels were VGG JSON files. I converted the JSON labels to .jpg, but when I zoomed into the jpg labels as far as possible, I found that the conversion from JSON mask to binary jpg was not precisely binary: there were several grey pixels instead of pure black and white. So I had to convert everything to .png, and then all looked good. I think I will avoid JPG for masks from now on!
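The grey pixels described above come from JPEG's lossy compression smearing values near mask borders, which is why a lossless format like PNG is the safer choice for labels. A small sketch for checking and fixing such a mask (the helper name `binarize_mask` and the threshold of 128 are my own assumptions):

```python
import numpy as np
from PIL import Image

def binarize_mask(path, out_path, threshold=128):
    """Load a supposedly binary mask, report which pixel values it actually
    contains, threshold it to pure 0/255, and save it losslessly as PNG."""
    arr = np.asarray(Image.open(path).convert("L"))
    unique = np.unique(arr)  # anything beyond {0, 255} indicates grey pixels
    binary = np.where(arr >= threshold, 255, 0).astype(np.uint8)
    Image.fromarray(binary).save(out_path)  # PNG: no compression artifacts
    return unique
```

Running this over a JPEG-saved mask typically reveals a spread of intermediate grey values; after thresholding and re-saving as .png, the mask stays exactly binary.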

Ekinkit commented 5 years ago

@soroushr Thank you for your reply. I have successfully trained the model with .jpg input files using Keras. This repo gave me some inspiration: https://github.com/ShawDa/unet-rgb