LeeDoYup opened this issue 4 years ago
I tried to use the MS-COCO dataset for training. I serialize each MS-COCO image using
`img = tf.gfile.FastGFile(img_path, 'rb').read()` or `img = open(img_path, 'rb').read()`
and save the result into TFRecords.
When I read the records back, I always get an error: the shape of the decoded image does not match the original image shape.
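For reference, this is roughly how I write the records (a minimal TF 1.x sketch; the output filename, the `img_paths` list, and the `image` feature key are just illustrative):

```python
import tensorflow as tf

def _bytes_feature(value):
    return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))

# Write the raw file bytes (still JPEG-compressed) into one Example per image.
with tf.python_io.TFRecordWriter('coco_train.tfrecords') as writer:
    for img_path in img_paths:  # img_paths: list of MS-COCO image files (assumed)
        img = open(img_path, 'rb').read()
        example = tf.train.Example(features=tf.train.Features(
            feature={'image': _bytes_feature(img)}))
        writer.write(example.SerializeToString())
```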
When I change `tf.decode_raw` to `tf.io.decode_image`, the problem goes away. https://github.com/openai/glow/blob/eaff2177693a5d84a1cf8ae19e8e0441715b82f8/data_loaders/get_data.py#L19
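A minimal sketch of the two decode paths on the reader side (TF 1.x graph mode; `serialized_example`, `h`, `w`, and the `image` key are assumptions matching my writer above):

```python
import tensorflow as tf

# serialized_example: one record string read from the TFRecord file (assumed).
features = tf.parse_single_example(
    serialized_example,
    features={'image': tf.FixedLenFeature([], tf.string)})

# Fails: decode_raw reinterprets the *compressed* JPEG byte stream as uint8
# values, so the element count is much smaller than h * w * c and reshaping
# to the original image shape raises an error.
# img = tf.decode_raw(features['image'], tf.uint8)
# img = tf.reshape(img, [h, w, 3])

# Works: decode_image actually decompresses the JPEG/PNG bytes into pixels.
img = tf.io.decode_image(features['image'], channels=3)
```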
I manually checked that the serialized byte strings are smaller than the original image size (h x w x c). What is the reason?