sixleaves closed this issue 6 years ago
Read the data / setup part of the readme: if you want to train on the cityscapes dataset you have to save an h5 file. Example dataframes are provided in the data/ directory. Then set directories.train in config.py to point at the saved dataframe.
By the way, if I want to train on my own dataset, how can I do that?
What is your data structure? I don't know how to create it. Here is what I tried:
import h5py
import pandas as pd

f = h5py.File("./data/mytest.h5", "w")
grp = f.create_group("df")
df = pd.DataFrame(["1.png", "2.png"], columns=["path"])
# h5py cannot store a pandas DataFrame or Python strings directly,
# so encode the paths as bytes before writing the dataset
grp.create_dataset(name="path", data=df["path"].values.astype("S"))
f.close()
Here are two methods I wrote that can automatically make the h5 file:
import os

import numpy as np
import pandas as pd


class Data:
    @staticmethod
    def list_dir(rootDir, extensions):
        # Collect every file under rootDir whose name ends with one of the extensions
        if rootDir.endswith("/"):
            rootDir = rootDir.rstrip("/")
        filenames = os.listdir(rootDir)
        filePaths = []
        for filename in filenames:
            for extension in extensions:
                if filename.endswith(extension):
                    filePaths.append(rootDir + "/" + filename)
                    break
        return filePaths

    @staticmethod
    def write_filePaths_to_H5(dir, file_extensions, h5_file_name):
        # Save the collected paths as a one-column dataframe under the key 'df'
        filePaths = Data.list_dir(dir, file_extensions)
        filePaths = np.array(filePaths)
        b = pd.DataFrame({"path": filePaths})
        h5 = pd.HDFStore(h5_file_name, 'w')
        h5['df'] = b
        h5.close()
If you want to train on your personal dataset then just do steps 3 and 4 above. You will need to resize each image so the dimensions are a multiple of 16, because the encoder produces an H/16 x W/16 x C={2,4,8,16} feature map from an [H, W, C=channels] image.
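As a sketch of that resize step (Pillow-based; the function names and the choice to round each side down rather than pad are my assumptions, not the repo's code):

```python
def snap_to_16(x):
    # Round a dimension down to the nearest multiple of 16
    return (x // 16) * 16

def resize_to_multiple_of_16(path_in, path_out):
    # Rescale both sides so the encoder's H/16 x W/16 feature map
    # has integer spatial dimensions
    from PIL import Image  # Pillow
    img = Image.open(path_in)
    w, h = img.size
    img.resize((snap_to_16(w), snap_to_16(h))).save(path_out)
```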
The easiest way to create a dataframe holding the path names would be to glob the files:
files = glob.glob("/path/to/your/dataset/*" + extension)
then call the DataFrame constructor in pandas, name the single column 'path', and use df.to_hdf.
So how do I split the data into training and testing sets? I don't know what principle to use, and I don't really mind which, but I want to know how the split is done. Can I just prepare my own resources according to your project's rules?
What is the compressed image format?
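For the train/test split question, a simple random split of the path list before writing the two h5 files could look like this (a sketch; the 80/20 ratio and the fixed seed are my assumptions):

```python
import random

def split_paths(paths, train_fraction=0.8, seed=0):
    # Shuffle deterministically, then cut the list into train and test parts
    paths = list(paths)
    random.Random(seed).shuffle(paths)
    cut = int(len(paths) * train_fraction)
    return paths[:cut], paths[cut:]
```

Each resulting list can then be saved to its own dataframe, with the training one pointed at by directories.train in config.py.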
Here is the error message; I didn't know how to solve it: