microsoft / AutonomousDrivingCookbook

Scenarios, tutorials and demos for Autonomous Driving
MIT License

Raise Stop Iteration in the next step of DataExplorationAndPreparation in AirSimE2EDeepLearning #94

Closed iAmCorey closed 5 years ago

iAmCorey commented 5 years ago

Problem description

When I run the last code block in DataExplorationAndPreparation.ipynb, the generator raises StopIteration while it is processing G:/Autonomous/Dataset/data_cooked/train.h5 (the train.h5 file). I have hit the same problem on both Windows and Linux. Only the train.h5 file ends up in my data_cooked folder, with no test file, so I can't do the next step of this tutorial.

Problem details

The error output was like this:

Reading data from G:/Autonomous/Dataset/data_raw/normal_1...
Reading data from G:/Autonomous/Dataset/data_raw/normal_2...
Reading data from G:/Autonomous/Dataset/data_raw/normal_3...
Reading data from G:/Autonomous/Dataset/data_raw/normal_4...
Reading data from G:/Autonomous/Dataset/data_raw/normal_5...
Reading data from G:/Autonomous/Dataset/data_raw/normal_6...
Reading data from G:/Autonomous/Dataset/data_raw/swerve_1...
Reading data from G:/Autonomous/Dataset/data_raw/swerve_2...
Reading data from G:/Autonomous/Dataset/data_raw/swerve_3...
Processing G:/Autonomous/Dataset/data_cooked/train.h5...
---------------------------------------------------------------------------
StopIteration                             Traceback (most recent call last)
G:\Autonomous\Tutorials\AirSimE2EDeepLearning\Cooking.py in generatorForH5py(data_mappings, chunk_size)
    129                 raise StopIteration
--> 130     raise StopIteration
    131 

StopIteration: 

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
<ipython-input-11-bce568193587> in <module>
      1 train_eval_test_split = [0.7, 0.2, 0.1]
      2 full_path_raw_folders = [os.path.join(RAW_DATA_DIR, f) for f in DATA_FOLDERS]
----> 3 Cooking.cook(full_path_raw_folders, COOKED_DATA_DIR, train_eval_test_split)

G:\Autonomous\Tutorials\AirSimE2EDeepLearning\Cooking.py in cook(folders, output_directory, train_eval_test_split)
    196         for i in range(0, len(split_mappings), 1):
    197             print('Processing {0}...'.format(output_files[i]))
--> 198             saveH5pyData(split_mappings[i], output_files[i])
    199             print('Finished saving {0}.'.format(output_files[i]))

G:\Autonomous\Tutorials\AirSimE2EDeepLearning\Cooking.py in saveH5pyData(data_mappings, target_file_path)
    162         dset_previous_state[:] = previous_state_chunk
    163 
--> 164         for image_names_chunk, label_chunk, previous_state_chunk in gen:
    165             image_chunk = np.asarray(readImagesFromPath(image_names_chunk))
    166 

RuntimeError: generator raised StopIteration

Experiment/Environment details

mitchellspryn commented 5 years ago

This should fix it.

TL;DR: this is a change in how Python 3.7 handles generators. Previously, an uncaught StopIteration raised inside a generator did not cause the program to terminate; now it does. There are two ways to fix it:

1) Downgrade Python to a version below 3.7.
2) Wrap the generator in a try/except.
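
For reference, a minimal standalone sketch of the behaviour described above (plain Python with illustrative function names, not Cooking.py itself): a generator that ends with an explicit raise StopIteration, as the traceback shows generatorForH5py doing, next to the same generator ended with a plain return, which is the form Python 3.7+ (PEP 479) expects.

    # Standalone sketch of the Python 3.7 change (PEP 479); the names below
    # are illustrative and do not come from Cooking.py.

    def chunks_old_style(items, size):
        """Ends with an explicit StopIteration, like generatorForH5py above."""
        for i in range(0, len(items), size):
            yield items[i:i + size]
        raise StopIteration  # Python 3.7+ turns this into a RuntimeError

    def chunks_fixed(items, size):
        """The idiomatic ending: a plain return (or just falling off the end)."""
        for i in range(0, len(items), size):
            yield items[i:i + size]
        return

    if __name__ == '__main__':
        # Yields every chunk, then raises
        # "RuntimeError: generator raised StopIteration" on Python 3.7+.
        try:
            for chunk in chunks_old_style([1, 2, 3, 4, 5], 2):
                print(chunk)
        except RuntimeError as err:
            print('caught:', err)

        # No exception: the generator simply stops when it returns.
        for chunk in chunks_fixed([1, 2, 3, 4, 5], 2):
            print(chunk)

Ending the generator with return avoids the error without any change on the consuming side, and it behaves the same on Python versions before and after 3.7.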

DaHaiHuha commented 5 years ago

Facing the same issue now.

amaalo2 commented 4 years ago

More specifically, in Cooking.py the original loop at line 164 in saveH5pyData, which was

    for image_names_chunk, label_chunk, previous_state_chunk in gen:
        image_chunk = np.asarray(readImagesFromPath(image_names_chunk))

        # Resize the dataset to accommodate the next chunk of rows
        dset_images.resize(row_count + image_chunk.shape[0], axis=0)
        dset_labels.resize(row_count + label_chunk.shape[0], axis=0)
        dset_previous_state.resize(row_count + previous_state_chunk.shape[0], axis=0)
        # Write the next chunk
        dset_images[row_count:] = image_chunk
        dset_labels[row_count:] = label_chunk
        dset_previous_state[row_count:] = previous_state_chunk

        # Increment the row count
        row_count += image_chunk.shape[0]

Now becomes

    try:
        for image_names_chunk, label_chunk, previous_state_chunk in gen:
            image_chunk = np.asarray(readImagesFromPath(image_names_chunk))

            # Resize the dataset to accommodate the next chunk of rows
            dset_images.resize(row_count + image_chunk.shape[0], axis=0)
            dset_labels.resize(row_count + label_chunk.shape[0], axis=0)
            dset_previous_state.resize(row_count + previous_state_chunk.shape[0], axis=0)

            # Write the next chunk
            dset_images[row_count:] = image_chunk
            dset_labels[row_count:] = label_chunk
            dset_previous_state[row_count:] = previous_state_chunk

            # Increment the row count
            row_count += image_chunk.shape[0]
    except StopIteration:
        pass
abinezer commented 3 years ago

The above-proposed solution by @amaalo2 didn't change the error.

StopIteration                             Traceback (most recent call last)
~/Big_Data_Project/Cooking.py in generatorForH5py(data_mappings, chunk_size)
    126                 raise StopIteration
--> 127     raise StopIteration
    128 

StopIteration: 

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
<ipython-input-28-1146e5d07678> in <module>
----> 1 Cooking.cook(full_path_raw_folders, COOKED_DATA_DIR, train_eval_test_split)

~/Big_Data_Project/Cooking.py in cook(folders, output_directory, train_eval_test_split)
    196         for i in range(0, len(split_mappings), 1):
    197             print('Processing {0}...'.format(output_files[i]))
--> 198             saveH5pyData(split_mappings[i], output_files[i])
    199             print('Finished saving {0}.'.format(output_files[i]))

~/Big_Data_Project/Cooking.py in saveH5pyData(data_mappings, target_file_path)
    160     try:
    161         for image_names_chunk, label_chunk, previous_state_chunk in gen:
--> 162                 image_chunk = np.asarray(readImagesFromPath(image_names_chunk))
    163 
    164                 # Resize the dataset to accommodate the next chunk of rows

RuntimeError: generator raised StopIteration

I did add a try/except:

    try:
        for image_names_chunk, label_chunk, previous_state_chunk in gen:
            image_chunk = np.asarray(readImagesFromPath(image_names_chunk))

            # Resize the dataset to accommodate the next chunk of rows
            dset_images.resize(row_count + image_chunk.shape[0], axis=0)
            dset_labels.resize(row_count + label_chunk.shape[0], axis=0)
            dset_previous_state.resize(row_count + previous_state_chunk.shape[0], axis=0)

            # Write the next chunk
            dset_images[row_count:] = image_chunk
            dset_labels[row_count:] = label_chunk
            dset_previous_state[row_count:] = previous_state_chunk

            # Increment the row count
            row_count += image_chunk.shape[0]
    except StopIteration:
        pass

Would like to know if I'm missing something. I am using this for a project I am building at college and would like some help on getting past this error.
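
A plausible reason the wrapper above does not change anything (a standalone sketch with illustrative names follows, not Cooking.py itself): under PEP 479 the StopIteration raised inside the generator is replaced with a RuntimeError before it reaches the calling code, so an except StopIteration around the consuming loop never matches. Either the RuntimeError has to be caught at the call site, or the generator has to end with return instead of raise StopIteration.

    # Standalone sketch: on Python 3.7+, the exception that escapes a
    # generator which raised StopIteration is a RuntimeError, so catching
    # StopIteration in the caller has no effect.

    def broken_gen():
        yield 1
        yield 2
        raise StopIteration  # replaced with RuntimeError when it escapes

    rows = []
    try:
        for value in broken_gen():
            rows.append(value)
    except StopIteration:
        print('never reached on Python 3.7+')
    except RuntimeError as err:
        print('caught RuntimeError:', err, '- rows read so far:', rows)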

abinezer commented 3 years ago

For those who encounter the problem I posted above, I have found a workaround. I did some reading and found that the problem is with Python 3.7, so I created a separate environment for this project and installed Python 3.5 in it. It should work for you after that.