
MCV-M5 : Scene Understanding for Autonomous Vehicles

This is the PreDeeptor (Team 8) repository for the M5 project. Here you can find the source code, the documents, the deliverables, the instructions to run the code for each week, and some references we used for the project.

Abstract

Convolutional Neural Networks are currently a very active research topic, and autonomous driving is a pressing concern for society. This project focuses on the implementation and evaluation of deep Convolutional Neural Networks for object recognition, object detection and semantic segmentation on traffic images.

Contributors

We are PreDeeptor:

Documents

Development

Week 1. Project presentation

Instructions to run the code

There's no implemented code this week.

Week 2. Object recognition

Code explained

Starting from the original repository, we worked only with the config files and added two models: ResNet and DenseNet.

ResNet

We followed the original paper.
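
As an illustration, the sketch below shows the kind of basic residual block described in the paper, written with the Keras functional API. The function name, filter sizes and layer arguments are illustrative assumptions, not the exact code of our models implementation.

    # Minimal sketch of a basic residual block (assumed parameters, not our exact code).
    from keras.layers import Conv2D, BatchNormalization, Activation, add
    from keras import backend as K

    def residual_block(x, filters, stride=1):
        """Two 3x3 convolutions with an identity (or projected) shortcut."""
        shortcut = x

        y = Conv2D(filters, (3, 3), strides=stride, padding='same')(x)
        y = BatchNormalization()(y)
        y = Activation('relu')(y)

        y = Conv2D(filters, (3, 3), padding='same')(y)
        y = BatchNormalization()(y)

        # Project the shortcut when the spatial size or channel count changes.
        if stride != 1 or K.int_shape(x)[-1] != filters:
            shortcut = Conv2D(filters, (1, 1), strides=stride, padding='same')(x)
            shortcut = BatchNormalization()(shortcut)

        y = add([y, shortcut])
        return Activation('relu')(y)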

DenseNet

We followed the original paper. The implementations by tdeboissiere, robertomest and titu1994 guided ours. We also added the bottleneck and compression variants introduced in the papers.
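
The sketch below illustrates a dense block with the bottleneck and compression options, again with the Keras functional API; the function names, growth rate and compression factor are assumptions for the example, not our exact implementation.

    # Minimal sketch of a DenseNet-BC dense block and transition layer (assumed parameters).
    from keras.layers import (Conv2D, BatchNormalization, Activation,
                              AveragePooling2D, Concatenate)
    from keras import backend as K

    def conv_block(x, growth_rate, bottleneck=True):
        """BN-ReLU-Conv; the bottleneck first reduces to 4 * growth_rate maps with a 1x1 conv."""
        if bottleneck:
            x = BatchNormalization()(x)
            x = Activation('relu')(x)
            x = Conv2D(4 * growth_rate, (1, 1), padding='same')(x)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        return Conv2D(growth_rate, (3, 3), padding='same')(x)

    def dense_block(x, n_layers, growth_rate, bottleneck=True):
        """Each layer receives the concatenation of all previous feature maps."""
        for _ in range(n_layers):
            y = conv_block(x, growth_rate, bottleneck)
            x = Concatenate()([x, y])
        return x

    def transition(x, compression=0.5):
        """Compression: the 1x1 conv keeps only a fraction of the feature maps."""
        n_filters = int(K.int_shape(x)[-1] * compression)
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        x = Conv2D(n_filters, (1, 1), padding='same')(x)
        return AveragePooling2D((2, 2), strides=(2, 2))(x)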

Achievements

Instructions to run the code

To run the experiment defined by the config file experimentX in the repository's code/config folder and save the results in /home/master/folderX, assuming the datasets are in /home/master/datasets_folder:

python train.py -c config/experimentX.py -e ~/folderX -s /data/module5 -l ~/datasets_folder/

Weights

The folder linked below stores the weights of each model.

Mirror

Week 3 & 4. Object detection

Achievements

Code explained

YOLO

We modified the global contrast normalization (GCN) provided in the framework, since it was broken by the introduction of a mask array to handle void labels (used for semantic segmentation). GCN was one of the preprocessing stages used in our experiments with the YOLO architecture.
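
For reference, a minimal NumPy sketch of per-image GCN is shown below; the function name, scale and epsilon are assumptions for the example, not the framework's exact code.

    # Minimal sketch of per-image global contrast normalization (assumed parameters).
    import numpy as np

    def global_contrast_normalization(img, scale=1.0, eps=1e-8):
        """Subtract the image mean and divide by its standard deviation."""
        img = img.astype('float64')
        img -= img.mean()
        norm = max(img.std(), eps)   # avoid division by zero on flat images
        return scale * img / norm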

We also contributed to the eval_detection_fscore script, adding the preprocessing stages used during training (samplewise centering, samplewise std normalization, GCN).

SSD

Our implementation is based on the code from rykov8's repository.

Beyond some modifications to adapt the input and output bounding box formats to those used in our framework, our major contribution was to decouple the base model from the priors declaration and the construction of the prediction layers. This makes it easy to build new SSD topologies with the build_ssd() function (see models/ssd.py).
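
The sketch below illustrates the idea behind this decoupling: the base network only needs to expose a list of source feature maps, and the localization and confidence heads are attached on top of them. The helper name, its signature and the layer parameters are illustrative assumptions, not the actual API of build_ssd() in models/ssd.py.

    # Simplified sketch of attaching SSD prediction heads to arbitrary feature maps
    # (hypothetical helper, not the actual build_ssd() API).
    from keras.layers import Conv2D, Flatten, Concatenate
    from keras.models import Model

    def attach_ssd_heads(inputs, feature_maps, n_priors_per_map, n_classes):
        """Add one localization and one confidence head per source feature map."""
        loc_outputs, conf_outputs = [], []
        for fmap, n_priors in zip(feature_maps, n_priors_per_map):
            # 4 box offsets and n_classes scores per prior at every location.
            loc = Conv2D(n_priors * 4, (3, 3), padding='same')(fmap)
            conf = Conv2D(n_priors * n_classes, (3, 3), padding='same')(fmap)
            loc_outputs.append(Flatten()(loc))
            conf_outputs.append(Flatten()(conf))
        loc = Concatenate(axis=1, name='mbox_loc')(loc_outputs)
        conf = Concatenate(axis=1, name='mbox_conf')(conf_outputs)
        return Model(inputs, [loc, conf])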

As a further contribution (outside the assignment), we plan to add an SSD architecture with a ResNet base model.

Modifications to the framework

Instructions to run the code

To run the experiment defined by the config file experimentX in the repository's code/config folder and save the results in /home/master/folderX, assuming the datasets are in /home/master/datasets_folder:

python train.py -c config/experimentX.py -e ~/folderX -s /data/module5 -l ~/datasets_folder/

To evaluate the f-score of the model generated by the previous experiment:

python eval_detection_fscore.py ~/folderX/weights.hdf ~/datasets_folder

Weights

The folder linked below stores the weights of each model.

Mirror

Week 5 & 6. Object segmentation

Code explained

Starting from the original repository, we made some modifications to the framework, worked with the config files and added one model: Tiramisu.

Tiramisu

We followed the original paper. We also based our model on SimJeg's implementation, which is written in Lasagne; we ported it to Keras.

To solve some shape mismatches, we apply zero padding after the deconvolutional layers so that their outputs can be concatenated with the skip connections. The bottleneck and compression options are also implemented. We also implemented eval_dataset.py.
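
The sketch below illustrates this padding step in Keras: a transposed convolution upsamples the feature map and, when its spatial size falls short of the skip connection's, zero padding aligns them before concatenation. The function name and layer parameters are assumptions for the example (known input sizes are also assumed), not our exact code.

    # Minimal sketch of upsampling + zero padding + skip concatenation (assumed parameters).
    from keras.layers import Conv2DTranspose, ZeroPadding2D, Concatenate
    from keras import backend as K

    def transition_up(x, skip, n_filters):
        """Upsample x, pad it to the skip connection's spatial size, then concatenate."""
        x = Conv2DTranspose(n_filters, (3, 3), strides=(2, 2), padding='same')(x)
        # A size difference can appear when the downsampling path halved an odd dimension.
        dh = K.int_shape(skip)[1] - K.int_shape(x)[1]
        dw = K.int_shape(skip)[2] - K.int_shape(x)[2]
        if dh > 0 or dw > 0:
            # Zero-pad the upsampled map so its shape matches the skip connection.
            x = ZeroPadding2D(((0, max(dh, 0)), (0, max(dw, 0))))(x)
        return Concatenate()([x, skip])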

Modifications to the framework

Achievements

Instructions to run the code

To run the experiment defined by the config file experimentX in the repository's code/config folder and save the results in /home/master/folderX, assuming the datasets are in /home/master/datasets_folder:

python train.py -c config/experimentX.py -e ~/folderX -s /data/module5 -l ~/datasets_folder/

Weights

The folder linked below stores the weights of each model.

Mirror

References