This is the implementation for the paper "Automatic Generation of Dense Non-rigid Optical Flow"
The dataset DMO is available here.
Please note the large file size: 59 GB (zipped) and 132 GB (unzipped).
The sub-directories include:
D15OM/
|---fd1
|---|---img1
|---|---|---(vid1)
|---|---img2
|---fd2
|---...
|---fd5
D15RM/
Flow/
|---fd1
|---|---(vid1)
|---fd2
|---...
|---fd5
D15OM_list.txt
D15RM_list.txt
As mentioned in the paper, D15RM and D15OM contain the same images with different textures (D15OM uses random textures), hence the Flow data are shared between the two sets. The three directories D15OM, D15RM, and Flow have similar structures, as shown above, with Flow being one level shallower. The list files contain the image-flow correspondences.
Start off by cloning the repository:
git clone https://github.com/lhoangan/arap_flow.git
cd arap_flow
export ARAP_ROOT=$PWD
The ARAP image deformation used in this repository is adapted from the implementation provided with the Opt language, which requires Terra. Download the Terra release corresponding to your system and place it in the working folder:
cd $ARAP_ROOT && \
# Download terra 2016-03-25 for Linux
wget https://github.com/terralang/terra/releases/download/release-2016-03-25/terra-Linux-x86_64-332a506.zip && \
unzip terra-Linux-x86_64-332a506.zip && \
mv terra-Linux-x86_64-332a506 $ARAP_ROOT/terra && \
rm terra-Linux-x86_64-332a506.zip
Opt expects CUDA to be at /usr/local/cuda. If you are using a different directory, update $CUDA_HOME (for Linux) or $CUDA_PATH (for Windows). The Linux command is shown below.
export CUDA_HOME=/usr/local/cuda-7.5
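To sanity-check the toolkit location before building, a small script along the following lines may help. This is a sketch only; the default /usr/local/cuda path and the bin/nvcc layout are assumptions.
import os

# Resolve the CUDA toolkit location that Opt is expected to use.
cuda_home = os.environ.get("CUDA_HOME", "/usr/local/cuda")
nvcc = os.path.join(cuda_home, "bin", "nvcc")
if os.path.isfile(nvcc):
    print("CUDA toolkit found at", cuda_home)
else:
    raise SystemExit("nvcc not found at %s; point CUDA_HOME to your CUDA install" % nvcc)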
Install clang with sudo privileges by running
sudo apt-get install clang
or, to use an Anaconda environment, by running:
conda install -c statiskit clang # v3.8.1
Download the latest version of DeepMatching and follow the provided instructions to compile it. For your convenience, we provide a download script in the ./deepmatching folder for the CPU version 1.2.2 (October 19th, 2015).
Run it using the following commands:
cd $ARAP_ROOT/deepmatching && \
chmod +x get_deepmatching.sh && \
./get_deepmatching.sh
To build, simply run make
cd $ARAP_ROOT/deepmatching/deepmatching_1.2.2_c++ && make
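Once compiled, the binary can also be driven from Python, for example when matching image pairs in batch. The sketch below is illustrative only: the binary location and the whitespace-separated output format (x1 y1 x2 y2 score per line) are assumptions based on the stock DeepMatching CLI; adjust them to your build.
import subprocess

DM_BIN = "./deepmatching_1.2.2_c++/deepmatching"  # assumed location after `make`

def run_deepmatching(img1, img2):
    """Run DeepMatching on two images and return a list of (x1, y1, x2, y2, score)."""
    out = subprocess.run([DM_BIN, img1, img2], capture_output=True, text=True, check=True)
    matches = []
    for line in out.stdout.splitlines():
        vals = line.split()
        if len(vals) >= 5:  # assumed: coordinates followed by a score
            x1, y1, x2, y2, score = map(float, vals[:5])
            matches.append((x1, y1, x2, y2, score))
    return matches

if __name__ == "__main__":
    # Hypothetical image pair; replace with your own frames.
    for m in run_deepmatching("frame_0001.png", "frame_0002.png")[:5]:
        print(m)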
Simply run make in the 3 folders API, deformation, and warping:
cd $ARAP_ROOT/ARAP/API && make
cd $ARAP_ROOT/ARAP/deformation && make
cd $ARAP_ROOT/ARAP/warping && make
Example commands for running para_gen.py:
python para_gen.py --multseg --input data/DAVIS --output data/DAVIS/test --fd 2
python para_gen.py --gpu 0 1 2 3 --input data/DAVIS/ --output data/DAVIS/fd3 --fd 3 --size 854 480 --multseg 2>&1 | tee DAVIS2.log
Flags:
If you find this implementation useful and have applied it in your research, please consider citing this paper:
@misc{LeARAP2018,
  author = {Hoang-An Le and Tushar Nimbhorkar and Thomas Mensink and Sezer Karaoglu and Anil S. Baslamisli and Theo Gevers},
  title  = {Unsupervised Generation of Optical Flow Datasets for Videos in the Wild},
  year   = {2018},
  eprint = {arXiv:1812.01946}
}
This implementation is based on several works.