This repository contains the code and examples for a seismic data interpolation scheme that leverages the deep prior paradigm.
Fantong Kong1, Francesco Picetti2, Vincenzo Lipari2, Paolo Bestagini2, Xiaoming Tang1, and Stefano Tubaro2
1: School of Geosciences - China University of Petroleum (East), Qingdao, China
2: Dipartimento di Elettronica, Informazione e Bioingegneria - Politecnico di Milano, Italy
Irregularity and coarse spatial sampling of seismic data strongly affect the performance of processing and imaging algorithms. Therefore, interpolation is a necessary pre-processing step in most processing workflows. In this work, we propose a seismic data interpolation method based on the deep prior paradigm: an ad-hoc Convolutional Neural Network is used as a prior to solve the interpolation inverse problem, avoiding any costly and prone-to-overfitting training stage. In particular, the proposed method leverages a multi-resolution U-Net with 3D convolution kernels that exploits correlations in 3D seismic data at different scales in all directions. Numerical examples on different corrupted synthetic and field datasets show the effectiveness and promising features of the proposed approach.
The inverse problem is defined starting from the sampling equation

    y = M ⊙ x,

where `y` is the decimated data, `x` is the complete data to be recovered, and `M` is the binary sampling mask. It is solved using the deep prior paradigm by optimizing the weights θ of a CNN `f` fed with a fixed random input `z`:

    θ* = argmin_θ ‖ M ⊙ f_θ(z) − y ‖²

The estimate of the true model is then obtained as the output of the optimized network:

    x̂ = f_θ*(z)
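The deep prior optimization described above can be sketched in a few lines of PyTorch. This is a toy 2D stand-in, not the repository's code: the network `f`, the data `x_true`, the mask `m`, and all shapes and hyperparameters here are illustrative assumptions, while the actual method uses the 3D multi-resolution U-Net defined in `architectures`.

```python
# Minimal sketch of the deep prior interpolation loop (toy 2D example).
# All names (f, z, m, y) and sizes are illustrative, not from the repository.
import torch

torch.manual_seed(0)

# Smooth "complete" data x and a random binary sampling mask M.
t = torch.linspace(0, 3.14, 16)
x_true = (torch.sin(t)[:, None] * torch.cos(t)[None, :])[None, None]
m = (torch.rand_like(x_true) > 0.5).float()
y = m * x_true  # observed (decimated) data: y = M ⊙ x

# Tiny CNN standing in for the U-Net prior f_theta.
f = torch.nn.Sequential(
    torch.nn.Conv2d(1, 8, 3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(8, 1, 3, padding=1),
)
z = torch.randn_like(x_true)  # fixed random input to the network

opt = torch.optim.Adam(f.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = torch.sum((m * f(z) - y) ** 2)  # ||M ⊙ f_theta(z) − y||²
    loss.backward()
    opt.step()

x_hat = f(z).detach()  # estimate of the complete data: x̂ = f_theta*(z)
```

Note that the loss is computed only on the observed traces; the network's structural bias fills in the missing ones.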
The architecture we propose is the MultiResolution UNet:
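The key idea of a multi-resolution block can be illustrated with a hedged sketch: a chain of 3×3×3 convolutions whose intermediate outputs are concatenated, so the block mixes effective receptive fields of roughly 3, 5, and 7 samples in every direction. The channel split and layer counts below are illustrative assumptions, not the repository's actual `architectures` implementation.

```python
# Hypothetical sketch of a multi-resolution block with 3D convolutions.
# Successive 3x3x3 convs emulate 3x3x3, 5x5x5, and 7x7x7 receptive fields;
# their outputs are concatenated along the channel axis.
import torch

class MultiResBlock3d(torch.nn.Module):
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        c = out_ch // 3  # split the output channels across the three scales
        self.c1 = torch.nn.Conv3d(in_ch, c, 3, padding=1)
        self.c2 = torch.nn.Conv3d(c, c, 3, padding=1)
        self.c3 = torch.nn.Conv3d(c, out_ch - 2 * c, 3, padding=1)
        self.act = torch.nn.ReLU()

    def forward(self, x):
        a = self.act(self.c1(x))  # ~3x3x3 receptive field
        b = self.act(self.c2(a))  # ~5x5x5
        d = self.act(self.c3(b))  # ~7x7x7
        return torch.cat([a, b, d], dim=1)

block = MultiResBlock3d(1, 12)
out = block(torch.randn(1, 1, 8, 8, 8))  # shape: (1, 12, 8, 8, 8)
```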
The code mainly relies on PyTorch. You can recreate our conda environment, named `dpi` (acronym for "deep prior interpolation"), through

```
conda env create -f environment.yml
```
Then, activate it with `source activate dpi` before running any example.

NOTE: if you have initialized conda through `conda init`, use `conda activate dpi` instead.
This python project is organized as follows:

- `main.py` is the main script that actually does the interpolation
- `parameter.py` contains the run options that the main will parse as shell arguments. Check it out!
- `architectures` contains pytorch implementations of the networks and loss functions
- `data.py` contains data management utilities, such as data patch extraction
- `utils` contains some general purpose utilities

Here we report the example tests on the 3D hyperbolic data included in the paper.
```
# solve mask 1 saving the CNN weights
python main.py --imgdir ./datasets/hyperbolic3d --imgname original.npy --maskname random66_shot1.npy --datadim 3d --gain 40 --upsample linear --epochs 3000 --savemodel --outdir TL/shot1

# solve mask 2 from scratch
python main.py --imgdir ./datasets/hyperbolic3d --imgname original.npy --maskname random66_shot2.npy --datadim 3d --gain 40 --upsample linear --epochs 3000 --outdir TL/shot2_scratch

# solve mask 2 using as initial guess the CNN weights of mask 1
python main.py --imgdir ./datasets/hyperbolic3d --imgname original.npy --maskname random66_shot2.npy --datadim 3d --gain 40 --upsample linear --epochs 3000 --outdir TL/shot2_transfer --net load --netdir TL/shot1/model.pth
```
We are glad you want to try our method on your data! To minimize the effort, keep in mind that:

- the interpolated patches are reassembled into the full volume by `data.reconstruct_patches`;
- the data should be scaled to a suitable amplitude range (see the `--gain` option) for avoiding numerical errors.
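A minimal sketch of how one could prepare a data cube and a decimation mask as `.npy` files for `main.py` is shown below. The 66% decimation ratio mirrors the `random66` masks shipped with the repository, but the file names, shapes, and the assumption that the mask is constant along the time axis are only illustrative; check `parameter.py` for the exact expected inputs.

```python
# Hypothetical data-preparation sketch: shapes and file names are examples only.
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((32, 32, 64)).astype(np.float32)  # (x, y, t) cube

# Drop ~66% of the traces: the mask is 1 on kept traces and 0 on missing
# ones, and is replicated along the time axis.
keep = (rng.random(data.shape[:2]) > 0.66).astype(np.float32)
mask = np.repeat(keep[:, :, None], data.shape[2], axis=2)

np.save("original.npy", data)
np.save("random66_mask.npy", mask)
```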