hIPPYflow
    Dimension reduced surrogate construction for parametric PDE maps

          ___                       ___         ___               
         /__/\        ___          /  /\       /  /\        ___   
         \  \:\      /  /\        /  /::\     /  /::\      /__/|  
          \__\:\    /  /:/       /  /:/\:\   /  /:/\:\    |  |:|  
      ___ /  /::\  /__/::\      /  /:/~/:/  /  /:/~/:/    |  |:|  
     /__/\  /:/\:\ \__\/\:\__  /__/:/ /:/  /__/:/ /:/   __|__|:|  
     \  \:\/:/__\/    \  \:\/\ \  \:\/:/   \  \:\/:/   /__/::::\  
      \  \::/          \__\::/  \  \::/     \  \::/       ~\~~\:\ 
       \  \:\          /__/:/    \  \:\      \  \:\         \  \:\
        \  \:\         \__\/      \  \:\      \  \:\         \__\/
         \__\/                     \__\/       \__\/              

              ___                       ___           ___     
             /  /\                     /  /\         /__/\    
            /  /:/_                   /  /::\       _\_ \:\   
           /  /:/ /\  ___     ___    /  /:/\:\     /__/\ \:\  
          /  /:/ /:/ /__/\   /  /\  /  /:/  \:\   _\_ \:\ \:\ 
         /__/:/ /:/  \  \:\ /  /:/ /__/:/ \__\:\ /__/\ \:\ \:\
         \  \:\/:/    \  \:\  /:/  \  \:\ /  /:/ \  \:\ \:\/:/
          \  \::/      \  \:\/:/    \  \:\  /:/   \  \:\ \::/ 
           \  \:\       \  \::/      \  \:\/:/     \  \:\/:/  
            \  \:\       \__\/        \  \::/       \  \::/   
             \__\/                     \__\/         \__\/    


Model Based Projectors:

hIPPYflow implements software infrastructure for input and output dimension reduction strategies for parametric mappings governed by PDEs. Given a parametric PDE variational problem implemented in hIPPYlib (which uses FEniCS for the finite element representation) and a PDE observable, this code automates the construction of dominant subspaces of the inputs and outputs of these mappings.

hIPPYflow implements both active subspace (AS) projection and Karhunen-Loève expansion (KLE) for input dimension reduction, and proper orthogonal decomposition (POD) for output dimension reduction.

AS computes the dominant eigenvalue-eigenvector pairs of the following operator:
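In the standard derivative-informed formulation (the notation here is generic, not taken verbatim from hIPPYflow: q(m) denotes the parameter-to-observable map, \nu the parameter distribution, and C its covariance), this operator is typically posed as the generalized eigenvalue problem

    \mathbb{E}_{m \sim \nu}\left[ \nabla q(m)^{T} \, \nabla q(m) \right] v_i = \lambda_i \, C^{-1} v_i ,

so that the dominant eigenvectors v_i span the directions in parameter space to which the observable is, on average, most sensitive.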

KLE computes the dominant eigenvalue-eigenvector pairs of the covariance operator of the parameter distribution:
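With C again denoting the covariance operator and \bar{m} the mean of the parameter distribution, this is the eigenvalue problem

    C \, \phi_i = \lambda_i \, \phi_i ,

and the truncated expansion m \approx \bar{m} + \sum_{i=1}^{r} \sqrt{\lambda_i}\, \xi_i \, \phi_i furnishes the reduced input basis.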

POD computes the dominant eigenvalue-eigenvector pairs of the expectation of the data outer-product matrix:
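In the same generic notation, POD solves

    \mathbb{E}_{m \sim \nu}\left[ q(m) \, q(m)^{T} \right] u_i = \lambda_i \, u_i ,

where the expectation is estimated from samples of the observable; the dominant eigenvectors u_i form the reduced output basis.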

These constructs also implement the generation of training data to be used in surrogate construction, as well as projection error tests that quantify how well the different projectors capture key information and help detect the "intrinsic dimensionality" of the input-to-output mappings.
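The quantity such a projection error test estimates can be sketched with plain NumPy, independent of the hIPPYflow API (all names below are illustrative, and hIPPYflow itself works with PDE-aware, e.g. mass-matrix weighted, inner products rather than the Euclidean one used here):

import numpy as np

def relative_projection_error(X, V):
    # Monte Carlo estimate of E||x - V V^T x||^2 / E||x||^2 for samples in the
    # rows of X and an orthonormal reduced basis in the columns of V.
    residual = X - (X @ V) @ V.T
    return np.sum(residual**2) / np.sum(X**2)

# Illustrative data: 500 samples of a 100-dimensional quantity with decaying
# spectrum, projected onto the first r left singular vectors of the samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 100)) @ np.diag(np.exp(-0.1 * np.arange(100)))
U, _, _ = np.linalg.svd(X.T, full_matrices=False)
for r in (5, 10, 20):
    print(r, relative_projection_error(X, U[:, :r]))

A rapid decay of this error as r grows indicates a low intrinsic dimension for the quantity being projected.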

Example Usage (reduced basis construction)

import dolfin as dl
import ufl
import numpy as np
import sys, os
sys.path.append(os.environ.get('HIPPYLIB_PATH'))
import hippylib as hp

sys.path.append(os.environ.get('HIPPYFLOW_PATH'))
import hippyflow as hf

# Set up PDE Variational Problem and observable using a function
def build_observable(mesh, **kwargs):
    # Set up the PDE problem in hIPPYlib
    rank = dl.MPI.rank(mesh.mpi_comm())         
    Vh2 = dl.FunctionSpace(mesh, 'Lagrange', 2)
    Vh1 = dl.FunctionSpace(mesh, 'Lagrange', 1)
    Vh = [Vh2, Vh1, Vh2]
    # Initialize Expressions
    f = dl.Constant(0.0)

    def u_boundary(x,on_boundary):
        return on_boundary

    u_bdr = dl.Expression("x[1]", degree=1)
    u_bdr0 = dl.Constant(0.0)
    bc = dl.DirichletBC(Vh[hp.STATE], u_bdr, u_boundary)
    bc0 = dl.DirichletBC(Vh[hp.STATE], u_bdr0, u_boundary)

    def pde_varf(u,m,p):
        return ufl.exp(m)*ufl.inner(ufl.grad(u), ufl.grad(p))*ufl.dx - f*p*ufl.dx

    pde = hp.PDEVariationalProblem(Vh, pde_varf, bc, bc0, is_fwd_linear=True)

    # Instantiate observable operator (in this case pointwise observation of the state)
    x_targets = np.linspace(0.1,0.9,10)
    y_targets = np.linspace(0.1,0.9,10)
    targets = []
    for xi in x_targets:
        for yi in y_targets:
            targets.append((xi,yi))
    targets = np.array(targets)

    B = hp.assemblePointwiseObservation(Vh[hp.STATE], targets)
    return hf.LinearStateObservable(pde,B)

# Set up mesh
ndim = 2
nx = 10
ny = 10
mesh = dl.UnitSquareMesh(nx, ny)
# Instantiate observable
observable_kwargs = {} # No kwargs given in this example
observable = build_observable(mesh,**observable_kwargs)

# Instantiate probability distribution for the parameter
prior = hp.BiLaplacian2D(observable.problem.Vh[hp.PARAMETER],gamma = 0.1, delta = 1.0)

# Instantiate Active Subspace Operator
AS = hf.ActiveSubspaceProjector(observable,prior)
# Compute and save input reduced basis to file:
AS.construct_input_subspace()

# Instantiate POD Operator to compute POD basis and training data
POD = hf.PODProjector(observable,prior)
POD.construct_subspace()
output_directory = 'location/for/training/data/'
POD.generate_training_data(output_directory)

Dimension Reduced Neural Network Strategies

Derivative Informed Projected Neural Networks (DIPNets)
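As a schematic (not the hIPPYflow or DIPNet implementation), the idea behind such projected networks is to learn the map in reduced coordinates: reduce the input with a derivative-informed basis Psi, apply a small dense network f_theta, and expand the output with a POD basis Phi, i.e. q(m) ~= Phi f_theta(Psi^T m). A minimal NumPy sketch of the forward pass, with all names illustrative:

import numpy as np

def dipnet_forward(m, Psi, Phi, weights, biases):
    # Forward pass of a projected dense network with tanh activations.
    z = Psi.T @ m                      # reduced input coordinates
    for W, b in zip(weights[:-1], biases[:-1]):
        z = np.tanh(W @ z + b)         # hidden layers act in reduced coordinates
    z = weights[-1] @ z + biases[-1]   # linear output layer -> reduced output coordinates
    return Phi @ z                     # expand back to the full observable space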

References

These publications use the hIPPYflow library.