malcolmw / pykonal

Travel-time calculator based on the fast-marching method solution to the Eikonal equation.
https://malcolmw.github.io/pykonal-docs/
GNU General Public License v3.0

synthetic surface waves..? #39

Closed filefolder closed 11 months ago

filefolder commented 11 months ago

Hi Malcolm,

I've been toying around with the idea of using pykonal to emulate a surface wave "ray path" by artificially restricting the depth of the model until the travel time matches those observed via ambient noise analysis (one raypath per discrete period). The ultimate idea is to collect these for use in, say, pyvorotomo for a proper 3-D joint inversion.

The setup and pre-processing will take a lot of time, but it seems... doable? So far I have been able to match pykonal-predicted times with generalized interstation distances / phase velocities reasonably well, which gives me some hope. It appears you may have given some thought to this way back when; I'm wondering if you have any future plans, ideas, or comments (or warnings!) as to its feasibility nowadays?

FWIW, here are the helper functions I've come up with thus far; I'm sure they could be optimized. Perhaps they're of use to other people for other reasons?

def trim_model(velocity_model, maxdepth=10):
    # Get info from the old model.
    maxgeocoords = sph2geo(velocity_model.max_coords)  # the surface (lat, lon, z)
    mingeocoords = sph2geo(velocity_model.min_coords)  # at depth
    surface_depth = maxgeocoords[2]
    mincoords = velocity_model.min_coords  # spherical coordinates are (z/rho, theta, phi)
    nz = velocity_model.npts[0]
    dz = (mingeocoords[2] - maxgeocoords[2]) / nz  # depth spacing in km
    prev_size = velocity_model.npts  # (z/rho, theta, phi)

    # Get info for the new model: keep only the nodes shallower than maxdepth.
    newnz = round((maxdepth - surface_depth) / dz)
    new_size = prev_size.copy()
    new_size[0] = min(prev_size[0], newnz)

    new_intervals = velocity_model.node_intervals.copy()
    new_intervals[0] = (newnz * dz) / (new_size[0] - 1)  # issue if new_size[0] is 1

    new_min_coords = mincoords.copy()
    new_min_coords[0] = 6371 - dz * newnz - maxgeocoords[2]  # assumes an Earth radius of 6371 km

    # Create the new model from the shallowest newnz layers.
    new_velocity_model = pykonal.fields.ScalarField3D(coord_sys="spherical")
    new_velocity_model.min_coords = new_min_coords
    new_velocity_model.npts = new_size
    new_velocity_model.node_intervals = new_intervals
    new_velocity_model.values = velocity_model.values[-newnz:, :, :]

    return new_velocity_model

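As a side note on choosing `maxdepth` per period: a common rule of thumb (not from this thread, just background) is that Rayleigh waves are most sensitive to structure near roughly one-third of a wavelength in depth. A hypothetical helper along those lines, with `maxdepth_for_period` and its parameters being my own illustrative names:

```python
def maxdepth_for_period(period, phase_velocity, depth_fraction=1.0 / 3.0):
    """Map a surface-wave period to a trimming depth using the rough
    rule of thumb that Rayleigh-wave sensitivity peaks near one-third
    of a wavelength. period in s, phase_velocity in km/s, depth in km."""
    wavelength = phase_velocity * period  # km
    return depth_fraction * wavelength

# e.g. a 10 s Rayleigh wave at ~3 km/s "samples" roughly the top 10 km
depth = maxdepth_for_period(10.0, 3.0)
```

The returned value could then be passed straight to `trim_model(velocity_model, maxdepth=...)`, one call per discrete period.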
def resample_velocity(v, mult=2):
    rho_min, theta_min, phi_min = v.min_coords
    rho_max, theta_max, phi_max = v.max_coords
    nrho, ntheta, nphi = v.npts * mult

    drho = (rho_max - rho_min) / (nrho - 1)
    rho = np.linspace(rho_min, rho_max, nrho)

    dtheta = (theta_max - theta_min) / (ntheta - 1)
    theta = np.linspace(theta_min, theta_max, ntheta)

    dphi = (phi_max - phi_min) / (nphi - 1)
    phi = np.linspace(phi_min, phi_max, nphi)

    # Stack the coordinate grids into an (nrho, ntheta, nphi, 3) array of nodes.
    rtp = np.moveaxis(
        np.stack(np.meshgrid(rho, theta, phi, indexing="ij")),
        0, -1,
    )

    resampled_data = v.resample(rtp.reshape(-1, 3)).reshape(rtp.shape[:-1])

    v_out = pykonal.fields.ScalarField3D(coord_sys="spherical")
    v_out.min_coords = rho_min, theta_min, phi_min
    v_out.node_intervals = drho, dtheta, dphi
    v_out.npts = nrho, ntheta, nphi
    v_out.values = resampled_data

    return v_out

def compute_tt_raypath(velocity_model, source_position, receiver_position, maxdepth=None):
    # Create a solver object.
    solver = pykonal.solver.PointSourceSolver(coord_sys="spherical")

    # Shrink the velocity model to a shell of thickness maxdepth.
    if maxdepth:
        velocity_model = trim_model(velocity_model, maxdepth=maxdepth)
        #zero_velocity_below_depth(velocity_model, maxdepth=maxdepth)
        velocity_model = resample_velocity(velocity_model, 2)
        # Setting the bottom layer to some small value instead of 0 seems to help stability.
        velocity_model.values[0, :, :] = 0.2

    # Incorporate the velocity model.
    # (max_coords appears not to be writeable; it is implied by
    # min_coords, node_intervals, and npts.)
    solver.velocity.min_coords = velocity_model.min_coords
    solver.velocity.node_intervals = velocity_model.node_intervals
    solver.velocity.npts = velocity_model.npts
    solver.velocity.values = velocity_model.values
    solver.src_loc = geo2sph(source_position)

    # Solve and trace.
    solver.solve()

    raypath = solver.trace_ray(geo2sph(receiver_position))  # sph2geo(raypath) for geographic
    max_raypath_depth = max(sph2geo(raypath)[:, 2])

    traveltime = solver.traveltime.value(geo2sph(receiver_position))

    # How can we get the distance along the raypath?

    return traveltime, raypath
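On the question in the last function about the distance along the raypath: one option is to convert the spherical nodes to Cartesian coordinates and sum straight-line segment lengths. A minimal sketch, assuming the raypath is an (n, 3) array of (rho, theta, phi) nodes with theta the polar angle and phi the azimuth in radians (my reading of pykonal's spherical convention):

```python
import numpy as np

def raypath_length(raypath):
    """Approximate the length of a ray path given as an (n, 3) array of
    spherical (rho, theta, phi) nodes. Converts the nodes to Cartesian
    coordinates and sums segment lengths, so this is a piecewise-linear
    approximation that improves with denser sampling."""
    rho, theta, phi = raypath[:, 0], raypath[:, 1], raypath[:, 2]
    xyz = np.column_stack((
        rho * np.sin(theta) * np.cos(phi),
        rho * np.sin(theta) * np.sin(phi),
        rho * np.cos(theta),
    ))
    return np.linalg.norm(np.diff(xyz, axis=0), axis=1).sum()
```

For example, a purely radial path from rho = 6271 to 6371 km should come out at 100 km regardless of how finely it is sampled.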
malcolmw commented 11 months ago

Hi Robert,

Nice to hear from you. I think the question you are trying to answer here is how to re-integrate the finite sensitivity kernel of surface waves into the ray tracing procedure, which assumes an infinitesimal sensitivity kernel (via the high-frequency approximation that is made to derive the eikonal equation). I am not an expert in surface-wave inversion but if you would allow me to think out loud here...

I think the trick is to convolve 2-D ray paths computed on the shell of a sphere (see here for a quick example) with finite sensitivity kernels computed using a separate tool such as this Python wrapper of surf96 or disba. In this case, one would need to maintain for each period an "actual" 3-D model of the Earth structure and an "effective" 2-D model for ray tracing. The tricky bit would be going back and forth between the two. I believe this is what Hongjian did here.
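To make the "effective 2-D model" idea concrete, here is one way it might look in code: collapse the 3-D model to a per-period 2-D map by depth-averaging with a normalized sensitivity kernel. This is only my sketch, not anything from the thread; the exponential kernel is a placeholder, and in practice the kernel would come from a dispersion code such as surf96 or disba:

```python
import numpy as np

def effective_2d_model(vs_3d, depths, period, phase_velocity=3.0):
    """Collapse a 3-D shear-velocity model of shape (nz, nlat, nlon) into a
    2-D map for one period by depth-averaging with a normalized kernel.

    The exponential decay below is only a stand-in for a real
    depth-sensitivity kernel; depths and wavelength are in km."""
    wavelength = phase_velocity * period
    kernel = np.exp(-depths / (wavelength / 3.0))  # placeholder kernel
    kernel /= kernel.sum()                         # normalize to unit weight
    # Weighted average over the depth axis -> (nlat, nlon) effective map.
    return np.tensordot(kernel, vs_3d, axes=(0, 0))
```

Ray tracing would then run on this 2-D map (one map per period), while the inversion updates the underlying 3-D model; the hard part, as noted above, is going back and forth between the two consistently.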

With that said, I don't think I quite understand the idea of varying the depth of your model to match travel times without incorporating sensitivity kernels as you have suggested above. Can you help me by elaborating some?

filefolder commented 11 months ago

Thanks for the reply!

I guess what I'm trying to do is less about replicating ambient noise tomography with full kernels, and more about adding a few extra raypaths between stations using FTAN-predicted shear velocities. Sort of a low-budget ANT, but maybe one that integrates better with body waves? In that sense, the goal is just to modulate the model depth to collect reasonably well-fitting traveltimes and raypaths to look up later.

Hongjian was actually visiting us the other week and I mentioned this scheme, but I'm not sure he was buying it. He's come up with a pretty good joint inversion using the same data and his code (which I think is via the link you posted), so it's nice to have something to test against... eventually.

Have no idea if it'll work out but I'll keep you posted. Just wondering if this is something you might have already attempted or thought about. Be curious to hear from anyone else as well!

malcolmw commented 11 months ago

Hmmm... I'm still not sure I understand the physical rationale behind varying the model depth or how to update the velocity model in this scheme but if you get some preliminary results you want to discuss that might help. I will leave this ticket open in the meantime in case anyone else comes across it and wants to chime in!

Cheers!

filefolder commented 11 months ago

I guess what I'm trying to do is come up with fake raypaths that could be inverted to yield the same results as ANT, in this example by limiting their depth and increasing their traveltime via the model. If that doesn't make sense, you're definitely not alone; it tends to depend on how people intuit the way Rayleigh waves actually "sample" the deep crust.

I'll go ahead and close this; I don't think it's going to bear fruit, but hopefully the functions will at least help someone else one day. Thanks again for your replies!

malcolmw commented 11 months ago

I see. I think I am starting to understand. It seems like the problem really arises because ray theory is unable to model the full physics of surface waves satisfactorily. Hongjian convolved ray paths with depth-sensitivity kernels, whereas you are trying to use "virtual" (or "fake") ray paths to simulate the sensitivity kernel in some sense. An interesting idea... I'll be interested to see if you take it further!