Closed: tbenthompson closed this issue 2 years ago
@brendanjmeade @jploveless
@tbenthompson 1.a. - 1.c would be amazing!
Just a quick note: I think the planar projection is causing the H-matrix approximation to perform slightly worse than I would've expected. This is because the triangular elements aren't in exactly the same place each time. This isn't a problem. But, I just wanted to record this thought somewhere and this seemed like the logical place.
The difficult parts of 1a-1c are done. The remaining bits are easier but still necessary:

- Wrapping it up nicely to be used automatically.
- Computing the column norm preconditioner without access to the full original matrix.

I'm going to wait on these next steps until Brendan (and Jack?) have had a chance to look through and comment on the current state.
Is this currently done in these lines?

```python
col_norms = np.linalg.norm(Xp, axis=0)
XpP = Xp / col_norms[None, :]
```
I think that we'll have the TDE -> velocity matrix and smoothing matrices at least at some point. Can we calculate col_norms for each of these when they exist (ephemerally) and would that be enough?
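For reference, here's a minimal sketch of the column-norm preconditioner round trip on synthetic data (all names here are illustrative, not celeri's actual API): normalize the columns to unit norm, solve the scaled system, then divide the scaled solution by the norms to recover the original unknowns.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))  # stand-in for the full operator
b = rng.normal(size=50)

# Column-norm preconditioning: scale each column to unit norm.
col_norms = np.linalg.norm(X, axis=0)
Xp = X / col_norms[None, :]

# Solve the scaled least-squares system, then undo the scaling.
y, *_ = np.linalg.lstsq(Xp, b, rcond=None)
x = y / col_norms
```

For a full-column-rank system the rescaled solution `x` matches the solution of the unscaled problem; the scaling only improves the conditioning of the solve.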
building the H-matrix and the sparse solver for multiple meshes. The simple solution here involves running a for loop, but it actually might be helpful to combine all the meshes into a single mesh for the sake of efficiently constructing an H-matrix tree.
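As a sketch of the combine-the-meshes idea (the function and data layout are assumptions, not celeri's actual internals): concatenate the vertex arrays and offset each mesh's triangle indices so they point into the combined vertex array. A single combined mesh then gives the H-matrix construction one global cluster tree to work with instead of one per mesh.

```python
import numpy as np

def combine_meshes(meshes):
    """Merge a list of (points, triangles) pairs into one mesh.

    points: (n_i, 3) float array; triangles: (m_i, 3) int array of
    indices into that mesh's own points. Triangle indices are shifted
    so they index into the concatenated point array.
    """
    all_points, all_tris, offset = [], [], 0
    for points, triangles in meshes:
        all_points.append(points)
        all_tris.append(triangles + offset)
        offset += points.shape[0]
    return np.vstack(all_points), np.vstack(all_tris)
```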
@jploveless Do you think we can convert the "blocks"-style input files to "celeri"-style input files for the 3 subduction zone Japan model? That would be a good 3-mesh model that we care about running soon, and a good stress test: the single-mesh Cascadia example is an oddball because it has only one mesh, and many meshes will be the norm going forward.
I can start working on the full global block model conversion but we'll probably need one of the old meade02/03 computers to actually store all the matrices for this. Not sure but I can at least get the files formatted correctly for celeri.
Identifying a cutoff distance beyond which we don't trust the TDE calculations? At a very minimum, stations in Tibet can ignore TDEs in the Caribbean.
I agree. I'll add a field to the *_command.json file that specifies this distance, and I'll work on propagating it through to the Okada and TDE calculations.
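A minimal sketch of what that cutoff could look like once the distance is read from the command file (the function name, array shapes, and one-entry-per-station/TDE layout are all assumptions): zero out the station/TDE pairs whose separation exceeds the cutoff.

```python
import numpy as np

def apply_distance_cutoff(partials, station_coords, tde_centroids, cutoff):
    """Zero entries of a (n_stations, n_tdes) partials matrix for
    station/TDE pairs separated by more than `cutoff` (same units as
    the coordinates)."""
    # Pairwise distances between stations and TDE centroids.
    d = np.linalg.norm(
        station_coords[:, None, :] - tde_centroids[None, :, :], axis=2
    )
    out = partials.copy()
    out[d > cutoff] = 0.0
    return out
```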
> I'm going to wait on these next steps until Brendan (and Jack?) have had a chance to look through and comment on the current state.
I'll be doing this next week. I can confirm that hmat_dev.ipynb runs fantastically for me!
> I think that we'll have the TDE -> velocity matrix and smoothing matrices at least at some point. Can we calculate col_norms for each of these when they exist (ephemerally) and would that be enough?
Yes, that should work. I think we'll have to compute the columns one chunk at a time for each mesh since we only plan to store one mesh at a time of the full original matrix.
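A sketch of that chunked computation (the row-block iterable is hypothetical): accumulate per-column sums of squares over blocks so only one block is in memory at a time, then take a square root at the end. Per-mesh column blocks compose the same way, since each mesh's columns get their norms independently.

```python
import numpy as np

def chunked_col_norms(row_blocks):
    """Column norms of a tall matrix given as an iterable of row blocks,
    each of shape (m_i, n), without materializing the full matrix."""
    sq_sums = None
    for block in row_blocks:
        s = np.sum(block**2, axis=0)
        sq_sums = s if sq_sums is None else sq_sums + s
    return np.sqrt(sq_sums)
```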
> @jploveless Do you think we can convert the "blocks"-style input files to "celeri"-style input files for the 3 subduction zone Japan model? That would be a good 3 mesh model that we care about running soon.
I added Japan input files in https://github.com/brendanjmeade/celeri/commit/942b5de499f247f0ed87cda888f544b4a8af3296 as well as a quick conversion utility for meshes that are only in .mat format. Gotta run to meetings but I hope to check these files running through celeri sometime this week.
@jploveless Thanks for this! I've done a bit more here. The command file in this folder now points to the correct files, and there's a mesh_parameters.json that points to the three meshes. The calculations run somewhat. The partials get calculated (not sure they're correct) but there's an indexing error in the construction of the weighting vector. I'm betting this is because it's not handling multiple meshes correctly.
There's now a dedicated notebook for Japan, celeri_direct_dense_japan.ipynb, which was created with commit: https://github.com/brendanjmeade/celeri/commit/c94db2dddda8f526db5b926d62d7a42592dc25f0
This is a fantastic set of files to really test celeri on and I'm really excited to get the classic Japan model working here!
I'll play around with this now. It does seem like some indexing issues in moving to multiple meshes. Thanks for the cleanup of the files!
Everything is packed nicely into the celeri_solve command line interface and is still nicely debuggable in notebooks.
Just to follow up on the ideas we talked about:
The only critical step for now is 1a-1c, but I wanted to include the other three as a sketch of the opportunities that are available.