TingLei-NOAA opened 7 months ago
An update on May 1, 2024:
1) The original plan was to create an MGBF-owned interpolation class, as a subclass of the central block, to perform the interpolation (and its adjoint) mapping between the MGBF filtering grids and the analysis grids. However, saber::interpolation is under active development, its interface is already in SABER, and it is expected to be available for testing within a few weeks (thanks to F. Herbert at JCSDA). The MGBF-owned interpolation has therefore been dropped, and a simplified MGBF-JEDI interface was implemented, with a few important details left for later, including the passing of data between the JEDI FieldSet and the MGBF "native grid". This version is on the private branch feature/mgbf_tl.
2) An important problem to be addressed is that Atlas support for regional domains is still under development; see https://github.com/JCSDA-internal/oops/issues/2575#issuecomment-2085124575 and https://github.com/ecmwf/atlas/discussions/190. Further exploration is ongoing, and how to proceed in collaboration with the JEDI core teams is to be discussed and decided.
Thanks to suggestions from @danholdaway, it was decided to use saber::gsi::interpolation, which already works for conversion between unstructured grids and lat/lon grids on global domains, and Dan believes it should also work for regional domains. It is therefore a good fit for what MGBF currently needs.
@TingLei-NOAA as I mentioned, the biggest thing we now need to ensure is that the domain of the model is completely contained by the domain of the background error model. We should put our heads together to figure out a generic way to do that. It should be a check in Saber somewhere.
@danholdaway Sure. That needs to be taken care of when the "mgbf grids" are created, which are now planned to be created on the fly. There are some tricky details I need to sort out when the layout of MPI ranks is predefined. I will keep the discussion going among us in various ways.
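On the containment check Dan mentioned above, here is a minimal sketch of one way a generic check could look, assuming only that both function spaces expose Atlas's lonlat() field. The function name checkDomainContainment and the bounding-box approach are my own illustration, not existing SABER code, and a plain lon/lat bounding box would need extra care near the dateline and for rotated or projected regional grids.

```cpp
#include <algorithm>
#include <stdexcept>
#include "atlas/array.h"
#include "atlas/functionspace.h"

// Hypothetical helper: verify that every model grid point lies inside the
// lon/lat bounding box of the background-error (MGBF filtering) grid.
// Illustration only; a real regional check may need a polygon test and,
// in an MPI run, a global reduction over all ranks.
void checkDomainContainment(const atlas::FunctionSpace & modelFs,
                            const atlas::FunctionSpace & mgbfFs) {
  const auto modelLonLat = atlas::array::make_view<double, 2>(modelFs.lonlat());
  const auto mgbfLonLat  = atlas::array::make_view<double, 2>(mgbfFs.lonlat());

  // Bounding box of the MGBF filtering grid
  double lonMin = 1.0e30, lonMax = -1.0e30, latMin = 1.0e30, latMax = -1.0e30;
  for (atlas::idx_t j = 0; j < mgbfLonLat.shape(0); ++j) {
    lonMin = std::min(lonMin, mgbfLonLat(j, 0));
    lonMax = std::max(lonMax, mgbfLonLat(j, 0));
    latMin = std::min(latMin, mgbfLonLat(j, 1));
    latMax = std::max(latMax, mgbfLonLat(j, 1));
  }

  // Every model point must fall inside that box
  for (atlas::idx_t j = 0; j < modelLonLat.shape(0); ++j) {
    const double lon = modelLonLat(j, 0);
    const double lat = modelLonLat(j, 1);
    if (lon < lonMin || lon > lonMax || lat < latMin || lat > latMax) {
      throw std::runtime_error("Model domain is not contained in the MGBF domain");
    }
  }
}
```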
An MGBF in JEDI/SABER using the unstructured interpolator from saber::gsi has been created and compiles successfully. A key choice is the type used for the MGBF grids, which is currently defined as:
```cpp
const atlas::StructuredGrid grid(conf);
// how about the PointCloud function space instead?
const atlas::functionspace::StructuredColumns mgbfGridFuncSpace_(grid);
```
Sanity tests are being set up. The subsequent coding/debugging through various tests will delve into the "under-the-hood" workings of Atlas and the unstructured interpolator on regional domains.
Previously, the StructuredColumns function space was used as:
```cpp
mgbfFuncSpace_ = atlas::functionspace::StructuredColumns(grid);
```
However, it was found that in the StructuredColumns function space there is no lon/lat information for the grid points: the lonlat() method returns the x/y field, as seen in atlas/functionspace/detail/StructuredColumns.h:
```cpp
Field lonlat() const override { return field_xy_; }
```
Hence, the PointCloud function space is now used and being tested.
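For reference, a minimal sketch of how a PointCloud function space can be built from the lon/lat of the structured MGBF grid, so that its lonlat() field carries true geographic coordinates. The helper name and variable names are illustrative; the exact constructor used in the MGBF interface may differ.

```cpp
#include <vector>
#include "atlas/functionspace/PointCloud.h"
#include "atlas/grid.h"
#include "atlas/util/Config.h"
#include "atlas/util/Point.h"

// Illustrative helper (not the actual MGBF interface code): collect the lon/lat
// of every point of the structured MGBF grid and build a PointCloud function
// space from them, whose lonlat() field then holds geographic coordinates.
atlas::functionspace::PointCloud makeMgbfPointCloud(const atlas::util::Config & conf) {
  const atlas::StructuredGrid grid(conf);
  std::vector<atlas::PointXY> points;
  points.reserve(grid.size());
  for (const atlas::PointLonLat & p : grid.lonlat()) {
    points.emplace_back(p.lon(), p.lat());   // store (lon, lat) per grid point
  }
  return atlas::functionspace::PointCloud(points);
}
```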
Using the atlas::functionspace PointCloud, the MGBF Dirac test produced a reasonable result, as shown below. Further verification (including tests for MPAS) and development will be carried out on top of this working version of MGBF in JEDI. This working version also confirms the efficiency and validity of the MGBF implementation strategy designed and decided by the team, including the choice of the gsi::unstructured_interpolator, as suggested by @danholdaway.
Hi Ting:
What is the blotch that we are looking at? Can you explain the context please?
Jim
@JimPurser-NOAA In this Dirac test, the variable (NO2 here, on the model grid) is assigned 1 at one point and zero at all other points. That field (vector) is then multiplied by the B matrix, and the figure above shows the resulting field. For MGBF, B is composed of the MGBF filter on a regular grid as the central block, with the interpolation (from the regular filtering grids to the model grids) and its adjoint on either side. Hope this explanation helps.
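In symbols (my notation, not from the thread): with $H$ the interpolation from the MGBF filtering grid to the model grid, $H^{T}$ its adjoint, and $C$ the MGBF smoothing on the filtering grid, the Dirac test computes

$$
\delta x \;=\; B\,\delta \;=\; H\,C\,H^{T}\,\delta ,
$$

where $\delta$ is 1 at the chosen model grid point and 0 elsewhere, so the plotted field is one column of $B$.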
Hi Ting:
Is this a univariate analysis of a single scalar quantity, then?
If so, why is the response so anisotropic -- it just looks like a squashed bug!
Thanks, Jim
Jim, yes, in this Dirac test it is just a univariate "operation". I will keep your points in mind for the more rigorous verification/validation to come. For now, the factors I think could contribute to the unsmoothed, "anisotropic" pattern include: 1) the resolution of the MGBF filtering grids; 2) the plotting in ncview, which treats the points as if they were on a regular grid while the FV3 grid is not; 3) possibly the MGBF setup in my test. With the questions you raised, which are to be answered decisively in the future, I think the results are good enough to show that MGBF in SABER/JEDI is working, and hopefully you and other colleagues will agree :).
Following the discussion with Jim, another Dirac test was run using 100 km resolution filtering grids (compared with the previous 200 km); the result is shown below:
The pattern became smoother, but some irregularities are still obvious. Next, we will look at the impact of the correlation length scales modeled in MGBF.
Verification/validation/debugging is ongoing through parameter tuning and code tweaks.
The MGBF interpolation (and its adjoint), which runs as the outer SABER chain blocks using saber::gsi::unstructured_interpolator, has passed this cycle of verification and debugging.
The self-adjoint test (invoked through SABER's generic adjoint test interface) failed, but it failed because the random vector created on the model grids contains NaN values; this is believed to be unrelated to the MGBF code, though it will be revisited in the future.
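For context, a minimal sketch of the kind of dot-product (adjoint) check such a test performs, written against plain vectors; applyInterp-style operators are passed in as stand-ins for the actual SABER block operators, and this is not SABER's implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <functional>
#include <numeric>
#include <random>
#include <vector>

// Generic dot-product test: for a linear operator H and its claimed adjoint Ht,
// <H x, y> should equal <x, Ht y> to roundoff.  NaNs in the random vectors
// (as suspected above) would make both sides NaN and the test meaningless.
bool adjointTest(const std::function<std::vector<double>(const std::vector<double>&)> & H,
                 const std::function<std::vector<double>(const std::vector<double>&)> & Ht,
                 std::size_t nIn, std::size_t nOut, double tol = 1.0e-12) {
  std::mt19937 gen(42);
  std::uniform_real_distribution<double> dist(-1.0, 1.0);
  std::vector<double> x(nIn), y(nOut);
  for (auto & v : x) v = dist(gen);
  for (auto & v : y) v = dist(gen);

  const std::vector<double> Hx  = H(x);    // size nOut
  const std::vector<double> Hty = Ht(y);   // size nIn
  const double lhs = std::inner_product(Hx.begin(), Hx.end(), y.begin(), 0.0);
  const double rhs = std::inner_product(x.begin(), x.end(), Hty.begin(), 0.0);
  return std::abs(lhs - rhs) <= tol * std::max(std::abs(lhs), std::abs(rhs));
}
```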
A series of tests of the MGBF interpolation has been conducted, which collectively verify it; please contact me for more details. Below I show only one test, in which the MGBF processing is skipped, so the Dirac test in fact consists of the interpolation adjoint (taking the Dirac vector from the model grids to the MGBF filtering grids) followed by the interpolation back to the model grids.
Below, first, are the domains of the MGBF filtering grids and of the model (as @danholdaway stressed, the former should be the larger).
Then the increment from this "Dirac" test:
Hence, for this cycle, the following verification/debugging will focus on the MGBF step (the central chain block).
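In the same (illustrative) notation as above, skipping the central block amounts to replacing $C$ by the identity, so this test computes

$$
\delta x \;=\; H\,H^{T}\,\delta ,
$$

which exercises only the interpolation and its adjoint.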
With @MiodragRancic-NOAA's help, the appropriate procedure in the MGBF library has been selected, and the MGBF in JEDI now apparently behaves "similarly" to the established MGBF behavior, as shown in the attached Dirac tests.
Further verification and calibration of the MGBF coefficients are ongoing.
An fv3-jedi 3DVar run using the MGBF background error covariance has been completed with a single NO2 observation. The results are reasonable; further investigation, including comparison with BUMP, is to be done. Tests for regional MPAS are also being worked on.
An MGBF Dirac test using two analysis variables has been done and shows reasonable results after fixing a bug in the variable passing from the JEDI fields (see the update of July 21, 2024 at https://docs.google.com/presentation/d/1Xu01QHNSgZTHrT-BaVVTEuDk8ENbZsed/edit#slide=id.g2ed53956188_0_0). An MGBF mpas-jedi 3DVar has been run, as shown below. This demonstrates that the implementation of MGBF with JEDI, which consists of the interpolation and its adjoint (using gsi::unstructured_interpolator) as the outer blocks and MGBF processing on the regular MGBF filtering grids as the central block, works well even for MPAS's unstructured grids.
Update on July 29, 2024:
1) A fairly "realistic" MGBF setup using 25 km filtering grids (on the same MGBF domain) has been used in a regional mpas-jedi 3DVar with one observation and preliminarily showed reasonable results, as below. One current focus is an efficient way to tune/change the characteristic scales of the MGBF-based background error covariances.
2) Finished a multiple-observation, multiple-analysis-variable MGBF 3DVar for regional fv3-jedi with expected behavior.
Ensemble localization using MGBF/SABER is working. The conversion between oops::fields and oops::increment after the localization, and its issue when using MGBF/SABER, have been sorted out; the corresponding code in saber/localization.h has been changed to handle the data structures currently passed from MGBF/SABER, which should also work for the original NICAS localization option. Further evaluation/tuning is ongoing. Below is the delp increment from an Ens3DVar fv3-jedi test using 20 members with MGBF (left) and bump::nicas, using one temperature observation (the MGBF localization uses a much larger "effective" cut-off radius).
An update on Aug. 26, 2024: Currently, an MGBF test used for ensemble localization in the Ens3DVar shows results very close to those using BUMP (localization length scales: 240 km / 0.3). The vertical profiles of the increment near the increment center are shown below. The remaining issue is that the MGBF test gives a larger maximum value (17 vs 15).
In the ensemble localization, the treatment of 2D variables in the MGBF procedure was sorted out and implemented: 2D variables are treated as 3D variables with the other vertical levels initialized to zero. @MiodragRancic-NOAA clarified that this is not as wasteful as it looks at first glance, because the nonzero values spread to the upper levels are also processed in the horizontal directions. Using a one-observation test (a single surface observation), it was confirmed that this 2D-variable treatment works as expected. One example is the temperature increment profile at the horizontal maximum of the increment (note that the length scales in the two directions differ between BUMP and MGBF; the focus is on their qualitative behavior).
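A minimal sketch of the zero-padding idea described above, using plain arrays; the helper name, field layout, and level ordering are illustrative, not the MGBF code.

```cpp
#include <vector>

// Promote a 2D (single-level) field to a 3D field with nLev levels:
// the data go into level 0 and all other levels start at zero, so the
// horizontal filtering still acts on the nonzero level (and on whatever
// spreads vertically during the MGBF passes).
std::vector<double> promote2dTo3d(const std::vector<double> & field2d,
                                  std::size_t nHoriz, std::size_t nLev) {
  std::vector<double> field3d(nHoriz * nLev, 0.0);  // all levels zero-initialized
  for (std::size_t i = 0; i < nHoriz; ++i) {
    field3d[i] = field2d[i];                        // level 0 holds the 2D field
  }
  return field3d;
}
```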
Using the one-observation test from GSI, the comparison uses the difference between the analysis and background files (which is not exactly the increment from the ensemble covariance and might have caused the difference between GSI and the other two). What caused the differences is being investigated.
Update: in the previous plot, the GSI results were vertically flipped because of a mistake in understanding the vertical ordering of the increment files from JEDI. The corrected figures showed a problem: fv3-jedi using BUMP/MGBF created temperature and wind increments near the model top in a pure Ens3DVar using a surface pressure observation. Further investigation sorted out the vertical layout (top-down vs bottom-up) of the fields passed into SABER and upgraded the MGBF treatment of 2D variables in ensemble localization. The temperature increment below shows the correct location of the temperature increment through the covariance between temperature and surface pressure, while the differences between GSI/BUMP/MGBF may be caused by several factors, first of all the different length scales.
Update on Oct. 7, 2024: Two new capabilities have been added and evaluated recently: the normalization step and MGBF filtering on coarser vertical levels. Preliminary tests show that they work as expected.
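For readers unfamiliar with the normalization step: one common way such a step is defined for a covariance-like operator $C$ (a generic formulation of mine; the thread does not spell out the MGBF-specific form) is

$$
\tilde{C} \;=\; N\,C\,N, \qquad N \;=\; \operatorname{diag}(C)^{-1/2},
$$

so that the Dirac response of $\tilde{C}$ has unit amplitude at the impulse point and the coefficients control only the shape and spread of the response.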
Update on Oct. 21, 2024:
1) We have several MGBF setups that give a Dirac response close to that of a 250 km cut-off radius in BUMP, as shown below.
2) Using the above setups, a set of one-observation tests (Dirac, 3DVar, Ens3DVar and hyb-3DVar) for fv3-jedi and (hyb-3DVar) for mpas-jedi has been finished.
3) Continued investigating the issue with mgbf_line=.true. and mg_proc=4 in collaboration with MGBF team members at EMC.
Update on Oct. 28, 2024: A fix in the setup of the MGBF parameters in the current mgbf_lib in JEDI allows MGBF with the line filter to behave as expected (also avoiding suspicious negative values), as shown below. Further tuning to bring the horizontal MGBF response closer to the Gaussian curves corresponding to a 250 km cut-off radius is ongoing.
A satisfying (to some extent, at least) MGBF setup has been found; its response to a Dirac impulse is shown below. If there are no further comments/suggestions from other MGBF team members, I will stick to this setup in the following MGBF tests using real data.
As part of the EMC efforts on MGBF (Multigrid Beta Function modeling of background error covariances) for regional JEDI, an MGBF interface in SABER/JEDI using the newly developed OOPS-interfaced MGBF library is being created. It is intended as a starting point/testbed for MGBF developers to work directly on JEDI applications. This issue will facilitate communication and collaboration on MGBF with JEDI. Relevant design decisions by the MGBF developers at EMC include: regular grids will be used for the filtering grids, and the mapping between these filtering grids and the analysis grids (the native model grids in JEDI) will be handled by a fairly generic Atlas-based JEDI interpolator that is being actively developed by the JEDI core team.
It is noted that the GSIBE component in SABER has been used as a good example for this work.