Closed: perolavsvendsen closed this issue 5 months ago.
Offline discussions summary:
Two-step generation of pre-processed data:

Step 1: In the template project, a script runs and exports "staged" data with partial metadata defined. Export to a suitable area, suggested: `<template>/share/preprocessed`. fmu-dataio typically runs from RMS in this context. Metadata shall contain a selection of blocks, specifically `data`, but possibly also others, such as `fmu.model` perhaps.

Step 2: In a PRE_SIMULATION hook workflow, a script runs, fetches the "staged" data and a) completes the metadata, and b) dumps it to the correct place on /scratch (`<case>/share/XX`).
In step 2a, the case-specific parts of the metadata must be inserted, e.g. `fmu.case` and others, and the `tracklog` must be appended to. It is also possible to do sanity-checks against `global_variables` in this process, particularly against `masterdata` elements etc.
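The step-2a completion described above can be sketched roughly as follows. Note that `complete_metadata` and the exact block layout are hypothetical illustrations, not the actual fmu-dataio API:

```python
import datetime


def complete_metadata(staged_meta: dict, case_meta: dict, masterdata: dict) -> dict:
    """Hypothetical step 2a: complete partial metadata from a staged export."""
    # Shallow copy; new top-level keys added here do not touch the staged dict
    meta = dict(staged_meta)

    # Insert the case-specific parts, e.g. fmu.case
    meta.setdefault("fmu", {})["case"] = case_meta

    # Append to the tracklog rather than overwriting it
    meta.setdefault("tracklog", []).append(
        {
            "datetime": datetime.datetime.utcnow().isoformat(),
            "event": "metadata completed in PRE_SIMULATION hook",
        }
    )

    # Sanity-check against global_variables, e.g. masterdata elements
    if meta.get("masterdata") and meta["masterdata"] != masterdata:
        raise ValueError("Staged masterdata differs from global_variables")

    return meta
```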
- `realization`
- `fmu.workflow` attribute to metadata
- `fmu.workflow` not required
- `data.time` tag

PO and Jan to have a mini workshop to sort out the ERT workflows etc.
Offline discussions 21 Nov 2022
Must be clear in outgoing data whether they represent a horizon or an interval. The current setup invokes a `data.top` and a `data.base` if representing an interval.

- Either both `top` and `base` are present, or none.
- `top.offset` and `base.offset`: do the same as in current practice. Currently giving e.g. "z-range" for symmetric intervals.
- Rename `seismic.offset` to `seismic.stacking_offset`.
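The "both `top` and `base`, or none" rule could be enforced with a minimal check like the one below. The dict layout mirrors the proposed `data.top` / `data.base` blocks; this is an assumption for illustration, not the fmu-dataio schema:

```python
def validate_interval(data: dict) -> None:
    """Hypothetical check: 'top' and 'base' must both be present, or neither."""
    has_top = "top" in data
    has_base = "base" in data
    if has_top != has_base:
        raise ValueError("Either both 'top' and 'base' must be present, or none")
```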
Possible API to `dataio.ExportData()`:

```python
edata = ExportData(
    name="MySpecialIntervalName",
    seismic={"attribute": "rms", "filter_size": 12.0, "stacking_offset": "0-15"},  # <-- + more stuff
    interval={
        "top": {"name": "MyName", "offset": -2.0, "stratigraphic": True},
        "base": {"name": "MyName", "offset": 6.0, "stratigraphic": True},
    },
)
```
The top/base offset may have a different unit than the actual surface. Example: a map shows a (unitless) attribute, within an interval defined in meters.
Consider changing the definition to:
```yaml
data:
  top:
    offset:
      value: 2.0
      unit: m
```
Prepare for future OSDU storage etc.
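A reader of such metadata would need to handle both the legacy scalar form (`offset: 2.0`) and the proposed `{value, unit}` form. A hedged sketch, where the function name and the default unit are assumptions:

```python
def read_offset(offset, default_unit: str = "m") -> tuple[float, str]:
    """Hypothetical reader for an offset in either legacy or proposed form."""
    if isinstance(offset, dict):
        # Proposed structure: {"value": 2.0, "unit": "m"}
        return float(offset["value"]), offset.get("unit", default_unit)
    # Legacy structure: a bare number, unit implied by the surface
    return float(offset), default_unit
```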
Data extracted from seismic are typically co-visualized with Eclipse output (zones). It is very difficult to pinpoint (in metadata) positions across the various coordinate systems (while missing a good translation between them), e.g. stratigraphy, layers, (model) zones, depth, time. For now, expect Webviz to show lists of available data to the user, and the user to choose what to visualize.
Possibly include a locally defined "internal_zone_reference"? <-- Will most likely cause more harm than good in the long run.
@jcrivenaes @ezaezaeza + Åshild
Latest fmu-dataio should be in Komodo testing now.
Quick status: Waiting for the next release of Komodo. Next step while waiting: create a hooked ERT workflow job for the second run of fmu-dataio.
Waiting for next release of Komodo. As above.
No next version of Komodo yet. The workaround is a venv detecting the current Komodo version. Challenge related to subfolders.
Feedback Åshild 2023-01-04:
Status:
Webviz-4D version now running on both JS and Drogon, getting data from Sumo. Discussions ongoing.
Need some testing with the end users. How does this behave if the user does not intend to create metadata (switch)? Need for some stress testing as part of the next release changes.
Currently: Working on export of single (simulated/produced) cubes. Still in "feedback from Åshild" mode. Also waiting for seismic upload to Sumo (`fmu.sumo.uploader`).
ERT observations: possible to upload? Existing standard (yaml).
Drogon seems OK. JS starting to implement upload of observed maps for Webviz-4D.
This has been on the block for a while, and is addressed also in other issues. Following the refactoring into 1.0.0, it is time to properly describe the requirements and start developing in this context.
Pre-processed data are data that are made up-front, prior to a case. The data are typically made directly on the template project, and stored with it. When a case is initialized, these data are ferried to the case structure, usually by a HOOK workflow or similar.
Example: Seismic observations represented by a seismic amplitude map (surface) are pre-made by a manual workflow in RMS. The results are stored with the template project (`/share/observations`). Responsibility of fmu-dataio: Export template data with incomplete metadata (missing the `fmu` block). When an FMU case is initiated, fetch these surfaces with a workflow job, complete the metadata (add the `fmu` block), and store them to the correct location in the case structure.

(edited post offline discussion Aug 25 2022)
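The ferrying step in the example could look roughly like this. The sidecar format (JSON), file suffixes, and the function name are assumptions for illustration, not the actual fmu-dataio layout:

```python
import json
import shutil
from pathlib import Path


def ferry_preprocessed(staged_dir: Path, case_dir: Path, fmu_block: dict) -> int:
    """Hypothetical HOOK-side job: copy staged files into the case structure
    and complete their sidecar metadata with the missing fmu block."""
    case_dir.mkdir(parents=True, exist_ok=True)
    copied = 0
    for meta_path in staged_dir.glob("*.json"):  # sidecar metadata, JSON for illustration
        meta = json.loads(meta_path.read_text())
        meta["fmu"] = fmu_block  # add the missing fmu block
        (case_dir / meta_path.name).write_text(json.dumps(meta))
        # Assumed: the surface file sits next to its sidecar, same stem
        data_path = meta_path.with_suffix(".gri")
        if data_path.exists():
            shutil.copy2(data_path, case_dir / data_path.name)
        copied += 1
    return copied
```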