iho-ohi / S-101-Documentation-and-FC

Repository issues of S-101 document and feature catalogue

Load/unload Algorithm #55

Open Christian-Shom opened 1 year ago

Christian-Shom commented 1 year ago

The loading and display strategy is currently explained in (complicated?) textual form in 4.7.1 and 4.7.2. There seems to be a consensus on the need for an algorithm that could be used by the OEMs and would be easier for the HOs to understand than the current long text. Discussion is required on what is needed (an algorithm or meta code?)

Loading algorithm-2.docx

MikusRL commented 1 year ago

Just a comment on terms: is SENC still retained as a term in S-100 ECDIS, or should ENDS be used? Or is it SENC for any loaded S-57 ENCs, and ENDS for loaded S-101 ENCs? I was under the impression that in the newest ECDIS documentation proposals to IMO, the term ENDS is now proposed in place of SENC for use in S-100 ECDIS. Please correct me if I am wrong.

plebihan-geomod commented 1 year ago

Dear Mikus, I agree that "SENC" is not the right word, taking into account the new terms used in S-100 ECDIS. To my understanding, there are two new terms in the S-100 ECDIS documentation (== NCSR 9-WP.6 - ECDIS_PS).

Christian-Shom commented 1 year ago

Here is Holger's contribution for an algorithm in the form of meta code. Loading algorithm_Holger_SevenCs.pdf

plebihan-geomod commented 1 year ago

Dear Holger,

I just need one clarification in order to be sure I understand and can go further: perhaps you forgot to give the definition of the function "scaleBands(item)", which could be something like: scaleBands(item): for each data coverage of item (== a dataset), call the sub-function "GetScaleBandsForCoverage" to get the associated scale bands ...

Correct? Pol

HolgerBothien commented 1 year ago

Hi Pol, The definition of scaleBands(item) is under preconditions. It returns the set of scale bands that are associated with one inventory item. Each item consists of:

  1. The geo polygon of the coverage
  2. The set of scale bands
  3. The associated dataset.

Note that there will not be one item per dataset but one for each coverage of a dataset. This means each item is associated with exactly one dataset, but a dataset can be referenced from several items.
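To make the structure concrete, here is a minimal sketch of the inventory item as described above (one item per data coverage, each referencing its parent dataset). The class and field names are illustrative assumptions, not taken from the S-101 PS or the attached meta code:

```python
# A sketch of the "inventory item": one item per data coverage, each
# referencing exactly one dataset; a dataset may be referenced by several
# items. All names here are illustrative, not from the S-101 PS.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str                    # e.g. the dataset file name

@dataclass
class InventoryItem:
    polygon: list                # 1. the geo polygon of the coverage
    scale_bands: frozenset       # 2. the set of scale bands
    dataset: Dataset             # 3. the associated dataset

# One dataset with two coverages yields two items sharing the same Dataset:
ds = Dataset("101XX000000.000")
item_a = InventoryItem([(0, 0), (0, 1), (1, 1)], frozenset({3, 4}), ds)
item_b = InventoryItem([(1, 1), (1, 2), (2, 2)], frozenset({5}), ds)
assert item_a.dataset is item_b.dataset   # one dataset, several items
```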

plebihan-geomod commented 1 year ago

Bonjour Holger,

Sorry for the delay. Taking your last comments into account, I have begun to implement your algorithm.

First, I propose some clarifications to your documentation (see my attached proposal).

If you agree with these clarifications, I will finalize my implementation and confirm my results.

Pol

Loading.algorithm-propositions-pol.docx

HolgerBothien commented 1 year ago

Hi Pol, I agree with your comments. For the output I would rather say: 'A set of selected inventory items S'. But here the native speakers should have the last word :-)

plebihan-geomod commented 1 year ago

Bonjour Holger,

My result (to be confirmed) from a first quick implementation: this algorithm works correctly, but strictly based on the maximum and minimum display scales of an item (data coverage), and in turn an item in an overscale situation (maxDS > MSVS) will never be selected. But in some situations we need to select an item in an overscale situation to fill the viewport (e.g. a "zoom in" action within a data coverage). The opposite situation, no selected item due to an underscale situation (MSVS < all minDS), has been accepted (this point is also to be confirmed).

Are you OK with this first remark? If not, stop here; I will go back to my implementation.

If yes, here are two proposals to resolve the problem of "no selected item" in an overscale situation:

Note: I have tested only the first proposal, and it seems to be OK.

Pol

Loading.algorithm-propositions-pol-with-two-loops.docx

plebihan-geomod commented 1 year ago

Loading.algorithm-propositions-pol-with-two-loops.docx

It seems that the two-loops proposal document is not accessible, so I have attached it again.

HolgerBothien commented 1 year ago

Hi Pol, Your comments to the document are all valid.

The SelectDataSets method should not call scaleBands(item) but GetScaleBandsForCoverage. Another option is to define the method scaleBands(), which will call GetScaleBandsForCoverage.

Instead of "Output: A set of inventory selected items S" it should read "Output: A set of selected inventory items S".

We will definitely need more explanations to describe the algorithm and its sub-procedures.

I haven't understood the approach with two loops. Why do we need it if the one-loop algorithm does the job? The question of whether the algorithm works on a sorted list of items is an implementation detail and, in my opinion, not necessary for the correct definition of the algorithm itself. Looking forward to the discussion at the S100PT meeting.
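For readers following along, the one-loop idea can be sketched as below, with geometry mocked as sets of grid cells. This is an editorial illustration under assumed names, not the meta code from the attached documents:

```python
# One-loop selection sketch: walk the scale bands from the selected band
# downwards and keep picking coverages until the viewport is filled.
# Geometry is mocked as sets of grid cells; all names are illustrative.
def select_items(items, viewport_cells, bands_best_first):
    """items: list of (scale_bands, coverage_cells, name) tuples."""
    uncovered = set(viewport_cells)
    selected = []
    for band in bands_best_first:          # the single loop over scale bands
        if not uncovered:
            break                          # viewport filled: stop early
        for scale_bands, cells, name in items:
            if band in scale_bands and cells & uncovered:
                selected.append(name)
                uncovered -= cells
    return selected, uncovered

# Item A (band 5) covers half the viewport, item B (band 4) the rest:
items = [(frozenset({5}), {1, 2}, "A"), (frozenset({4}), {3, 4}, "B")]
selected, rest = select_items(items, {1, 2, 3, 4}, [5, 4, 3])
assert selected == ["A", "B"] and rest == set()
```

Note that nothing in this sketch selects an over-scaled item, so a non-empty `rest` is possible; handling that case is exactly the point under discussion.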

mikan66 commented 1 year ago

We (NIWC) have not attempted to implement this algorithm yet. However, for consideration, attached are some of our issues and concerns about the S-101 data loading algorithm (prior to this thread's version). Perhaps this algorithm solves all of our questions? Those who are implementing this algorithm may consider testing scenarios from the attachment. S-101_Data_Loading.pdf

plebihan-geomod commented 1 year ago

Bonjour Holger,

Sorry for the delay.

"The approach with two loops I haven't understood. Why do we need it if the one loop algorithm does the job."

Agreed: I realize my two-loops proposal is wrong. I had merged several implementations, sorry for that. Your one-loop algorithm seems to be good (see my presentation). I also propose to give up the first proposal with ordered items, which could lead to something more efficient (== avoiding consulting the data coverage list for each scale band SB), but I don't want to introduce more confusion with that.

So we restart the test process with the document "Loading.algorithm-propositions-pol" as the reference. I will produce a separate and complete implementation as soon as possible.

Pol

ProposalDescriptionAlgorithm-Holger.pdf

plebihan-geomod commented 1 year ago

Mikan,

Here are my comments on the issues you raised in your document.

S101-Data-Loading-Pol.pdf

Pol

DavidGrant-NIWC commented 1 year ago

> Agree about the description of the issue. But to my mind it is not an issue due to the S-101 specification, == the issue already exists in the S-57/S-52 specifications when two datasets at the boundary do not have the same navigational purpose. In the S-100 world you just replace the navigational purpose list with the maximum display scale list.

But S-101 datasets can have multiple scales (multiple DataCoverage features), whereas S-57 datasets can only have a single navigational purpose.

> In any case, Producers must harmonize their datasets at the boundary (== produce consistent datasets at the boundary, == with the same "maximum display scale").

This implies that all data producers will have harmonized cataloguing/tiling schemes. This seems unrealistic. Also, as noted above, datasets can have more than one scale.

mikan66 commented 1 year ago

Hi Pol, I appreciate you taking the time to consider our concerns. I defer detailed comments to my colleague Dave Grant. However, as I pointed out in the Wellington meeting, I think there is a chance to harmonize ECDIS display presentations with S-101 to avoid continuation of problems from the past (S-57). As you stated: "In S57 world , it depends of the manufacturer's implementation".

plebihan-geomod commented 1 year ago

Mikan and David

About David's remark: "But S-101 datasets can have multiple scales (multiple DataCoverage features), whereas S-57 datasets can only have a single navigational purpose."

I will try below to set out my view that there is no fundamental difference between the ENC/S-57 and ENC/S-101 specifications for dealing with the two following separate processes:

1. **Select DataSets Process**
    ◦ In S-57, you have to take into account the main dataset with its meta attribute compilationScale, and potentially its insets as meta features M_CSCL.
    ◦ In S-100, you have to take into account the main data coverage with its maximum/minimum display scale attributes, and potentially its insets defined as sub data coverages. => Holger's loading algorithm document.

2. **Render Selected DataSets Process**
    ◦ In S-57, you have to render the selected dataset and its insets (M_CSCL) during the rendering of the associated navigational purpose.
    ◦ In S-100, you have to render the selected dataset and its insets (data coverages) during the rendering of the associated maximum display scale of the dataset. The associated maximum display scale for a dataset is the maximum of the maximum display scales of its data coverages, and in turn the dataset will be rendered in one step (no separate rendering process for each data coverage).

== for the render process: in S-100, maximum display scale (15 values) plays the role of S-57's navigational purpose (6 values).

Pol
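Pol's dataset-level rule can be stated in a couple of lines. This is an editorial sketch using scale denominators (a larger scale has a smaller denominator), not wording from the PS:

```python
# The rendering scale of a dataset is the maximum of the maximum display
# scales of its data coverages. With scales held as denominators, the
# maximum scale corresponds to the minimum denominator.
def dataset_max_display_scale(coverage_max_denominators):
    return min(coverage_max_denominators)

# A dataset with coverages at 1:22k and 1:8k is rendered in the 1:8k step:
assert dataset_max_display_scale([22000, 8000]) == 8000
```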

plebihan-geomod commented 1 year ago

Holger,

I have produced a first implementation of our algorithm and confirm it runs (tests performed on simple cases). I have also added some clarifications; if you have time, please read and confirm the following document again. Note: I have changed "selectDataSets" to "selectDataCoverageSets" because the result is a set of data coverages.

Loading.algorithm-propositions-pol-2.docx

The next step is to go on to the rendering process with the output of "selectDataCoverageSets". I guess we have to make clear in the description of this rendering process that we have to render "datasets", not "data coverages".

Pol

HolgerBothien commented 1 year ago

Hi Pol, thank you for proving the concept. I confirm the changes and will add the context to the S-101 PS Annex D. Jeff has prepared a document with an older version of the algorithm and I will replace it with your version. I will remove only the note: "Note that the minScale must be inferior or equal to scale". It is not adding value, and I believe it should read "less than or equal". Nevertheless, I will remove it. Thank you again.

DavidGrant-NIWC commented 1 year ago

> The associated maximum display scale for a dataset is the maximum of the maximum display scale of data coverages [...] == for the render process : in S100, maximum display scale (15 values) play the role of S57's navigation purpose (6 values).

Two datasets (D1, D2) each have a single dataCoverage of scale 1:22k. They should render side-by-side.

Now add a second dataCoverage feature with scale of 1:8k to dataset D2 (an inset). The two datasets will no longer render side-by-side; D1 will render in its entirety, then D2 will render in its entirety. Objects which extend beyond the boundary of D1 into the area of D2 will be obscured.

This does not happen in S-57/S-52 (because the navigational purpose of the cell doesn't change with multiple M_CSCL). My comments to the relevant section of the S-101 PS attempted to address this issue.
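The obscuring effect David describes can be seen in a toy painter. Painting is mocked here as writing labels into screen cells; every name is illustrative, not from any specification:

```python
# Toy painter: later layers overwrite earlier ones, like a real renderer.
def paint(screen, layers):
    """layers: iterable of (cells, label) painted in order (later wins)."""
    for cells, label in layers:
        for c in cells:
            screen[c] = label
    return screen

# D1 owns cell 1 and has a symbol spilling into cell 2; D2 owns cell 2.
# Whole-dataset order: D1 in its entirety, then D2 -> the spill is obscured.
by_dataset = paint({}, [
    ({1}, "D1-area"), ({1, 2}, "D1-symbol"),
    ({2}, "D2-area"),
])
# Priority-layer order (side-by-side): all areas first, then all symbols.
by_layer = paint({}, [
    ({1}, "D1-area"), ({2}, "D2-area"),
    ({1, 2}, "D1-symbol"),
])
assert by_dataset[2] == "D2-area"     # spill obscured by D2's area fill
assert by_layer[2] == "D1-symbol"     # spill survives on top of D2's area
```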

mikan66 commented 1 year ago

Attached is our rendering of the SHOM datasets provided to us in May 2023. This does not use the 7Cs algorithm. SHOM_Dataset_NIWC_comments.pdf

Christian-Shom commented 1 year ago

Back to Dave's comment on 7 Dec 2022:

> Now add a second dataCoverage feature with scale of 1:8k to dataset D2 (an inset). The two datasets will no longer render side-by-side; D1 will render in its entirety, then D2 will render in its entirety. Objects which extend beyond the boundary of D1 into the area of D2 will be obscured.

Is there a use case for this? Datasets with the same scale range as D1 and D2 must not overlap. Hence, I don't see the issue if a second larger scale Data Coverage is added to D2.

DavidGrant-NIWC commented 1 year ago

The issue is caused by features which may extend beyond the boundary of the dataCoverage:

[image]

To ensure a seamless presentation in the case where the datasets abut, they must be drawn "side-by-side". Here is the requirement from S-52: [image]

DavidGrant-NIWC commented 1 year ago

I opened issue #71 to address the display issues. I'll try to limit my comments in this issue to the S-101 PS 1.1.0 Annex D - Dataset Loading Algorithm.

  1. Recommend using existing terminology when possible:
     a. dataCoverage rather than item
     b. minimumDisplayScale rather than minDS
     c. polygon(dataCoverage) rather than poly(item)
     d. etc.
  2. The algorithm compares scales (1:1,000) to scale denominators (1,000). Although it's apparent what's intended, recommend for consistency and ease of implementation that scale denominators are always used, or that the table be modified to show scales vice scale denominators.
  3. The Minimum Scale of scale band 1 should be shown as ∞ (or 1:∞) rather than NULL.
     [image]
  4. Although the ^ operator can be used to represent logical conjunction, it is widely used in computer programming as the exclusive-or operator. Recommend replacing ^ throughout the text with AND to avoid confusion.
  5. minDS can be NULL in algorithm scaleBands(), and can never be < maxScale[1].
     a. Recommend step 3 be modified to: If minDS is NULL
     b. Step 4.a should be updated to handle minDS = NULL.
  6. Algorithm SelectDataCoverages() selects dataCoverages (denoted as items). Noting that datasets should always be loaded (and displayed) in their entirety, recommend that either:
     • D-2 is renamed and modified to select datasets, or
     • it is noted somewhere that the dataset associated with each selected dataCoverage should be loaded.
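Recommendation 6 above (load the whole dataset behind each selected dataCoverage) might look like the following editorial sketch, with assumed names:

```python
# Map selected Data Coverage features back to their datasets, loading each
# dataset exactly once and in its entirety. Names are illustrative.
def datasets_to_load(selected_coverages):
    """selected_coverages: iterable of (coverage_id, dataset_name) pairs."""
    seen, order = set(), []
    for _coverage_id, dataset_name in selected_coverages:
        if dataset_name not in seen:      # a dataset may back several coverages
            seen.add(dataset_name)
            order.append(dataset_name)
    return order

# Two coverages of the same dataset trigger a single load:
assert datasets_to_load([("A", "101XX0001"), ("B", "101XX0001")]) == ["101XX0001"]
```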

HannuPeiponen commented 11 months ago

I looked at the GitHub issue. Based on it, I drafted an alternative solution.

[image]

This draft solution has the following benefits:

• Only DCFs within the GW are sorted.
• Sorting uses only the value of MaxDS. You ask why. The reason is to avoid symbol congestion. It is better to use a coarse version in the display instead of too-dense alternatives.
• The actual drawing is by display priority layers. You ask why. The reason is to have the same result as with the current S-57/S-52. This means mixing DCFs within the same scale range for drawing of the GW. Remember the NIWC-reported problem of cut text, cut light sectors, etc., caused by the cut of drawing at the edge of each individual DCF.
• The actual drawing is also optimized to not draw already-drawn areas within the GW.
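A rough sketch of the drafted steps, with geometry again mocked as cell sets. The sort direction (coarsest MaxDS first) follows the stated preference for the coarse version; all names are assumptions, not wording from the draft:

```python
# Sketch of the alternative: sort only the Data Coverage features (DCFs)
# inside the graphic window (GW) by MaxDS, coarsest first, then track
# already-drawn cells so no GW area is painted twice. Names illustrative.
def draw_order(dcfs_in_gw):
    """dcfs_in_gw: list of (max_ds_denominator, coverage_cells, name)."""
    # Coarser scale == larger denominator first, to avoid symbol congestion.
    ordered = sorted(dcfs_in_gw, key=lambda d: d[0], reverse=True)
    drawn, plan = set(), []
    for _denom, cells, name in ordered:
        fresh = cells - drawn              # skip already-drawn areas in GW
        if fresh:
            plan.append((name, fresh))
            drawn |= fresh
    return plan

# A coarse DCF (1:90k) and a fine DCF (1:22k) overlapping on cells 2-3:
plan = draw_order([(22000, {2, 3, 4}, "fine"), (90000, {1, 2, 3}, "coarse")])
assert [name for name, _ in plan] == ["coarse", "fine"]
```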

Christian-Shom commented 10 months ago

For Sub-Group meeting 2023_09_07:

DavidGrant-NIWC commented 10 months ago

The upcoming release of the ShoreECDIS component of our testbed implements the loading strategy currently documented in S-101 (Holger's algorithm above). The testbed will be available for download from the IHO registry: IHO Registry Help and Guidance Repository [image]

DavidGrant-NIWC commented 10 months ago

@HolgerBothien please check these edits and modify as desired. If you plan to attend the upcoming meeting, could you submit the change proposal?

S-101 Dataset Loading Algorithm.docx S-101 Dataset Loading Algorithm.pdf

alvarosanuy commented 10 months ago

Australia is against the current behaviour of the loading algorithm, where preference is given to loading over-scaled ENCs instead of under-scaled ones.

Well-established cartographic practices argue against this approach for a number of reasons, the main ones being:

Producers have to focus on implementing effective Scale Minimum (SCAMIN) practices.

This behaviour is currently required by S-101 PS (4.6) in the yellow section, although it seems to conflict with the statement in the 3rd paragraph: [image]

RichardCoyles commented 9 months ago

The UK is also against the current behaviour of the loading algorithm, if preference is given to loading over-scaled ENCs instead of under-scaled ones, for the reasons outlined by Alvaro (above).

HannuPeiponen commented 9 months ago

The focal point of the loading algorithm is a congested or non-congested result. The issue is related to the use of SCAMIN. SCAMIN should be used to remove congestion. If SCAMIN is not used at all, or not used sufficiently, the result is not usable by mariners for viewing. We all should serve our end user, the mariner.

The big dilemma compared to paper charts is that mariners can view the nautical chart at a larger or smaller scale than the "ideal/compilation" scale. Overscale does not increase congestion/clutter, while underscale does, unless SCAMIN is used. If the loading algorithm's preference is under-scaled, then data quality validation tests shall include a check of proper use of SCAMIN to avoid congestion. Congestion means, for example, no overlapping isolated danger symbols, no overlapping sounding values, no overlapping depth contour lines, etc.
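For reference, the SCAMIN rule relied on above can be sketched as follows, using the usual S-52 convention that an object is suppressed once the display scale denominator exceeds its SCAMIN value. Names are illustrative:

```python
# SCAMIN visibility sketch: an object with no SCAMIN is always shown; an
# object with SCAMIN is suppressed at display scales smaller than SCAMIN
# (i.e. once the display scale denominator exceeds the SCAMIN value).
def visible(scamin, display_scale_denominator):
    if scamin is None:                 # no SCAMIN encoded: always shown
        return True
    return display_scale_denominator <= scamin

# A sounding with SCAMIN 45 000 survives at 1:22 000 but not at 1:90 000:
assert visible(45000, 22000) and not visible(45000, 90000)
```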

DavidGrant-NIWC commented 9 months ago

Modified algorithm: try to fill the screen using data from the selected scale band, then from the "better" scale band adjacent to the selected band, then from each smaller scale band. This is not what is being proposed ("prefer loading over-scaled ENCs"); the algorithm only does this sometimes. The requested algorithm would load at least nine additional large-scale charts in the bottom-right picture. Even with the restriction, the algorithm doesn't work very well.

The pictures don't reflect the fact that the screen takes much longer to fill, the data uses considerably more RAM, and this is not even close to a worst-case depiction.

| Current Algorithm | Modified Algorithm |
| --- | --- |
| [ShoreECDIS 2023-09-13T164150Z] | [ShoreECDIS 2023-09-13T164030Z] |
| [ShoreECDIS 2023-09-13T162802Z] | [ShoreECDIS 2023-09-13T163950Z] |

HannuPeiponen commented 9 months ago

Thank you, David. Your pictures clearly show what we manufacturers have tried to tell Alvaro and Richard. The result is clearly so congested that it is not usable for navigation. One could argue that with proper SCAMIN this problem could be eased, but in that case the S-101 validation tools should be intelligent enough to find cases of "too little use of SCAMIN" before these charts cause congestion when used in an ECDIS. Think about the judgement of mariners: they will be able to use either S-57 or S-101 ENC charts, as both will be available for a transition period. Those who try S-101 will find the result unusable and move back to S-57.

alvarosanuy commented 9 months ago

We should probably select an example where SCAMIN has been used... In this example, it looks like not one object has SCAMIN encoded! Something for the RENCs to add to the list of important things they should look at before releasing products to the market. Production software does offer automated SCAMIN processing (based on standardised IHO guidance in the DCEG)!

Zooming out should not cause a larger scale product to display. Is this not the case in the 2nd example (inland waterway)?

RichardCoyles commented 9 months ago

It's difficult to see the settings that were used to produce the screenshots, and so to comment, but I think we agree that in the cases shown SCAMIN should be used to control congested underscale display (it does not look like it has been applied here). I agree that quality checks for correct application of SCAMIN are crucial. I don't think the interest of the mariner is served if we offer a generalised overscale picture rather than an underscale one (with SCAMIN applied).

HolgerBothien commented 9 months ago

The use of SCAMIN, which I doubt will be done by all producers in a consistent way, will only solve part of the problem. The modification will load many additional datasets with high-resolution geometry used at a scale that is far too small for them. Not all objects can be removed by SCAMIN, and the datasets will be shown in a stamp-sized area on the screen. That doesn't help the mariner. The performance goes down, and that makes the mariner really upset. We are well on the way to killing S-100. And reading the argument that the area covered by a symbol is relevant to the safety of navigation has really horrified me. The proper solution is to produce data that has no gaps, no holes, or other artifacts. The ECDIS is not supposed to create a chart image by solving a jigsaw puzzle.

Christian-Shom commented 9 months ago

Trying to summarize (I've added a sentence in italics based on Holger's comment):

My personal opinion is that the solution is clearly in the hands of the HOs, who must deliver data that is readable at all scales at which the dataset is supposed to be displayed. This includes having a consistent ENC portfolio and scale minimum policy applied. Why not have a critical check on scale minimum? (This would not be an easy one to implement, but I think it is possible.) I'm not sure we can put the responsibility to reject a dataset on the RENCs if it is not based on a validation check... We will discuss this in Lombok.

DavidGrant-NIWC commented 9 months ago

> Data Providers (HOs) don't want to allow over zooming on a dataset because it is not safe;

Changing the loading algorithm doesn't prevent the user from zooming beyond the intended usage of even the best-scale dataset. The issue under discussion is an effect which occurs mainly when the mariner zooms out, or when the best scale data has a small scale (coastal/general) but doesn't fill the screen. In these cases, the surrounding data is just helping to provide situational awareness.

HolgerBothien commented 9 months ago

S-101 ENC_Product_Specification_Edition 1.2.0.20230915_hb.docx

HolgerBothien commented 9 months ago

I have modified the loading algorithm (only the names of variables, to improve readability). And I have made a sketch of a display algorithm. Please review; it is intended as a starting point for the discussion.

DavidGrant-NIWC commented 9 months ago

Please see my comments and redlines previously posted: https://github.com/iho-ohi/S-101-Documentation-and-FC/issues/55#issuecomment-1710904724

I think the only significant difference between your redlines and mine is that I modified the algorithm to always use denominators rather than comparing scales to scale denominators. I'd also recommend changing dataCoverage to Data Coverage for consistency with the rest of the document.

Data display algorithm should be addressed in #71

JeffWootton commented 8 months ago

I have attempted to amend the algorithm in accordance with Dave's and Holger's suggestions:

S-101 Main Document_ANNEX D 20231107.docx

It is a little confusing in regard to reconciling the scales and scale denominators. Any feedback welcome, particularly from Dave and Holger.

DavidGrant-NIWC commented 4 months ago

@JeffWootton I missed your previous post.

The algorithm should not have been modified to use optimumDisplayScale. This change causes datasets to be loaded only when the display scale is between the min and optimum scales. The data will not be loaded when the display scale is between the maximum and optimum scale of the dataset.

Data is intended to be used (loaded) between the min and max scales. The optimum scale should primarily be used for the overscale indication (e.g., "X1" when display scale equals the optimum scale), whereas the max scale affects data loading and is also used to determine when to add "jail bars".

An example dataset with multiple data coverage features:

| DataCoverage | min | optimum | max |
| --- | --- | --- | --- |
| A | 180k | 90k | 45k |
| B | 180k | 45k | 22k |

This dataset will be loaded when the display scale is between 22k-180k. It can also be loaded outside of this range to fill in what would otherwise be empty areas of the screen.

At a display scale of 90k, the overscale indication will show "X1" for the area of A, and "X0.5" for the area of B.

At a display scale of 22k, "jail bars" will be shown on A (provided that it was used to fill in an otherwise empty area of the screen), but not on B.
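The example above can be checked numerically. This sketch assumes scales are handled as denominators, with the overscale factor based on optimumDisplayScale and "jail bars" based on maximumDisplayScale, as described in the comment:

```python
# Overscale factor relative to the optimum scale; jail bars appear once the
# display scale exceeds the coverage's maximum. Denominators throughout;
# function names are illustrative, not from the S-101 PS.
def overscale_factor(optimum_denom, display_denom):
    return optimum_denom / display_denom

def jail_bars(max_denom, display_denom):
    # Display scale larger than the maximum scale of the coverage.
    return display_denom < max_denom

# Coverage A (optimum 1:90k) and B (optimum 1:45k) at a display scale of 1:90k:
assert overscale_factor(90000, 90000) == 1.0   # the "X1" case for A
assert overscale_factor(45000, 90000) == 0.5   # the "X0.5" case for B
# At 1:22k, A (max 1:45k) gets jail bars, B (max 1:22k) does not:
assert jail_bars(45000, 22000) and not jail_bars(22000, 22000)
```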

HannuPeiponen commented 4 months ago

I agree with David. His description is also my view.

However, one small detail in David's description should be changed: there should be no overscale indication if the overscale factor is 1.0 or smaller (i.e., never show "X1" or "X0.5"). I mean that the overscale indication should be limited to real overscale, starting from X1.1.

DavidGrant-NIWC commented 4 months ago

> there should be no overscale indication if the overscale factor is 1.0 or smaller (i.e. never show "X1" or "X.5").

Agreed, I was just indicating that the basis for the calculation is the optimumDisplayScale.

Christian-Shom commented 4 months ago

After the S-101PT12 session 2 meeting on 7 March 2024 and discussions on paper https://iho.int/uploads/user/Services%20and%20Standards/S-100WG/S-101PT12/S-101PT12_2024_06.23_EN_Optimum_Display_Scale_V3.pdf, I suggest:

A new issue could be opened to bring comments on the Loading Strategy as defined in S-101 Ed.1.3.0 through testing of NIWC ShoreECDIS 1.6.0.0.

benhazelgrove commented 1 month ago

AHO would like the group to consider the following:

• Larger scale ENC cell: MinDS = 1500000
• Smaller scale ENC cell: ODS = 1500000

This encoding is to be allowed, and the loading/unloading modified to "drop" the larger scale cell at MinDS - 1 (1499999). This is to allow the cell to be displayed out to the viewing scale of the next smallest product available, rather than defining MinDS = 700000 and allowing an overscaled product to take over the screen when some ECDISs allow the MSVS to be between 700000 and 1500000. This is achieved by "rolling" the mouse on the ECDIS screen. MaxDS is only used when a larger scale cell is not available.
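One reading of the proposed handover, sketched with scale denominators. The exact boundary (inclusive or exclusive at MinDS) is an editorial assumption to be confirmed by the group:

```python
# AHO's proposal as read here: keep the larger scale cell only out to
# MinDS - 1, so that at 1:1,500,000 the smaller scale cell (ODS =
# 1,500,000) takes over the display. Names are illustrative.
def larger_scale_cell_shown(msvs_denominator, min_ds_denominator=1_500_000):
    return msvs_denominator <= min_ds_denominator - 1

assert larger_scale_cell_shown(1_499_999)       # displayed out to MinDS - 1
assert not larger_scale_cell_shown(1_500_000)   # dropped; smaller cell shows
```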