openego / data_processing

(geo)data processing, database setup, data validation
GNU Affero General Public License v3.0

Use OSM data for RES positioning #311

Open nesnoj opened 6 years ago

nesnoj commented 6 years ago

(Preface: this is not an issue to be solved in eGo, but it is related to the developed methods.)

There's some room for improvement in the current RES positioning; in particular, the raster-based methods (WEC, low-voltage PV) are inaccurate. Therefore, I'd like to use OSM data to achieve better results. A short description of the current methods can be found in this paper (p 2ff).

Considered types

OSM data

Approach

Use OSM data where available and the current methods where data is missing or insufficient.

Steps

  1. Evaluate OSM coverage (latest OSM dataset) for the above types to see if it's worth the effort
  2. Alter the current scripts for RES positioning
  3. Validate
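To make step 1 concrete, here is a minimal sketch of how OSM coverage per MV grid district could be scored. All district ids, counts, and the threshold are illustrative, not taken from the eGoDP schema:

```python
# Minimal sketch: estimate OSM coverage per MV grid district (MVGD).
# District ids, counts, and the 0.5 threshold are hypothetical examples.

def osm_coverage(registered, osm_mapped):
    """Return the share of registered RES units per district that are
    also present in OSM; districts with no registered units get 0.0."""
    return {
        district: osm_mapped.get(district, 0) / count if count else 0.0
        for district, count in registered.items()
    }

# Registered units per (hypothetical) MVGD id, e.g. from a RES register:
registered = {"mvgd_101": 40, "mvgd_102": 10, "mvgd_103": 0}
# Units found in an OSM extract (e.g. tagged as generators):
osm_mapped = {"mvgd_101": 30, "mvgd_103": 2}

coverage = osm_coverage(registered, osm_mapped)
# Districts below the threshold would fall back to the raster-based method.
fallback = [d for d, share in coverage.items() if share < 0.5]
```

A per-district score like this would directly answer "is it worth the effort": if most districts land in `fallback`, the OSM-based approach buys little.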

Questions

PS: The MaStR may be released soon, but the start has been postponed...and postponed...and postponed...so not sure if we'll ever get it.

Ludee commented 6 years ago

How close should this be to the eGoDP REA? I mean, does it use the MV GDs and allocate for each of them? As far as I know there is not much data on LV PV in OSM.

nesnoj commented 6 years ago

Yes, I'd definitely use the MVGDs since every RES requires a grid connection to HV anyway and I think the developed approach is good. Concerning LV PV: You are right, the focus is on MV WEC+PV. LV PV might not be touched.

Different from the first post's method, a more recent RES data source could be used:

  1. netztransparenz.de -> initial data, locations are given by addresses :(
  2. BNetzA list -> updates since new ARegV (2014), locations: UTM coords :) Caution: For some reason, the recent version seems to lack the geo positions (the 12/2017 version can be used instead)

So, we have 2 possibilities:

  1. Update the eGo dataset using data from dataset 2 only.
  2. Create a completely new dataset from datasets 1 and 2 in the data processing.

In both cases, OSM points which are not covered by dataset 2 (which holds UTM coords) are used to redistribute older RES units without coordinates.
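Possibility 1 plus the redistribution step can be sketched as follows. The unit ids, coordinates, and field layout are hypothetical; the sketch only shows the precedence logic (dataset 2 coordinates win, leftover OSM points fill the gaps):

```python
# Sketch of option 1: update existing eGo RES records with BNetzA
# coordinates where available; units that still lack coordinates are
# assigned to OSM points not already claimed by a BNetzA match.
# All unit ids and coordinates below are made up for illustration.

def update_positions(ego_units, bnetza_coords, osm_points):
    """ego_units: {unit_id: coord or None}; bnetza_coords: {unit_id: coord};
    osm_points: list of coords from OSM. Returns an updated mapping."""
    updated = {}
    claimed = set()
    for uid, coord in ego_units.items():
        if uid in bnetza_coords:           # dataset 2 wins where it has coords
            updated[uid] = bnetza_coords[uid]
            claimed.add(bnetza_coords[uid])
        else:
            updated[uid] = coord
    # Redistribute units still lacking coordinates onto unclaimed OSM points.
    free = [p for p in osm_points if p not in claimed]
    for uid, coord in updated.items():
        if coord is None and free:
            updated[uid] = free.pop(0)
    return updated

ego = {"u1": None, "u2": (51.0, 9.0), "u3": None}
bnetza = {"u2": (51.1, 9.1)}
osm = [(51.1, 9.1), (52.0, 10.0), (52.5, 10.5)]
result = update_positions(ego, bnetza, osm)
```

A real implementation would of course match OSM points by proximity or capacity within the MVGD rather than in list order; this only illustrates the two-tier precedence.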

Alternatively: The OPSD project is about to release a new dataset based on those data, but I know neither the spatial resolution (will they geocode the addresses?) nor whether an update is already scheduled. @wolfbunke, any updates here?

nesnoj commented 6 years ago

@ulfmueller @wolfbunke @IlkaCu We're planning this RES update for another project. According to my information, there are no further data updates planned in eGo. From your point of view, is it worth implementing this update so that it works with / is in line with the rest of the DP?

ulfmueller commented 6 years ago

We are currently creating dp versions 0.4.4 and 0.4.5 (a clean run of 0.4.4). Right, that shall be the very last dataset generated in the open_eGo project. It would definitely be nice if there could be a further dp version which includes your changes. Very interesting! It would probably be nothing we would have time to evaluate in depth, but maybe it could be used in other projects. Do you think it would be too much effort to implement it in line with the rest of the DP? I can only say that we unfortunately will not have time to coordinate and/or adjust things in the 'rest of the DP'.

Ludee commented 6 years ago

We will try to make this task compatible with the eGoDP. But it will be separated to not mess anything up ;)

I need a European OSM version anyway, so I will prepare a new upload from Geofabrik. We can choose between the latest monthly (europe-180801.osm.pbf) or the yearly LTS (europe-180101.osm.pbf).

Any wishes?

ulfmueller commented 6 years ago

I like the approach to make it compatible but without messing up stuff which is working and in use!!

I am a bit curious: what is your workflow to separate it (code-wise and data-wise)? Code-wise: creating a new branch within the current data processing (vs. creating a new repo)? Data-wise: creating new tables in the existing model_draft and other existing schemata (vs. creating a new database/schema, probably not)?

I somehow think it might be a good idea to share this workflow in a transparent way (upfront) to avoid any possible crashes.

nesnoj commented 6 years ago

Good point! Code: A new branch should suit our needs. Data: To avoid interference with current dp runs, we need to create new tables: same names with an extra prefix or something. Furthermore, all functions that alter other tables which we do not need for testing should be commented out.
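The prefix idea could look like the following sketch, which derives prefixed working-copy table names and the PostgreSQL statements to create them. The prefix and the table names are made up; the real eGoDP tables would be substituted:

```python
# Sketch of the data-side separation: derive prefixed working-copy table
# names so that test runs never touch the tables used by current dp runs.
# The prefix and the schema/table names are illustrative only.

PREFIX = "osm_res_"   # hypothetical prefix for this feature branch

def prefixed(qualified_name, prefix=PREFIX):
    """'model_draft.ego_supply_res' -> 'model_draft.osm_res_ego_supply_res'."""
    schema, table = qualified_name.split(".", 1)
    return f"{schema}.{prefix}{table}"

tables = ["model_draft.ego_supply_res", "model_draft.ego_grid_mvgd"]
copy_sql = [
    f"CREATE TABLE {prefixed(t)} AS TABLE {t};"  # PostgreSQL table-copy syntax
    for t in tables
]
```

Keeping the copies in the same schema (only renamed) means the altered scripts need a single name rewrite rather than new connection or schema handling.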

@Ludee: We need to know which tables are affected - is the BPMN still valid? @Ludee: I'd prefer the latest version of OSM.

nesnoj commented 6 years ago

I prefer an update of the eGo data with the BNetzA dataset only. Although we carry over errors from the eGo data, I'd avoid completely re-evaluating the vast RES data from the TSOs again (OPSD was created to do this).

Concerning licenses of new RES data:

1) BNetzA:

2) Netztransparenz: According to the imprint, data must not be published elsewhere (I could not find other licensing notes): "The content and design of the web pages are protected by copyright. Reproduction of the pages or their contents requires the prior written consent (by e-mail) of the German transmission system operators, unless reproduction is permitted by law anyway." No idea whether it is permitted by law here or not.

nesnoj commented 6 years ago

An updated proposal which incorporates all aspects from above: (please amend if something is missing)

Updated types

Current approach, scripts and data

Steps

Make sure you use versioned data from v0.4.3, as later versions may still be incomplete (#318)!

Questions

@Ludee: If you have comments or amendments, please state here asap.

nesnoj commented 6 years ago

Ok, due to limited time we agreed on a minimal version:

Part 1

Part 2

Ludee commented 6 years ago

Just checked: M1-1 for biogas is completed and the data is valid! Now more methods can be created "quite" easily. I wish you good luck!

christian-rli commented 6 years ago

M3b is complete and produces valid data. The latest commit includes a readme file that describes the setup, M1-1 and M3b. I hope it helps in understanding the SQL snippets.

nesnoj commented 6 years ago

Thanks a lot @christian-rli ! I'll have a look at the data.

The next step is to use the new OPSD dataset by @wolfbunke. I expect some scripts from eGo can be reused. I'll get back to you.