ucd-cws / calvin-network-tools

Command line tools for calvin-network-data and calvin-network-app. Includes the prm tool for exporting PRI and DSS files.
MIT License

Should we add an overview network.geojson file? #19

Closed. qjhart closed this issue 8 years ago.

qjhart commented 8 years ago

Rather than use a directory as our starting point, I wonder if we could instead use a geojson file as our starting point. In that case, we would have something like:

{ "type": "GeometryCollection",
    "geometries": [
      {"$ref":'data/sr/c100/node.geojson'},
      {"$ref":'data/sr/c100/node.geojson'},
      {"$ref":'data/sr/c100/node.geojson'},
      ...
     {"$ref":'data/sr/c100-d100/link.geojson'},
     ...
     {$ref:'data/sr/region.geojson'}
     ...
    ]
  }

This would give us two things: we could organize the directories however we want, and we could have multiple networks in a single branch by having different network configurations (see the example below).
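For example (the file names here are just hypothetical), two overview files could live in the same branch and point at overlapping directories:

network-full.geojson:

{ "type": "GeometryCollection",
  "geometries": [
    {"$ref": "data/sr/c100/node.geojson"},
    {"$ref": "data/sr/c100-d100/link.geojson"}
  ]
}

network-nodes-only.geojson:

{ "type": "GeometryCollection",
  "geometries": [
    {"$ref": "data/sr/c100/node.geojson"}
  ]
}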

One drawback I see is that maybe we don't want to include regions in the network at all; maybe we want to have them as a separate presentation layer.

qjhart commented 8 years ago

In that case, we'd also need a new set of tools to build that file, as a helper I suppose.
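Roughly something like this (a minimal sketch; the file layout, naming, and CLI shape are all assumptions, not a settled design):

// build-overview.ts -- hypothetical helper that walks a data directory and
// emits an overview file whose entries are $ref pointers to the geojson it finds.
import { promises as fs } from 'fs';
import * as path from 'path';

// Recursively collect *.geojson files under a data directory.
async function collectGeojson(dir: string): Promise<string[]> {
  const out: string[] = [];
  for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) out.push(...await collectGeojson(full));
    else if (entry.name.endsWith('.geojson')) out.push(full);
  }
  return out;
}

// Write the overview file with $ref entries relative to its own location.
async function buildOverview(dataDir: string, outFile: string): Promise<void> {
  const files = await collectGeojson(dataDir);
  const overview = {
    type: 'GeometryCollection',
    geometries: files.map(f => ({ $ref: path.relative(path.dirname(outFile), f) })),
  };
  await fs.writeFile(outFile, JSON.stringify(overview, null, 2));
}

buildOverview('data', 'network.geojson').catch(err => { console.error(err); process.exit(1); });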

jrmerz commented 8 years ago

+1

I like where you are going with this. Starting with the regions: they are a presentation layer and should be moved, perhaps considered part of the app, i.e. placed in the app repo. I think the GeometryCollection (though I think you mean FeatureCollection?) would be great for us and flexible as a general use model. At that point the 'crawl' step would always be a simple three phases:

1) Read the FeatureCollection.
2) Read the $ref nodes/links from the collection.
3) Optional: read the $ref csv/data files.

(Side question: does the $ref spec have a 'type' we can associate?)
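As a rough sketch of those three phases (the $ref resolution and the properties.data csv convention below are assumptions, not what the current tools do):

// crawl.ts -- illustrative 3-phase crawl over an overview file.
import { promises as fs } from 'fs';
import * as path from 'path';

interface RefEntry { $ref: string; type?: string; }

async function readJson(file: string): Promise<any> {
  return JSON.parse(await fs.readFile(file, 'utf8'));
}

async function crawl(overviewFile: string) {
  const baseDir = path.dirname(overviewFile);

  // Phase 1: read the overview collection.
  const overview = await readJson(overviewFile);

  // Phase 2: resolve each $ref into the node/link geojson it points at.
  const features: any[] = [];
  for (const entry of overview.geometries as RefEntry[]) {
    const resolved = await readJson(path.join(baseDir, entry.$ref));
    // A referenced file might itself be a FeatureCollection of nodes/links,
    // or a single feature; flatten either way.
    if (resolved.type === 'FeatureCollection') features.push(...resolved.features);
    else features.push(resolved);
  }

  // Phase 3 (optional): follow data references, e.g. csv files named in feature
  // properties. The "properties.data" convention here is purely illustrative.
  for (const feature of features) {
    const dataRef = feature?.properties?.data;
    if (typeof dataRef === 'string' && dataRef.endsWith('.csv')) {
      feature.properties.data = await fs.readFile(path.join(baseDir, dataRef), 'utf8');
    }
  }

  return { type: 'FeatureCollection', features };
}

crawl('network.geojson').then(net => console.log(`${net.features.length} features`));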

qjhart commented 8 years ago

The $ref thing I only really see used in JSON Schema :(. So I think we are off on our own a bit, and could easily add a type as well.
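For example, something like this (the type values are just placeholders):

{"$ref": "data/sr/c100/node.geojson", "type": "nodes"},
{"$ref": "data/sr/c100-d100/link.geojson", "type": "links"}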

jrmerz commented 8 years ago

We could simply impose strict file extensions.
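i.e. infer the type from the filename, roughly like this (the extension conventions here are just a guess):

// Hypothetical extension-to-type mapping; the actual naming conventions are
// still up for discussion.
function typeFromRef(ref: string): 'nodes' | 'links' | 'unknown' {
  if (ref.endsWith('node.geojson')) return 'nodes';
  if (ref.endsWith('link.geojson')) return 'links';
  return 'unknown';
}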

qjhart commented 8 years ago

Super. In that case, though, I think we need to think about our prmnames. We may need to be more serious about our identifiers, instead of just hoping that our prmnames happen to be unique.

jrmerz commented 8 years ago

Sounds like a task for the new crawler. It would need to:

1) Figure out what the UID field is (if we don't make it standard).
2) Complain about duplicates.
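A sketch of that check (the id field is passed in, since we haven't standardized it yet; prmname today, maybe something else later):

// Hypothetical uniqueness check for the crawler: returns any id value that
// appears on more than one feature.
function checkUnique(features: Array<{ properties?: Record<string, any> }>, idField: string): string[] {
  const seen = new Map<string, number>();
  for (const f of features) {
    const id = f.properties?.[idField];
    if (id != null) seen.set(String(id), (seen.get(String(id)) ?? 0) + 1);
  }
  return Array.from(seen.entries()).filter(([, n]) => n > 1).map(([id]) => id);
}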

qjhart commented 8 years ago

Yes, exactly. I think we should have two items: a uuid, which is a URL, and a shorthand id as well. The uuid could potentially be a JSON reference, and the id should be unique to the network. I don't think we want to limit the size of the uid, but if we don't, we need an additional step to convert it into something allowable in an hecprm run. I'm not sure of the best way to do that.
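One possible shape for that conversion step, assuming (purely for illustration; the real hecprm length and character limits still need to be confirmed) that the run needs short plain-ASCII names:

import { createHash } from 'crypto';

// Illustrative only: derive a short, deterministic name from a long id/uuid
// by sanitizing it and, if it is too long, truncating plus appending a hash suffix.
function toShortName(id: string, maxLen = 16): string {
  const cleaned = id.replace(/[^A-Za-z0-9_]/g, '_');
  if (cleaned.length <= maxLen) return cleaned;
  const hash = createHash('sha1').update(id).digest('hex').slice(0, 6);
  return `${cleaned.slice(0, maxLen - 7)}_${hash}`;
}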

Interesting about the crawler. I do think we need another, non-prm-specific tool, though I'm not sure /usr/local/bin/crawler is a great name. There is overlap with prm; for example, I have some editing ideas in place that maybe belong in both, but that may be confusing.