Closed roughani closed 8 years ago
Good question. My assumptions (all worth checking): a) a 'federating hub' should include a baseline set of data in HSDS format; b) it will be easier for this hub to share updates to and from systems that have mapped their internal schemas to HSDS; and c) yes, I'm told that writing scripts to automate translation (in and out) is the straightforward way to achieve this interoperability.
We do have some such scripts somewhere to serve as examples, although they may be for pre-1.0 versions of HSDS...
This is really a question for Greg, but anyone's welcome to chime in. We're recommending a federated approach that will require a core application to consume community resource directory data from a network of open data portals and 2-1-1 systems. So it occurs to me that the source data (especially from open data portals) does not need to already be in HSDS format. In other words, why impose the extra workload on staff when a script could automate transformation of the source data and map its fields into the HSDS structure? This is especially relevant given the relational structure of HSDS versus the typical flat-file structure of open data portals.
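To make the flat-to-relational idea concrete, here is a minimal sketch of such a transformation script. It assumes a hypothetical flat portal export (one row per service, with organization details denormalized into each row) and emits HSDS-style `organization` and `service` tables linked by `organization_id`. The column subset is illustrative only, not the full HSDS spec, and the sample data is invented:

```python
import csv
import io
import uuid

# Hypothetical flat export from an open data portal: one row per service,
# with the parent organization's details repeated on every row.
FLAT_CSV = """org_name,service_name,address,city
Food Bank of Example County,Emergency Food Pantry,123 Main St,Exampleville
Food Bank of Example County,Mobile Meals,123 Main St,Exampleville
"""

def transform(flat_csv):
    """Split a flat portal export into HSDS-style relational tables.

    De-duplicates organizations by name and links each service row to
    its organization via a generated id, mirroring the HSDS pattern of
    separate organization/service tables joined on organization_id.
    """
    organizations = {}  # org name -> organization record
    services = []
    for row in csv.DictReader(io.StringIO(flat_csv)):
        org = organizations.setdefault(row["org_name"], {
            "id": str(uuid.uuid4()),
            "name": row["org_name"],
        })
        services.append({
            "id": str(uuid.uuid4()),
            "organization_id": org["id"],
            "name": row["service_name"],
        })
    return {
        "organization": list(organizations.values()),
        "service": services,
    }

tables = transform(FLAT_CSV)
# Two flat rows collapse into one organization with two linked services.
print(len(tables["organization"]), len(tables["service"]))  # → 1 2
```

A real script would read the portal's actual column names from a per-source mapping config and cover the other HSDS tables (location, phone, etc.), but the core move is the same: de-duplicate the repeated entities and replace the denormalized columns with foreign keys.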