thedatahub / Datahub-Factory

Datahub::Factory - Transport metadata between Collection Management Systems and the Datahub

Add: configuration file support #4

Closed: netsensei closed this issue 7 years ago

netsensei commented 7 years ago

Commands now take a boatload of options and arguments which could, or should, be encapsulated in one or more configuration files.

Given that we leverage Catmandu, we should research whether we can take advantage of its store configuration support.
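For reference, Catmandu can already pick up store definitions from a catmandu.yml next to the application. Something along these lines, purely as a sketch (the store name, backend and options are placeholders, not anything we ship today):

```yaml
# catmandu.yml -- hypothetical store definition; names and options are placeholders.
store:
  datahub:
    package: MongoDB        # resolves to Catmandu::Store::MongoDB
    options:
      database_name: datahub
```

Code could then call `Catmandu->store('datahub')` instead of wiring the store together from a pile of switches.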

pieterdp commented 7 years ago

We could do that, but as almost all options depend on which storage/importer pair you use, I think command line switches (as we use now) are better. For sensitive information we could use ENV. However, it's still worth a look.
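Rough sketch of the ENV idea (the switch and variable names are made up for illustration, not what the code currently exposes):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use Getopt::Long;

# Hypothetical: take the secret from a switch if given, otherwise fall back
# to an environment variable so it never has to sit in a config file.
GetOptions('oauth_secret=s' => \my $oauth_secret);
$oauth_secret //= $ENV{DATAHUB_OAUTH_SECRET};

die "No OAuth client secret given: set DATAHUB_OAUTH_SECRET or pass --oauth_secret\n"
    unless defined $oauth_secret;
```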

netsensei commented 7 years ago

If we leverage configuration files, it would be easier to deploy this tool via Puppet or Ansible.

We could create an Ansible template of the config file, and store app-specific details in Ansible's variables.yml file.

Deploying a new pipeline between a new API and the Datahub then entails nothing more than adding the necessary context-specific variables to the Puppet role and deploying that to the server. There is no need to log in to the server directly to make the changes.

CLI options are still needed, but at this point we have to pass a ton of them to provide a specific context (OAuth, importer, file, fix, etc.).
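Roughly, the Ansible side could look like this. The section and variable names are invented for illustration; the real pipeline configuration may end up structured differently:

```ini
; templates/pipeline.ini.j2 -- Jinja2 template rendered by Ansible.
; Section and option names are placeholders, not a spec of the real file.
[Importer]
plugin = {{ datahub_importer }}

[Exporter]
plugin = Datahub
datahub_url = {{ datahub_url }}
oauth_client_id = {{ datahub_oauth_client_id }}
oauth_client_secret = {{ datahub_oauth_client_secret }}
```

The matching values (the importer, the Datahub URL, the OAuth credentials) would then live in the role's variables.yml, so adding a pipeline means adding variables and re-running the play.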

netsensei commented 7 years ago

Great. But we do need better documentation of the options you can put in a pipeline configuration and what exactly they do. Perhaps inline documentation in the example pipeline.ini?

Also, what happens if I leave required options out of my file? Will there be a comprehensible error message, or just raw Perl output on stderr? Does the code account for those cases too?
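For what it's worth, one possible shape for that validation, purely as a sketch (the option names and the use of Config::Simple are assumptions here, not a description of the current code):

```perl
use strict;
use warnings;
use Config::Simple;

# Hypothetical sketch: check required options up front so the user gets one
# readable message instead of a Perl error further down the pipeline.
my $config_file = 'pipeline.ini';
die "Configuration file $config_file does not exist\n" unless -e $config_file;

my $cfg = Config::Simple->new($config_file);

# These option names are placeholders for whatever a pipeline really requires.
my @required = ('Importer.plugin', 'Exporter.plugin', 'Exporter.datahub_url');
my @missing  = grep { !defined $cfg->param($_) } @required;

die "$config_file is missing required options: " . join(', ', @missing) . "\n"
    if @missing;
```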

pieterdp commented 7 years ago

Documentation and modules have been updated. Error handling is still an issue.

pieterdp commented 7 years ago

Error handling is fixed.