FAIRDataTeam / FAIRifier

A tool to make data FAIR
MIT License

Batch execution of CSV to RDF conversion #2

Open · mikel-egana-aranguren opened this issue 7 years ago

mikel-egana-aranguren commented 7 years ago

We are working with the government on a Linked Open Data project in which they need to convert CSVs to RDF (https://github.com/opendata-euskadi). Many of the CSVs have structures that don't change, or change very little (only the cell values are updated), so the requirement is to define a mapping once and execute it in batch mode every time the data is updated. Is this possible in FAIRifier?

More specifically, I need to be able to programmatically execute the history JSON, in which I have defined the RDF skeleton (graphically), against a given CSV.
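
To make the ask concrete, something along these lines is what I would like to be able to run. This is only a minimal sketch that assumes FAIRifier keeps stock OpenRefine's HTTP command API (default port 3333 and the /command/core/apply-operations endpoint taking a form-encoded `operations` parameter); I don't know whether FAIRifier actually exposes the same paths:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Replay a saved operation history (exported as JSON from the GUI) on an
// existing project, using OpenRefine's apply-operations command endpoint.
public class ApplyHistory {
    public static void main(String[] args) throws Exception {
        String base = "http://localhost:3333";                        // OpenRefine/FAIRifier instance
        String projectId = args[0];                                   // project created from the new CSV
        String operationsJson = Files.readString(Path.of(args[1]));   // exported history JSON

        String form = "operations=" + URLEncoder.encode(operationsJson, StandardCharsets.UTF_8);
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(base + "/command/core/apply-operations?project=" + projectId))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("apply-operations -> " + response.statusCode() + ": " + response.body());
    }
}
```

If that endpoint (or an equivalent) is available, we could drive the whole batch from our own code.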

Thanks

PS: The other solutions we are considering are Grafter (http://grafter.org/) and Ontotext's OntoRefine (also based on OpenRefine), which I really like because the conversion is defined as a SPARQL INSERT query against a temporary SPARQL endpoint that exposes the "raw" CSV data as RDF. Executing OntoRefine programmatically is not straightforward, though.
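
For comparison, the OntoRefine-style mapping is just a SPARQL INSERT over the rows exposed as RDF. The query, predicates, and endpoint URL below are invented purely to illustrate that pattern (they are not OntoRefine's actual vocabulary or API), wrapped in Java since that is our platform:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative sketch only: the CSV->RDF mapping is a plain SPARQL INSERT run
// against an endpoint that exposes the raw rows as RDF. Endpoint URL and the
// row/column predicates are invented placeholders, not OntoRefine's vocabulary.
public class SparqlInsertMapping {
    public static void main(String[] args) throws Exception {
        String update = """
            PREFIX ex:  <http://example.org/ns#>
            PREFIX row: <http://example.org/raw-csv/column#>
            INSERT {
              ?obs a ex:Observation ;
                   ex:municipality ?name ;
                   ex:population   ?pop .
            }
            WHERE {
              # each solution corresponds to one CSV row in the temporary endpoint
              ?r row:municipality ?name ;
                 row:population   ?pop .
              BIND(IRI(CONCAT("http://example.org/obs/", ENCODE_FOR_URI(?name))) AS ?obs)
            }
            """;

        // SPARQL 1.1 Protocol: POST the update body as application/sparql-update.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:7200/repositories/raw-csv/statements")) // placeholder endpoint
                .header("Content-Type", "application/sparql-update")
                .POST(HttpRequest.BodyPublishers.ofString(update))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("update -> " + response.statusCode());
    }
}
```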

Shamanou commented 6 years ago

This feature is in the works; try the development version. You will be able to record your RDF skeleton and apply the changes to a new file by exporting the JSON and importing it into the new project.

mikel-egana-aranguren commented 6 years ago

Is there a way of doing it programmatically? We will have many different CSVs, each with its own defined skeleton, and we need to execute those conversions periodically from our Java platform. We can't use the GUI.
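
For what it's worth, this is roughly the shape of what we would call from Java for each data update. It is only a sketch of the project-creation step against stock OpenRefine's command API (create-project-from-upload via a multipart upload, with the project id taken from the redirect), again assuming FAIRifier exposes the same endpoints; the saved history would then be replayed with apply-operations as sketched above:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;

// Create a new OpenRefine/FAIRifier project from an updated CSV and return its id,
// so the saved operation history can be applied to it afterwards.
public class CreateProjectFromCsv {
    public static void main(String[] args) throws Exception {
        Path csv = Path.of(args[0]);
        String base = "http://localhost:3333";
        String boundary = "----batch" + System.nanoTime();

        // Hand-rolled multipart/form-data body: the CSV file plus a project name.
        String body =
              "--" + boundary + "\r\n"
            + "Content-Disposition: form-data; name=\"project-file\"; filename=\"" + csv.getFileName() + "\"\r\n"
            + "Content-Type: text/csv\r\n\r\n"
            + Files.readString(csv) + "\r\n"
            + "--" + boundary + "\r\n"
            + "Content-Disposition: form-data; name=\"project-name\"\r\n\r\n"
            + csv.getFileName() + "\r\n"
            + "--" + boundary + "--\r\n";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(base + "/command/core/create-project-from-upload"))
            .header("Content-Type", "multipart/form-data; boundary=" + boundary)
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        // OpenRefine answers with a redirect to /project?project=<id>; parse the id from Location.
        HttpResponse<Void> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.discarding());
        String location = response.headers().firstValue("Location").orElseThrow();
        String projectId = location.substring(location.indexOf("project=") + "project=".length());
        System.out.println("created project " + projectId + " from " + csv);
        // Next step: POST the history JSON to /command/core/apply-operations?project=<projectId>,
        // then export the RDF with whatever exporter FAIRifier registers (name not shown here).
    }
}
```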