We've been using LP-ETL for some months at Mines Saint-Étienne, with quite satisfactory results. We seem to have hit a scalability issue, though: we get a StackOverflowError when a simple Mustache template is applied to an RDF dataset of about 10k triples.
Here is a minimal example. Our input dataset includes triples of the form:
ex:Concept1 a skos:Concept ; skos:prefLabel "Concept 1" .
ex:Concept2 a skos:Concept ; skos:prefLabel "Concept 2" .
ex:Concept3 a skos:Concept ; skos:prefLabel "Concept 3" .
# etc. until ex:Concept10000
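For reproduction, the input file can be generated with a short helper like the following (a hypothetical sketch, not part of our pipeline; the class and file names are ours):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class GenerateConcepts {

    // Build a Turtle document with n skos:Concept resources,
    // matching the shape of the triples shown above.
    static String turtle(int n) {
        StringBuilder sb = new StringBuilder();
        sb.append("@prefix ex: <http://example.org/> .\n");
        sb.append("@prefix skos: <http://www.w3.org/2004/02/skos/core#> .\n");
        for (int i = 1; i <= n; i++) {
            sb.append(String.format(
                "ex:Concept%d a skos:Concept ; skos:prefLabel \"Concept %d\" .%n",
                i, i));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // 10k triples are enough to trigger the error on our setup.
        Files.writeString(Path.of("concepts.ttl"), turtle(10_000));
    }
}
```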
The template is as follows (applied to instances of skos:Concept):
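A minimal template of the kind we mean, assuming the component exposes skos:prefLabel under that key in the Mustache context (the exact property-access syntax may differ in LP-ETL), would be:

```
<li>{{skos:prefLabel}}</li>
```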
When running a pipeline on that dataset and template, we get the following error:
2021-09-01 07:15:16,720 [asynchExecutor-1] ERROR c.l.e.e.e.ExecutionObserver - onExecuteComponentFailed : http://localhost:8080/resources/pipelines/1629976398424/component/f063-89f7
com.linkedpipes.etl.executor.ExecutorException: PipelineComponent execution failed.
at com.linkedpipes.etl.executor.component.SequentialComponentExecutor.run(SequentialComponentExecutor.java:42)
at java.base/java.lang.Thread.run(Thread.java:830)
Caused by: com.linkedpipes.etl.executor.api.v1.LpException: Execution failed.
at com.linkedpipes.etl.executor.api.v1.component.SequentialWrap.execute(SequentialWrap.java:51)
at com.linkedpipes.etl.executor.component.SequentialComponentExecutor.run(SequentialComponentExecutor.java:38)
... 1 common frames omitted
Caused by: java.lang.StackOverflowError: null
at java.base/java.util.HashMap.putVal(HashMap.java:629)
at java.base/java.util.HashMap.put(HashMap.java:612)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.buildEmptyDataObject(DataObjectLoader.java:132)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.createDataObject(DataObjectLoader.java:125)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.transformResource(DataObjectLoader.java:172)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.buildNonEmptyDataObject(DataObjectLoader.java:145)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.createDataObject(DataObjectLoader.java:127)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.transformResource(DataObjectLoader.java:172)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.buildNonEmptyDataObject(DataObjectLoader.java:145)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.createDataObject(DataObjectLoader.java:127)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.transformResource(DataObjectLoader.java:172)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.buildNonEmptyDataObject(DataObjectLoader.java:145)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.createDataObject(DataObjectLoader.java:127)
at com.linkedpipes.plugin.transformer.mustache.DataObjectLoader.transformResource(DataObjectLoader.java:172)
...
The transformResource / createDataObject / buildNonEmptyDataObject frames repeat until the stack is exhausted, which suggests the recursion depth grows with the number of resources and would explain why the error only appears on larger datasets. A larger thread stack for the executor JVM (the standard -Xss option) might postpone the error, but the recursion itself looks like the root cause.