-
For every SPARQL extraction (and also the SPARQL transformation and LOADER), there must be a log.debug message containing the request sent to the SPARQL endpoint. This is currently missing.
This is needed so that I can debu…
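Roughly what such logging could look like, as a sketch: the only real API here is SLF4J's log.debug; the class, method, and transport call are illustrative placeholders, not the actual extractor code.
```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class SparqlEndpointClient {

    private static final Logger LOG = LoggerFactory.getLogger(SparqlEndpointClient.class);

    /**
     * Sends the given SPARQL query to the endpoint. The full request is logged at
     * DEBUG level before sending, so a failed extraction can be replayed by hand.
     */
    public String execute(String endpointUrl, String sparqlQuery) {
        // Log the exact request that will be sent to the SPARQL endpoint.
        LOG.debug("Sending SPARQL request to endpoint {}:\n{}", endpointUrl, sparqlQuery);
        return send(endpointUrl, sparqlQuery);
    }

    private String send(String endpointUrl, String sparqlQuery) {
        // Actual HTTP transport omitted; not relevant to the logging requirement.
        throw new UnsupportedOperationException("transport not shown");
    }
}
```
Logging before the call (not after) matters, because the query is then captured even when the endpoint request itself fails.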
-
Page:
https://grips.semantic-web.at/display/UNVI/Core+DPUs
Describe all available core DPUs and outline an idea of the implementation.
-
-
Motivation:
I have a SPARQL query which CONSTRUCTs certain data X based on named graph A. However, for the construction of the data X, another named graph B (another input) is needed as well. However, I cann…
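For illustration only, this is the general query shape being described: a single CONSTRUCT reading from both named graph A and named graph B via GRAPH clauses. All URIs are placeholders, and the query is held in a Java string constant just for concreteness.
```java
/** Illustration only: a CONSTRUCT that needs two named graphs at once; all URIs are placeholders. */
public final class TwoGraphConstructExample {

    public static final String QUERY =
              "CONSTRUCT { ?s <http://example.org/enriched> ?label } \n"
            + "WHERE { \n"
            + "  GRAPH <http://example.org/graph/A> { ?s a <http://example.org/Record> . } \n"
            + "  GRAPH <http://example.org/graph/B> { ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label . } \n"
            + "}";

    private TwoGraphConstructExample() {
    }
}
```
The open question above is how named graph B can be made available to the query as an additional input alongside A.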
-
Currently, when we use the SPARQL transformer, we merge/add the data of all inputs (input, optional1, ..., optional3); but when this data is a large collection and we use only one (or two) of those inputs, we don´…
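A sketch of the alternative, assuming the OpenRDF Sesame Dataset API: the query dataset is built only from the inputs the query actually references, instead of from all of them. RDFDataUnit is reduced to a minimal stand-in for the project's interface, and usedInputs is a hypothetical, pre-filtered list.
```java
import java.util.List;

import org.openrdf.model.URI;
import org.openrdf.query.impl.DatasetImpl;

public final class UsedInputsDatasetBuilder {

    /** Minimal stand-in for the project's RDFDataUnit; only the one method needed here. */
    public interface RDFDataUnit {
        URI getDataGraph();
    }

    /**
     * Builds the query dataset from only the inputs the query actually uses,
     * so large optional inputs are not merged in needlessly.
     */
    public static DatasetImpl buildDataset(List<? extends RDFDataUnit> usedInputs) {
        DatasetImpl dataSet = new DatasetImpl();
        for (RDFDataUnit repository : usedInputs) {
            // Expose only this input's data graph to the query.
            dataSet.addDefaultGraph(repository.getDataGraph());
        }
        return dataSet;
    }

    private UsedInputsDatasetBuilder() {
    }
}
```
How the "used" inputs are determined is left open here; it could be as simple as checking which optional edges are connected, or inspecting the query for the graphs it mentions.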
-
Without changing anything, I got ..
I am not exactly sure about things like
```
for (RDFDataUnit repository : inputs) {
    URI dataGraphURI = repository.getDataGraph();
    dataSet.addDefaultGraph(dataGraphURI…
```
-
I tried to test on http://odcs.xrg.cz:8080/odcleanstore-test
I created a new pipeline: file extractor -> sparql transformer -> file loader and tried to run it. The execution failed because of error pa…
-
Remove this.context = new TestContext();
Wait, but such a setup cannot be in production code!! Use the TestEnvironment that Petr prepared. See the tutorial [1], or take inspiration from the tests for the SPARQL t…
-
Consult the documentation in intlib/documentation on the creation of DPUs; or Petr should know.
For the SPARQL/RDF LOADER, when the DPU is cancelled, all generated data should be deleted. At least you should t…
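A pattern sketch of the cancellation behaviour being asked for; this is not the ODCS loader API, and the cancellation flag and the write/delete calls are placeholders. The point is that the loader remembers what it has written so that, on cancel, everything generated so far can be deleted.
```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

public class CancellableLoader {

    private final AtomicBoolean cancelled = new AtomicBoolean(false);
    private final List<String> loadedGraphs = new ArrayList<>();

    /** Signalled from outside when the DPU execution is cancelled. */
    public void cancel() {
        cancelled.set(true);
    }

    public void load(List<String> graphsToLoad) {
        for (String graph : graphsToLoad) {
            if (cancelled.get()) {
                // Cancellation: remove everything this run has produced so far.
                rollback();
                return;
            }
            writeGraph(graph);       // placeholder for the actual SPARQL/RDF upload
            loadedGraphs.add(graph); // remember it so it can be deleted on cancel
        }
    }

    private void rollback() {
        for (String graph : loadedGraphs) {
            deleteGraph(graph);      // placeholder for a DROP GRAPH / delete request
        }
        loadedGraphs.clear();
    }

    private void writeGraph(String graph)  { /* upload omitted */ }
    private void deleteGraph(String graph) { /* deletion omitted */ }
}
```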
-
(on revision 13d32633)
```
-------------------------------------------------------------------------------
Test set: cz.cuni.mff.xrg.odcs.transformer.SPARQL.SPARQLTest
-------------------------------…
```