timrdf opened this issue 12 years ago
From Josh:
As for Ripple, yes, there are various ways to connect REST services to Ripple, but let's get a little clearer on integration. The input to Ripple is one or more RDF lists representing queries, and the output of Ripple is zero or more RDF lists representing solutions to the query. What you want, as far as I understand, is a service whose input is an RDF dataset and whose output is another RDF dataset, so the question is how to map lists<-->datasets appropriately for the application.
One possibility: if you give Ripple a dataset which contains all of the rdf:first and rdf:rest statements of a list, that's all it needs. With a little application-specific logic, we could identify the head of the list. We could then map a set of lists into an RDF dataset in the same way (just include all of their rdf:first and rdf:rest statements), if that is something you could consume at the other end. Or if you define a result ontology of some kind, the query results could be expressed in terms of the ontology.
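The lists <--> datasets mapping Josh describes can be made concrete: an RDF list is fully determined by its rdf:first and rdf:rest statements, so shipping those triples in a dataset suffices, as long as the consumer can identify the head node. A minimal Python sketch (triples as plain tuples; the blank-node labeling scheme is an illustrative assumption, not Ripple's actual API):

```python
# Sketch of the lists <-> datasets mapping: encode a list as
# rdf:first/rdf:rest triples, and recover it given the head node.
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
FIRST, REST, NIL = RDF + "first", RDF + "rest", RDF + "nil"

def list_to_triples(head, items):
    """Encode a Python list as rdf:first/rdf:rest triples.

    `head` names the first list node; the generated blank-node
    labels (_:b1, _:b2, ...) are hypothetical, for illustration.
    """
    triples, node = [], head
    for i, item in enumerate(items):
        nxt = NIL if i == len(items) - 1 else "_:b%d" % (i + 1)
        triples.append((node, FIRST, item))
        triples.append((node, REST, nxt))
        node = nxt
    return triples

def triples_to_list(triples, head):
    """Walk rdf:first/rdf:rest from `head` to recover the list."""
    first = {s: o for s, p, o in triples if p == FIRST}
    rest = {s: o for s, p, o in triples if p == REST}
    items, node = [], head
    while node != NIL:
        items.append(first[node])
        node = rest[node]
    return items
```

Round-tripping `list_to_triples("_:b0", ["a", "b", "c"])` through `triples_to_list` with head `"_:b0"` returns the original list, which is why the "include all rdf:first/rdf:rest statements plus application-specific logic to find the head" approach works.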
Sound reasonable?
Josh
We met and whiteboarded; see 2012-02-20-whiteboard.JPG.
Josh is awesome and offered to write a "hello world" Ripple FAqT service.
It will run in Java and use Restlet (or another framework). When run, it will deploy a service that accepts HTTP POSTs of RDF/XML or Turtle, grabs a void:Dataset URI from that graph, and calls the Ripple function:
Collection<RippleList> doRippleQuery (Sail cache, String query);
In the VoID hierarchy example, submit the two queries to compute the % coverage, then dump the cache in the HTTP response.
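The core step of that flow (accept the POSTed RDF, pull the void:Dataset URI out of the graph, hand it to the query function) can be sketched as follows. This is a hedged Python illustration: `extract_dataset_uri` and its regex approach are stand-ins for real RDF parsing, and the actual query call is the Java `doRippleQuery(Sail cache, String query)` above, not anything shown here.

```python
import re

# Illustrative handler step: find the URI typed as void:Dataset in the
# POSTed Turtle. A real service would parse the RDF properly (resolving
# prefixes like void:) instead of pattern-matching on full URIs.
VOID_DATASET = "http://rdfs.org/ns/void#Dataset"

def extract_dataset_uri(turtle):
    """Naively find `<uri> a <...void#Dataset>` in a Turtle string.

    Assumes the void:Dataset type is written as a full URI; returns
    the subject URI, or None if no such statement is found.
    """
    pattern = r"<([^>]+)>\s+a\s+<%s>" % re.escape(VOID_DATASET)
    m = re.search(pattern, turtle)
    return m.group(1) if m else None
```

For example, given the payload `<http://example.org/ds> a <http://rdfs.org/ns/void#Dataset> .`, the function returns `http://example.org/ds`, which the service would then pass along with the query to the Ripple call before dumping the cache into the response.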
This will be ready in a week or two, b/c of CSHALS. :-)
Ripple is spectacular at crawling Linked Data. It has some strengths over the Python/SuRF/rdflib stack, which was the first way we implemented FAqT services. Can we add Ripple as an alternative way to provide FAqT Services?
https://github.com/timrdf/DataFAQs/wiki/FAqT-Service-using-Ripple