RaggedStaff opened 5 months ago
I think to move forward there are two main blockers for now.
2. An understanding of the production requirements from the FDC Governance Circle, such as where the triple store should be hosted and to what extent it should be integrated with the OFN UK instance. This will help narrow down the time and cost estimates for Milestones 2 & 3 listed in the table above.
@jgaehring The triple store will be separate from all participating platforms. I'd have a preference to stand something up on the Infomaniak Jelastic Cloud instance we're using to host the Shopify apps.
At this stage we aren't trying to integrate with anything... just (securely) store the data somewhere so it can be managed by the members as their data commons in the future.
Let's have a quick chat about what might work... are you around tomorrow? I'm free 1-2pm or 4-4:30 (UK).
On the other blocker - @lecoqlibre is on vacation this week, but I think he'll be around next week... maybe we should all talk together then?
- I need to consult with Maxime to better understand how mixins work in assemblee-virtuelle/semantizer-typescript, so that something compatible can be included in the TS connector.
For my own sake, I'm just noting the snapshot of the semantizer's mixin implementation as it stands right now, although it is considered unstable:
@jgaehring Notes from our call:
We agreed to modify the export function(s) in the static area of connector-codegen (for TS, Ruby & PHP) to check a parameter and, if true, POST the exported JSON-LD to our triple store.
We'll start with the PHP version (Big Barn), then TS, then Ruby.
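For reference, here's a minimal TypeScript sketch of that flow, assuming an export function that returns the serialized JSON-LD; the option names (`capture`, `captureUrl`) and the wrapper itself are placeholders for illustration, not the actual connector-codegen API:

```ts
// Illustrative only: names and signatures are assumptions, not the real API.
async function exportWithCapture(
  connector: { export: (subjects: object[]) => Promise<string> },
  subjects: object[],
  options: { capture?: boolean; captureUrl?: string } = {}
): Promise<string> {
  // Normal export to JSON-LD, exactly as before.
  const jsonld = await connector.export(subjects);

  // If the flag is set, also POST the serialized JSON-LD to the triple store.
  if (options.capture && options.captureUrl) {
    await fetch(options.captureUrl, {
      method: "POST",
      headers: { "Content-Type": "application/ld+json" },
      body: jsonld,
    });
  }

  return jsonld;
}
```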
As discussed in today's tech call, this is the relevant part of the TypeScript codegen implementation (pending merge of PR #20) where the call to the semantizer's .export() method will be wrapped with the Data Capture logic, which basically just needs to call .export() again with the new destination:
That "wrapper" can be moved lower down the stack into the semantizer's internals once it reaches its next stable release, and that later change shouldn't require breaking changes to either the connector's or the semantizer's APIs. Therefore, I believe there should be no problem implementing the data capture feature against the existing alpha version of the semantizer (costs prohibit upgrading it in the near future regardless), and doing so shouldn't incur significant tech debt once the stable release becomes available.
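Just to illustrate the shape of that wrapper (the export(target) signature here is an assumption for the sketch, not the semantizer's actual API):

```ts
// Hypothetical signature: export(target?) serializes to the given destination.
interface Exporter {
  export(target?: string): Promise<string>;
}

// Data Capture wrapper: run the normal export, then call export() again
// with the capture destination. If this logic later moves inside the
// semantizer, callers keep using exporter.export() unchanged.
function withDataCapture(exporter: Exporter, captureTarget: string): Exporter {
  return {
    async export(target?: string): Promise<string> {
      const result = await exporter.export(target); // original destination
      await exporter.export(captureTarget);         // capture destination
      return result;
    },
  };
}
```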
What do you think about using the observer pattern to decouple the data-capture feature from the connector itself?
We would have a method to register a new observer for the export method, like connector.registerCallbackForExport(callback: (exported: string) => void).
Each time the connector.export() method is called, every registered callback will be triggered. This mechanism can be used for any other export-related feature.
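A minimal sketch of what that registration mechanism could look like inside the connector (the serialization below is a stand-in for illustration, not the real connector's export logic):

```ts
// Minimal observer-pattern sketch; assumes export() produces a string.
type ExportCallback = (exported: string) => void;

class Connector {
  private exportCallbacks: ExportCallback[] = [];

  // Register an observer that is notified after every export.
  registerCallbackForExport(callback: ExportCallback): void {
    this.exportCallbacks.push(callback);
  }

  async export(subjects: object[]): Promise<string> {
    const exported = JSON.stringify(subjects); // stand-in for real JSON-LD serialization
    // Notify every registered observer; data capture is just one of them.
    for (const callback of this.exportCallbacks) {
      callback(exported);
    }
    return exported;
  }
}
```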
In the client code you want to capture data from, you just have to register a handler of your choice (which can be implemented in a separate package, even a DFC-related one if you want, like @datafoodconsortium/connector-data-capture).
You can also export a pre-configured Connector class from this package, so your clients can just import it without configuring it:
```ts
import { Connector } from "@datafoodconsortium/connector-data-capture";

const connector = new Connector();
connector.export(...); // this will trigger the data-capture handler
```
@jgaehring @RaggedStaff
Discussed in https://github.com/orgs/datafoodconsortium/discussions/30
Further discussions have highlighted that the Semantizer libraries are being upgraded to support mixins. This is a dependency for this work: the Data Capture functionality will be included as a mixin.
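As a rough illustration only (the Semantizer mixin API is still being upgraded, so this is just the generic TypeScript mixin pattern, and captureUrl is a placeholder), data capture as a mixin could look something like:

```ts
// Generic TypeScript mixin pattern; the actual Semantizer mixin API may differ.
type Constructor<T = {}> = new (...args: any[]) => T;

interface Exportable {
  export(): Promise<string>;
}

// Mixin that adds Data Capture behaviour to any exportable base class.
function DataCaptureMixin<TBase extends Constructor<Exportable>>(
  Base: TBase,
  captureUrl: string
) {
  return class extends Base {
    async export(): Promise<string> {
      const exported = await super.export();
      // Forward a copy of the exported JSON-LD to the triple store.
      await fetch(captureUrl, {
        method: "POST",
        headers: { "Content-Type": "application/ld+json" },
        body: exported,
      });
      return exported;
    }
  };
}
```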