rainhead opened this issue 6 years ago
Hi Peter!
That depends what kind of roadmap you’re looking for!
In brief: the team is aiming for initial production deployment in Q2, and some amount of automated syncing in Q3.
The API is not yet stable, and for that reason is not yet published to crates.io, but the changes should be relatively easy to follow if you choose to take an early dependency. For example, the query result structs will soon grow to support structured query output.
Peter's co-conspirator here—that sounds great! A question related to our use case, which I haven't dug deep enough on to answer for myself: (is there already) / (are there plans for) a within-Rust Datalog-like DSL via macros?
> Peter's co-conspirator here—that sounds great! A question related to our use case, which I haven't dug deep enough on to answer for myself: (is there already) / (are there plans for) a within-Rust Datalog-like DSL via macros?
No, not really, and I think that would be hard. Our pipeline for turning Datalog into SQL is staged, but we get fairly quickly to wanting to turn idents (`:key/word`) into entids (`1000`), and it's not easy to stage that smoothly at compile time. But we (I!) haven't really thought about this very much; if you think this is natural/not hard, please do tell!
We do have macrology for things like `var!(:foo/bar)`.
It makes sense to have a compiler stage for parsing queries — the parsed representation of the query is not linked to a particular store, so it's possible to do something like:

```rust
store.q_once(q!([:find ?x :where [?x :foo/bar 15]]), None)
```

and get compile-time error reporting. But doing so means running code with a compiler plugin, or writing really complicated macro code.
Patches welcome!
Cool, that makes sense—certainly you can't do things at compile time that depend on the contents of the datastore. I was imagining a syntax for Rust clients using Mentat directly that would look similar to the query string an external client would send. So, just like your example.
I'm new enough to Rust and especially Datalog that this might be way off...
But could you not have an intermediate thing, built entirely with pattern-matching macros, that would convert your example into a tree of function calls that evaluated to a query object? And that would fail to compile—because of failed pattern matching and/or the type signatures of function calls—if the syntax was bad, albeit with potentially hard-to-understand errors?
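For what it's worth, here is a minimal sketch of that idea: a toy `q!` built from a pattern-matching macro that expands one fixed query shape into constructor calls, so a query that doesn't fit the pattern fails to compile. Everything here is hypothetical; the `Query` struct and its fields are invented for illustration and are not Mentat's types.

```rust
// Toy sketch, not Mentat's API: a pattern-matching macro that accepts
// one fixed query shape and expands to plain constructor code. Any
// other shape fails to match, so a typo is a compile-time error.
#[derive(Debug, PartialEq)]
struct Query {
    find: String,
    // (entity variable, attribute keyword, value)
    where_clause: (String, String, i64),
}

macro_rules! q {
    ([:find ?$find:ident :where [?$e:ident :$ns:ident/$attr:ident $val:literal]]) => {
        Query {
            find: stringify!($find).to_string(),
            where_clause: (
                stringify!($e).to_string(),
                format!(":{}/{}", stringify!($ns), stringify!($attr)),
                $val,
            ),
        }
    };
}

fn main() {
    let query = q!([:find ?x :where [?x :foo/bar 15]]);
    println!("{:?}", query);
    // q!([:find ?x :wheer [?x :foo/bar 15]]) would fail to compile.
}
```

A real implementation would need many more rules (and recursion) to cover nested clauses, which is where the "really complicated macro code" comes in.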
> But could you not have an intermediate thing, built entirely with pattern-matching macros, that would convert your example into a tree of function calls that evaluated to a query object?
Yes, that's what I mean by "really complicated macro code".
It essentially means implementing the query parser — complete with necessary validation and checking, nesting, EDN syntax including `#inst` and other annotations, sub-parsing of keywords, etc. — in Rust macrology, which I expect is a lot of work.
We already know that it's possible to use arbitrary code via `#![feature(proc_macro)]` — @victorporof did it for rsx — so it's presumably easier to reuse the existing `combine`-based parser than to try to implement the equivalent in regular Rust macros.
It's worth noting that there are only three reasons to do this:

- Compile-time error reporting, as mentioned above.
- Doing the query-processing work ahead of time, but we have `q_prepare` anyway, which does the parsing, algebrizing, and SQL query preparation once, ready for execution multiple times.
- Ergonomics. I don't think there's significant ergonomic improvement in:
```rust
store.q_once(q!([:find ?x :where [?x _ _]])…)
```

versus

```rust
store.q_once("[:find ?x :where [?x _ _]]"…)
```
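To make the second point concrete, here is the prepare-once, execute-many shape in plain Rust; the names and signatures below are invented for illustration and are not Mentat's actual API. The point is only where the parse cost lands.

```rust
// Hypothetical sketch of the prepare-once, execute-many pattern that
// q_prepare enables. Names and signatures are illustrative only.
struct PreparedQuery {
    sql: String, // stands in for the parsed, algebrized, SQL-ready form
}

fn prepare(edn_query: &str) -> PreparedQuery {
    // Imagine the expensive staged pipeline running exactly once here:
    // parse EDN -> algebrize against the schema -> generate SQL.
    PreparedQuery {
        sql: format!("SELECT e FROM datoms /* {} */", edn_query),
    }
}

impl PreparedQuery {
    fn run(&self) -> &str {
        // Each execution reuses the prepared SQL; nothing is re-parsed.
        &self.sql
    }
}

fn main() {
    let prepared = prepare("[:find ?x :where [?x _ _]]");
    for _ in 0..3 {
        println!("{}", prepared.run());
    }
}
```

With this split available at runtime, moving the parse to compile time buys little beyond error reporting.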
I don’t know that much about how things work, but I tend to agree with @rnewman that procedural macros (aka compiler plugins 2.0) don’t seem to add much benefit here in terms of optimizations.
A key improvement might be ergonomics: it would be possible to intermix Rust variable names or even expressions with EDN, and avoid strings for defining queries altogether, à la LINQ. Not sure if that’s desirable.
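A minimal sketch of that LINQ-like intermixing, assuming a hypothetical macro (not Mentat's API) in which `{ expr }` marks a Rust expression spliced into the value position of an EDN-like query form:

```rust
// Hypothetical sketch: an EDN-like query form where `{ expr }` splices
// an arbitrary Rust expression into the value position, so queries are
// written as syntax rather than assembled from strings.
macro_rules! edn_query {
    ([:find ?$find:ident :where [?$e:ident :$ns:ident/$attr:ident { $val:expr }]]) => {
        format!(
            "[:find ?{} :where [?{} :{}/{} {}]]",
            stringify!($find),
            stringify!($e),
            stringify!($ns),
            stringify!($attr),
            $val
        )
    };
}

fn main() {
    let threshold = 10 + 5; // an ordinary Rust expression, typed and checked
    let q = edn_query!([:find ?x :where [?x :foo/bar { threshold }]]);
    println!("{}", q);
}
```

This toy version still bottoms out in a string; the interesting part is that `threshold` is checked by the Rust compiler rather than pasted in by hand.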
Thanks for all the responses to this tangent—I think I'm convinced that reimplementing Datalog in Rust macros might be convenient but probably not worth the effort. :) Looking forward to using Mentat!
> Thanks for all the responses to this tangent—I think I'm convinced that reimplementing Datalog in Rust macros might be convenient but probably not worth the effort. :) Looking forward to using Mentat!
Thanks for your interest! Any information about your project and why you think Mentat might be a good fit would be appreciated.
Thanks for all that! It'll be a while before we can really focus on our own project, so that timeline suits us fine.
We're interested in building tools, and a platform for building those tools, for communication and collaboration. To start with we'd like to replace Slack in our lives, with features to better help a group like our Meetup (https://www.meetup.com/Future-of-Programming/) develop and track ideas. We'd like for these tools to be decentralized, working over whatever connections are available.
I've been following BOOM and its progeny since the BLOOM paper, and have bought into the CALM + Datalog approach enough to want to try it in something real.
Hi there,
I'm excited for this project, and wanted to ask whether it was stable enough to use outside of Mozilla. Is there a roadmap, by chance?
Thanks!