brendanzab opened 6 years ago
Might be interesting to look at the Amethyst scripting API proposal for another perspective on what a client embedding Pikelet might want: https://github.com/amethyst/rfcs/pull/1
Datafrog has a nice approach to evaluation, in that it leaves stepping the runtime up to the client of the library. This could be handy for embedding.
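To make the idea concrete, here's a minimal sketch of what a client-stepped runtime could look like. All the names (`Runtime`, `step`, the `fuel` field) are invented for illustration; the point is only that the library exposes a single-step method and the embedding client owns the loop:

```rust
// Hypothetical client-stepped runtime: the library does one unit of
// work per `step` call, and the client decides when to call it.
pub struct Runtime {
    fuel: u32, // stands in for real evaluation state
}

impl Runtime {
    pub fn new(fuel: u32) -> Runtime {
        Runtime { fuel }
    }

    /// Perform one unit of work; return `false` once evaluation is done.
    pub fn step(&mut self) -> bool {
        if self.fuel == 0 {
            return false;
        }
        self.fuel -= 1;
        true
    }
}

fn main() {
    let mut runtime = Runtime::new(3);
    let mut steps = 0;
    // The client owns the loop, so it could interleave other work
    // (rendering a frame, polling IO, ...) between steps.
    while runtime.step() {
        steps += 1;
    }
    assert_eq!(steps, 3);
}
```

A game engine embedding, for example, could budget a fixed number of steps per frame rather than blocking on a full evaluation.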
Also kind of interesting to have a look at the Lua C API for inspiration... 🤔
So I haven't read up on this stuff all that much. But the kind of thing I'd imagine for such a loader API/language server interface is much simpler:
Assert new info:
A definition can be:
Query info:
You can think of a tool supporting this API as making library calls into parts of the compiler. The compiler itself may have a different interface (so it can do more optimisations / compile multiple definitions at once). Things like "typecheck this / compile this" are the know-definition call. Things such as rename-symbol and autoformat are not actually handled here; they're handled by whatever is making these calls, such as an editor. E.g. rename-symbol might do a forget then a know. Editors still work with text files, which are the canonical version of the program; this online version just gets updated to support editor functions / a REPL, and can be thrown away at any time.
Maybe you could add +reload definition+ and handle that by also re-compiling anything that transitively referred to that symbol.
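A rough sketch of this assert/query style of API, under heavy assumptions: the `Driver`, `know`, `forget`, and `type_of` names are all invented here, and "types" are just strings standing in for real elaborated terms:

```rust
use std::collections::HashMap;

// Hypothetical loader driver: assert new info with `know`, retract it
// with `forget`, and query it with `type_of`.
pub struct Driver {
    definitions: HashMap<String, String>, // name -> type (as a string, for the sketch)
}

impl Driver {
    pub fn new() -> Driver {
        Driver { definitions: HashMap::new() }
    }

    /// Assert a definition: a real driver would typecheck it here.
    pub fn know(&mut self, name: &str, ty: &str) {
        self.definitions.insert(name.to_string(), ty.to_string());
    }

    /// Retract a definition.
    pub fn forget(&mut self, name: &str) {
        self.definitions.remove(name);
    }

    /// Query the type of a known definition.
    pub fn type_of(&self, name: &str) -> Option<&String> {
        self.definitions.get(name)
    }
}

fn main() {
    let mut driver = Driver::new();
    driver.know("id", "(A : Type) -> A -> A");
    assert!(driver.type_of("id").is_some());

    // An editor's rename-symbol could be a `forget` followed by a `know`:
    driver.forget("id");
    driver.know("identity", "(A : Type) -> A -> A");
    assert!(driver.type_of("id").is_none());
}
```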
Thanks for sharing your thoughts @PaulBone! Some nice food for thought!
jonathandturner/rhai looks like it has a nice API for this stuff!
One nice thing is how you can register functions:
extern crate rhai;

use rhai::{Engine, RegisterFn};

fn add(x: i64, y: i64) -> i64 {
    x + y
}

fn main() {
    let mut engine = Engine::new();

    engine.register_fn("add", add);

    if let Ok(result) = engine.eval::<i64>("add(40, 2)") {
        println!("Answer: {}", result); // prints 42
    }
}
And types as well:
extern crate rhai;

use rhai::{Engine, RegisterFn};

#[derive(Clone)]
struct TestStruct {
    x: i64,
}

impl TestStruct {
    fn update(&mut self) {
        self.x += 1000;
    }

    fn new() -> TestStruct {
        TestStruct { x: 1 }
    }
}

fn main() {
    let mut engine = Engine::new();

    engine.register_type::<TestStruct>();

    engine.register_fn("update", TestStruct::update);
    engine.register_fn("new_ts", TestStruct::new);

    if let Ok(result) = engine.eval::<TestStruct>("let x = new_ts(); x.update(); x") {
        println!("result: {}", result.x); // prints 1001
    }
}
This could help reduce the current mess we have in pikelet_elaborate::context.
Gluon also has nice embedding and marshalling APIs that might be worth drawing inspiration from.
One issue that comes to mind is how closures might be handled when passing from Rust into Pikelet. Currently we only have an interpreter, so we can actually call closures during normalization. But eventually we'll want to have a JIT and a code generator, so that might not be possible. This would most likely limit our ability to support things like rhai's Engine::register_fn API.
Here's a nice list of embeddable languages that we might be able to get inspiration from (thanks @photex!).
I'm kind of feeling that you might have disjoint concerns when embedding with a JIT vs. compiling to native code: in the former case you want to register Rust types and data/closures with the VM, while in the latter you might want to statically or dynamically link to a Rust or C library. It's kind of tricky to support both. 🤔
Chatted to the peeps on the Cranelift Gitter, and got some nice responses! @sunfishcode says:
For passing closures into JIT'd code, the first version of this will look like: at the machine code level, you pass a pointer to the function in, and call it indirectly, passing in pointers to its data. Pretty low-tech to start with. But there are people working on building Cranelift-based Rust backends, which should open up more options in the future.
They might be able to do some work to make this easier though, which would be neat!
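A rough sketch of the low-tech scheme described above, with all names invented here: the closure's environment travels as an opaque data pointer, alongside a plain `extern "C"` function pointer that JIT'd code could call indirectly:

```rust
use std::os::raw::c_void;

// A plain function pointer that JIT'd code could call indirectly.
// It recovers the closure from the erased data pointer and invokes it.
extern "C" fn call_closure(data: *mut c_void, x: i64) -> i64 {
    let closure = unsafe { &mut *(data as *mut Box<dyn FnMut(i64) -> i64>) };
    closure(x)
}

fn main() {
    let offset = 2;
    let mut boxed: Box<dyn FnMut(i64) -> i64> = Box::new(move |x| x + offset);

    // Erase the closure's type: JIT'd code only ever sees this pointer.
    let data = &mut boxed as *mut Box<dyn FnMut(i64) -> i64> as *mut c_void;

    // Machine code generated by the JIT would perform this indirect
    // call, passing `data` back in as the first argument.
    let result = call_closure(data, 40);
    assert_eq!(result, 42);
}
```

This is essentially the same calling convention that C callback APIs use (a function pointer plus a `void*` user-data pointer), which is why it works without any special support from the code generator.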
Thought I'd point out https://github.com/gluon-lang/gluon/blob/master/src/compiler_pipeline.rs . It is not perfect by any stretch but it has worked out quite well in gluon.
High level overview is that it defines a trait for each compile step:
MacroExpandable
Renameable
MetadataExtractable
InfixReparseable
...
Each trait takes Self as input and outputs its own type on success (MacroValue, Renamed, ...). Then I add two implementations for each trait: one on the previous step's output (impl Renameable for MacroValue), and one as a blanket implementation on the previous trait (impl<T> Renameable for T where T: MacroExpandable) which just calls the previous step (yielding a MacroValue in this example) and then calls the current step on that output.
All in all, this makes it quite easy to run only the compile steps up to step X (so a formatter may only need to parse, while a language server needs to run up to typechecking but no further). It also makes it possible to inject logic between steps and then continue the compilation without worrying about a step being omitted.
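A condensed sketch of this trait-per-step pattern, boiled down to two stages. The stage names loosely follow gluon's compiler_pipeline, but the "compilation" itself is faked with strings here:

```rust
// Output types of each compile step.
pub struct MacroValue(pub String);
pub struct Renamed(pub String);

// One trait per compile step.
pub trait MacroExpandable {
    fn expand_macro(self) -> MacroValue;
}

pub trait Renameable {
    fn rename(self) -> Renamed;
}

// The raw source can be macro-expanded...
impl MacroExpandable for &str {
    fn expand_macro(self) -> MacroValue {
        MacroValue(format!("expanded({})", self))
    }
}

// ...the previous step's output can be renamed directly...
impl Renameable for MacroValue {
    fn rename(self) -> Renamed {
        Renamed(format!("renamed({})", self.0))
    }
}

// ...and a blanket impl runs any earlier stage straight through,
// so no step can be accidentally skipped.
impl<T: MacroExpandable> Renameable for T {
    fn rename(self) -> Renamed {
        self.expand_macro().rename()
    }
}

fn main() {
    // A formatter might stop early:
    let expanded = "source".expand_macro();
    assert_eq!(expanded.0, "expanded(source)");

    // A language server can start from raw source and get both steps:
    let renamed = "source".rename();
    assert_eq!(renamed.0, "renamed(expanded(source))");
}
```

The blanket impl is what lets a caller ask for a late stage without knowing (or caring) how many earlier stages have to run first.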
Oh nice! I like this! This is cool too:
pub type SalvageResult<T> = Result<T, (Option<T>, Error)>;
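The appeal of SalvageResult is that a failed step can still hand back a partial value alongside the error, so later stages can keep going. A minimal sketch, with a hypothetical `parse` function and a plain string standing in for gluon's error type:

```rust
// On failure we may still salvage a partial `T` alongside the error.
pub type SalvageResult<T> = Result<T, (Option<T>, String)>;

// Hypothetical "parser": splits on whitespace, and treats `?` as a
// parse error while salvaging the tokens it already collected.
fn parse(src: &str) -> SalvageResult<Vec<String>> {
    let words: Vec<String> = src.split_whitespace().map(String::from).collect();
    if src.contains('?') {
        Err((Some(words), "unexpected `?`".to_string()))
    } else {
        Ok(words)
    }
}

fn main() {
    match parse("let x ?") {
        Err((Some(partial), err)) => {
            // A language server could still offer completions from `partial`.
            assert_eq!(partial.len(), 3);
            println!("recovered {} tokens despite: {}", partial.len(), err);
        }
        _ => unreachable!(),
    }
}
```

This matters for editor tooling in particular: source code spends most of its time in a broken state while being typed, so every stage needs to degrade gracefully rather than bail out.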
My brain is running round in circles trying to design this in a vacuum, so I thought I'd sketch out some high level thoughts on this stuff. There are a bunch of interlocking concerns, which makes it a little hard to figure out how to make any headway on it.
Currently our loader/driver API lives in pikelet-driver, but it leaves a lot to be desired. Ultimately we want a Rust API that maintains some incrementally accumulated state, and has functions that give a nice way to:

The Pikelet loader API would probably be consumed by the following clients:
Import paths may be:
Imports form a DAG, and paths need to be followed in topological order. We will want to be able to listen to the file system for updates, and incrementally update as needed.
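The topological ordering could be as simple as a depth-first walk over the import table. A hedged sketch, where the module names and the import table are invented, cycles are assumed absent (the imports form a DAG, as above), and a real loader would read imports from source files:

```rust
use std::collections::HashMap;

// Depth-first post-order walk: every module is emitted after all of
// its (transitive) imports. Assumes the import graph is a DAG.
fn topo_sort(imports: &HashMap<&str, Vec<&str>>, root: &str) -> Vec<String> {
    fn visit(name: &str, imports: &HashMap<&str, Vec<&str>>, done: &mut Vec<String>) {
        if done.iter().any(|d| d == name) {
            return; // already loaded
        }
        for dep in imports.get(name).into_iter().flatten() {
            visit(dep, imports, done);
        }
        done.push(name.to_string());
    }
    let mut done = Vec::new();
    visit(root, imports, &mut done);
    done
}

fn main() {
    let mut imports = HashMap::new();
    imports.insert("main", vec!["list", "nat"]);
    imports.insert("list", vec!["nat"]);
    imports.insert("nat", vec![]);

    // Dependencies come before dependents, so `nat` is loaded first
    // and only once, even though two modules import it.
    assert_eq!(topo_sort(&imports, "main"), ["nat", "list", "main"]);
}
```

For incremental updates, the same walk run in reverse (dependents of a changed file) would give the set of modules to re-check when the file system reports a change.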
We probably want to avoid baking in a heavy compiler back-end (like LLVM) at this level, although I also wouldn't rule out including a JIT (like Cranelift) for evaluating expressions at compile time.