aurae-runtime / aurae

Distributed systems runtime daemon written in Rust.
https://aurae.io
Apache License 2.0

Integration tests for auraescript examples #365

Closed JeroenSoeters closed 1 year ago

JeroenSoeters commented 1 year ago

It would be nice if we had a test harness around the auraescript examples we ship. What if we added comments with the expected output to the example TypeScript files, and wrote a test runner that parses those comments, builds an array of expected JSON output, and compares it against the actual output of the script? Like this:

let cells = new runtime.CellServiceClient();
const cellName = "ae-sleeper-cell";
const nestedCellName = "ae-sleeper-cell/nested-sleeper";

// [ Allocate ]
let allocated = await cells.allocate(<runtime.CellServiceAllocateRequest>{
    cell: runtime.Cell.fromPartial({
        name: cellName,
        cpu: runtime.CpuController.fromPartial({
            weight: 2, // Percentage of CPUs
            max: 400 * (10 ** 3), // 0.4 seconds in microseconds
        }),
    })
});
helpers.print(allocated)

// output
// {
//      "cellName": "ae-sleeper-cell",
//      "v2": true
// }

// [ Start ]
let started = await cells.start(<runtime.CellServiceStartRequest>{
    cellName,
    executable: runtime.Executable.fromPartial({
        command: "/usr/bin/sleep 42",
        description: "Sleep for 42 seconds",
        name: "sleep-42"
    })
});
helpers.print(started)

//  output
//  {
//    "pid": "<pid>"
//  }

With `"key": "value"` we mean: assert that the key exists in the output and that the values are equal. With `"key": "<value>"` we indicate that we expect the key to be present with any value. We could create a `...` pattern for streams when we support them:

// output
// [
//     {
//        "some": "stream"
//     },
//     ...   
// ] 
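The matching rules above could be sketched as a small helper. Everything here (the `Json` type and `matchesExpectation` function) is an illustrative name, not part of aurae; the `<...>` wildcard semantics are taken from the proposal above:

```typescript
// Hypothetical matcher for the comment-based expectations described above.
type Json = { [key: string]: unknown };

// An expected value of the form "<anything>" means: the key must exist,
// but any actual value is acceptable. Otherwise values must match exactly.
function matchesExpectation(expected: Json, actual: Json): boolean {
  for (const [key, value] of Object.entries(expected)) {
    if (!(key in actual)) return false;                              // key must be present
    if (typeof value === "string" && /^<.*>$/.test(value)) continue; // wildcard value
    if (actual[key] !== value) return false;                         // exact match otherwise
  }
  return true;
}
```

With this, the Start output above would be checked as `matchesExpectation({ pid: "<pid>" }, actual)`, passing for any pid.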
dmah42 commented 1 year ago

i'm not convinced we need to include the expected output in the script. what i would expect is a framework that allows us to:

  1. start the auraed daemon
  2. run a typescript script, capturing the output
  3. check the output against expectations

given how broadly used typescript is, i'm a bit surprised there isn't already a framework for this...
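The three steps above could be sketched as a small Node-based harness. The binary names, flags, and invocation below are assumptions for illustration, not the project's actual tooling; the comparison step is kept as a separate pure function so it can be tested on its own:

```typescript
// Hypothetical harness sketch: start auraed, run an example, diff the output.
import { spawn, execFileSync } from "node:child_process";

// 3. check the output against expectations (pure, unit-testable step)
export function outputMatches(actual: string, expected: string): boolean {
  return actual.trim() === expected.trim();
}

export function runExample(script: string, expected: string): boolean {
  // 1. start the auraed daemon in the background (invocation is illustrative)
  const daemon = spawn("auraed", [], { stdio: "ignore" });
  try {
    // 2. run the typescript example, capturing its stdout
    const actual = execFileSync("auraescript", [script], { encoding: "utf8" });
    return outputMatches(actual, expected);
  } finally {
    daemon.kill();
  }
}
```

A driver could then glob the examples folder and call `runExample` for each file.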

JeroenSoeters commented 1 year ago

I should probably have been clearer: what I am proposing is exactly that :) It's just a matter of where we want the expectations to live. Here I'm inlining the expectations with the example scripts, which has the advantage that you only have to edit a single file, which serves as both the example and the test. We could automatically run the framework against every file in the examples folder. Also, if you're just browsing the repo, you don't have to run the examples to see what output they produce.

OTOH it might be non-obvious that this is happening, and one could argue that the inlined output distracts from the actual example.

I don't really have a strong preference for one or the other.

future-highway commented 1 year ago

95% sure this is the crate I've used in the past, that may be helpful: https://crates.io/crates/assert-json-diff

JeroenSoeters commented 1 year ago

I think it would be good to define what we actually want in terms of test coverage:

  1. ensure every API endpoint works correctly from Auraescript
  2. ensure all the scripts execute correctly

For (1), it would be better to have a regular test suite, one test per endpoint. For (2), we could consider the above-mentioned approach with comments in the scripts.

dmah42 commented 1 year ago

I'm mostly interested in 1 to catch regressions. 2 is fine but if we have 1 it should be low priority.

I'm also worried that if we have 2 someone will suggest that all our tests should be written in typescript and that will be a whole Thing.

JeroenSoeters commented 1 year ago

In that case, should we write a set of end-to-end tests against the aurae-client? I feel that would cover most of the concerns. The "deno ops" (or whatever these things are called) exercise the aurae-client anyway, and everything up to that point is auto-generated, so it should be pretty stable.

JeroenSoeters commented 1 year ago

It would also be a more natural home for the "sort-of-end-to-end" tests I have been writing and sticking in the cells and observe gRPC service code up until now.

JeroenSoeters commented 1 year ago

And it would help us drive this sort of stuff from a test: https://github.com/aurae-runtime/aurae/pull/386

JeroenSoeters commented 1 year ago

I have started to take a stab at this as part of this PR, as it is becoming increasingly difficult to test scenarios that involve several different APIs. If this looks good, I will move the single test from cell_service (test_list) over to this crate as well and introduce some helpers/builders to make these tests easier to write and read. We also need to start/stop auraed in the background for this test.

JeroenSoeters commented 1 year ago

Since we have an established pattern now for these integration tests I'm going to close this issue.