swiftlang / swift-llbuild

A low-level build system, used by Xcode and the Swift Package Manager
Apache License 2.0

[SR-6053] [llbuild] Add support for IPC-based extension of BuildSystem + client library #792

Open ddunbar opened 7 years ago

ddunbar commented 7 years ago
Previous ID SR-6053
Radar None
Original Reporter @ddunbar
Type New Feature
Additional Detail from JIRA

| Field       | Value       |
|-------------|-------------|
| Votes       | 1           |
| Component/s | llbuild     |
| Labels      | New Feature |
| Assignee    | None        |
| Priority    | Medium      |

md5: 57ae6f0cdc4c06e14d2ae85590a1e6b3

Issue Description:

Currently, llbuild is intended to be used in a way where all build tasks are directly exposed to the build engine running within a single process.

This is problematic when working with existing build system designs that depend on running out-of-process build tools. The traditional way we have thought about solving this was to turn each tool individually into a library-based tool (potentially with its own IPC boundary) which we could then interact with more directly. However, this is difficult and disruptive.

As an alternative, we should extend llbuild to expose an IPC protocol to subprocesses (e.g., an environment variable connected to a named pipe or Unix domain socket), and a new library which clients can link against.
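As a minimal sketch of the discovery side of this, a client library could check for an engine-provided environment variable holding the socket path, and fall back to standalone operation when it is absent. The variable name `LLBUILD_IPC_SOCKET` and the function names here are illustrative assumptions, not an existing llbuild interface:

```swift
import Foundation

// Hypothetical discovery scheme: the engine exports the path of its named
// pipe or Unix domain socket in an environment variable before spawning the
// tool. The name "LLBUILD_IPC_SOCKET" is an assumption for illustration.
func engineSocketPath(
    environment: [String: String] = ProcessInfo.processInfo.environment
) -> String? {
    environment["LLBUILD_IPC_SOCKET"]
}

// If the variable is absent, the tool is not running under llbuild and
// should simply do the work itself.
let mode = engineSocketPath() == nil ? "standalone" : "connected"
print("running in \(mode) mode")
```

Taking the environment as a parameter (defaulting to the real process environment) keeps the discovery logic trivially testable.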

This library would expose the following functionality (sketched in the example below):

- registering a callback that can produce values for rules in a given name domain (e.g. `registerRule`), and
- requesting the value of a rule from the engine, computing it only if it is not already cached (e.g. `computeRule`).

Under the covers, this would work by shuttling these requests over the IPC pipe to the active build engine, which would then translate them into native APIs. Similarly, when rule requests appear in the engine, they would be forwarded to any actively running clients that had registered support for that rule domain (we could probably use a simple prefix-matching scheme for routing).
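The prefix-matching routing mentioned above could look something like the following sketch, where the engine keeps a table of registered rule-name prefixes and forwards each incoming rule request to the client that registered the longest matching prefix. The type and method names (`RuleRouter`, `register`, `route`) are hypothetical, not llbuild API:

```swift
// Minimal sketch of prefix-based routing of rule requests to clients.
struct RuleRouter {
    private var handlers: [(prefix: String, clientID: Int)] = []

    // A client registers interest in all rules whose names start with `prefix`.
    mutating func register(prefix: String, clientID: Int) {
        handlers.append((prefix, clientID))
    }

    // Longest-prefix match, so more specific registrations win; returns nil
    // if no client has registered for this rule domain.
    func route(rule: String) -> Int? {
        handlers
            .filter { rule.hasPrefix($0.prefix) }
            .max { $0.prefix.count < $1.prefix.count }?
            .clientID
    }
}
```

Longest-prefix matching is one reasonable policy; a real implementation might instead reject overlapping registrations outright.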

The benefit of this approach is that it is quite easy to integrate with existing tools, in such a way that the tool transparently uses llbuild if available, and if not, simply does the work itself.

For example, consider a tool which does some computation, where one part is easily cached and may be shared with many things in the build:


func foo(name: String, input: Data) {
   let z = computeSomeProperty(input)
   // ... do more work using `z` ...
}

Then, assuming the tool can unambiguously identify this computation (i.e., give it a name), we could write something like:

import llbuild

func initialize() {
  // Register rule callbacks.
  llbuild.registerRule(prefix: "ComputeSomeProperty-") { (rule, engine) in
    // Extract the computation name from the rule key.
    let name = ...
    engine.requiresInput(name) { input in
      let output = computeSomeProperty(input)
      engine.provideOutput(output)
    }
  }
}

func foo(name: String, input: Data) {
  // This will block (and notify the engine we are blocked) until the
  // result is available.
  let output = llbuild.computeRule("ComputeSomeProperty-\(name)")
  // ... do more work using `output` ...
}

When run in a build, the engine will now schedule this task to run only once; after that, the result will be cached (in memory) and stored in the database, and any subsequent requests for the same value will be vended directly. We can make the library convenient enough that if llbuild isn't connected, we just do the work as usual with minimal overhead.
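The caching-plus-fallback behaviour described above can be sketched as follows. The `Engine` protocol, the in-memory cache standing in for the engine's database, and all names here are hypothetical stand-ins for the proposed client library, not actual llbuild API:

```swift
import Foundation

// Stand-in for the engine connection exposed by the proposed client library.
protocol Engine {
    func computeRule(_ name: String, compute: () -> Data) -> Data
}

// In-memory cache standing in for the engine's database-backed store.
final class InMemoryEngine: Engine {
    private var cache: [String: Data] = [:]
    private(set) var computations = 0

    func computeRule(_ name: String, compute: () -> Data) -> Data {
        if let cached = cache[name] { return cached }
        computations += 1
        let value = compute()
        cache[name] = value
        return value
    }
}

// Convenience wrapper: transparently uses the engine if connected;
// otherwise just does the work itself, with minimal overhead.
func computeSomeProperty(name: String, engine: Engine?, compute: () -> Data) -> Data {
    engine?.computeRule("ComputeSomeProperty-\(name)", compute: compute) ?? compute()
}
```

With an engine attached, repeated requests for the same rule run the computation once and vend the cached value thereafter; with `engine: nil`, the tool computes directly.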

ddunbar commented 4 years ago

While it never landed, for posterity I did at one point start working on this here: https://github.com/ddunbar/swift-llbuild/tree/SR-6053

ddunbar commented 4 years ago

This also has some (loose) similarities to the Linda programming language: https://en.wikipedia.org/wiki/Linda_(coordination_language)