Closed: dmichon-msft closed this issue 1 year ago.
Very nice proposal.
Will phases be able to be priority ranked?
Default prioritization would match the algorithm used by Rush, i.e.: 1) Sort by longest chain of dependent operations, descending 2) Then by count of immediate dependent operations, descending However, that could be extended with additional hooks.
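For illustration, that default prioritization could be expressed as a comparator over operations. This is a sketch only; the `IOperation` shape and its `criticalPathLength`/`directDependentCount` fields are invented names, assuming those values have been precomputed from the dependency graph:

```typescript
// Hypothetical operation shape; the field names are assumptions for this sketch.
interface IOperation {
  name: string;
  criticalPathLength: number; // longest chain of dependent operations
  directDependentCount: number; // count of immediate dependent operations
}

// Sort descending by critical path length, then descending by immediate dependents.
function byDefaultPriority(a: IOperation, b: IOperation): number {
  if (a.criticalPathLength !== b.criticalPathLength) {
    return b.criticalPathLength - a.criticalPathLength;
  }
  return b.directDependentCount - a.directDependentCount;
}

const queue: IOperation[] = [
  { name: 'lint', criticalPathLength: 1, directDependentCount: 0 },
  { name: 'compile', criticalPathLength: 3, directDependentCount: 2 },
  { name: 'sass', criticalPathLength: 3, directDependentCount: 1 }
];
queue.sort(byDefaultPriority);
// "compile" sorts first: same chain length as "sass", but more immediate dependents.
```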
We did a design review recently on this topic, building off the original ideas from @dmichon-msft. The changes are pretty involved and would require re-writing all Heft plugins, but it does lead to Heft being a fully generalized, developer-defined set of stages that constitute a Heft action.
The `heft.json` file becomes the place where the definitions of Heft actions and stages live. This largely re-uses the existing schema for the definition of stages, while introducing the concept of developer-defined (rather than hardcoded, Heft-defined) actions.
```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/heft/heft.schema.json",
  "extends": "base-project/config/heft.json",
  "actions": {
    "build": {
      "stages": [
        "sass",
        "compile"
      ],
      "actionPlugins": [
        {
          "packageName": "@rushstack/my-action-plugin"
        }
      ]
    }
  },
  "stages": {
    "sass": {
      "stagePlugins": [
        {
          "packageName": "@rushstack/heft-sass-plugin",
          "options": { ... }
        }
      ]
    },
    "compile": {
      "dependsOn": [
        "sass"
      ],
      "stageEvents": [
        {
          "eventKind": "copyFiles",
          "eventHook": "run",
          "eventId": "copyImages",
          "copyOperations": [ ... ]
        }
      ],
      "stagePlugins": [
        {
          "packageName": "@rushstack/heft-typescript-plugin",
          "pluginName": "TypescriptPlugin"
        }
      ]
    }
  }
}
```
One notable schema difference, applicable to both `actions` and `stages`, is that the property is now a keyed object rather than an array of objects. This allows for clearer and more controllable merging. We currently use `@rushstack/heft-config-file` to provide config file `extends` functionality. While this works, it is limited in that it currently defaults to two merge behaviors: keyed objects override, and arrays append.
With Heft becoming completely developer-extensible, we will need more advanced merge behaviors so that a config file can modify previously defined actions/stages, e.g. to add stages or plugins. This could be done with inline markup properties that define merge behavior. For the example below, assume that we are extending a file that previously defined "property1" as a keyed object and "property2" as an array:
```json
{
  "$schema": "...",
  "$extends": "...",
  "$property1.mergeBehavior": "override | merge",
  "property1": {
    "$subProperty1.mergeBehavior": "override | merge",
    "subProperty1": { ... },
    "$subProperty2.mergeBehavior": "override | append",
    "subProperty2": [ ... ]
  },
  "$property2.mergeBehavior": "override | append",
  "property2": [ ... ]
}
```
The default `mergeBehavior` for keyed objects (root level or otherwise) will continue to be `override` to maintain compatibility with existing behavior; however, the depth of merges can easily be tweaked by chaining the `merge` behavior into sub-properties. Additionally, array objects (root level or otherwise) will continue to `append` by default, but can now be set to `override` to completely replace the existing array, something that was not previously possible. Once an object is set to a `mergeBehavior` of `override`, all sub-property `mergeBehavior` values will be ignored, since the top-most object already overrides all sub-properties.

One thing to note is that different `mergeBehavior` verbs are used for merging keyed objects versus arrays. This makes it explicit that arrays are appended as-is, with no additional processing (e.g. deduping if the array is intended to be a set) during the merge. If such behavior is required, it can be implemented on the consumer side. Deduping arrays within the `@rushstack/heft-config-file` package doesn't quite make sense, since deduplication of non-primitive objects is not easily defined.
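To make the proposed markup concrete, here is a minimal sketch of how a loader might honor `$<key>.mergeBehavior` annotations when merging a derived config over a base config. This is illustrative only, assuming the defaults described above (keyed objects override, arrays append); it is not the actual `@rushstack/heft-config-file` implementation:

```typescript
type JsonObject = { [key: string]: unknown };

function isObject(value: unknown): value is JsonObject {
  return typeof value === 'object' && value !== null && !Array.isArray(value);
}

// Merge "child" over "base", honoring "$<key>.mergeBehavior" annotations.
// Defaults: keyed objects override; arrays append.
function mergeConfig(base: JsonObject, child: JsonObject): JsonObject {
  const result: JsonObject = { ...base };
  for (const [key, childValue] of Object.entries(child)) {
    if (key.startsWith('$')) continue; // annotations are markup, not data
    const behavior = child[`$${key}.mergeBehavior`];
    const baseValue = base[key];
    if (Array.isArray(childValue) && Array.isArray(baseValue)) {
      result[key] = behavior === 'override' ? childValue : [...baseValue, ...childValue];
    } else if (isObject(childValue) && isObject(baseValue) && behavior === 'merge') {
      // "merge" recurses; nested annotations control deeper levels.
      result[key] = mergeConfig(baseValue, childValue);
    } else {
      result[key] = childValue; // default for keyed objects: override
    }
  }
  return result;
}
```

Note how a `merge` annotation only affects one level; deeper levels revert to the defaults unless they carry their own annotations, matching the "chaining" behavior described above.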
Heft actions define what is run when calling `heft <action>`.
"actions": {
"build": {
"stages": [
"sass",
"compile"
],
"actionPlugins": [
{
"packageName": "@rushstack/my-action-plugin"
}
]
}
},
Heft `actions` specify all stages that are run as part of executing the action. The order in which these stages execute is determined by the dependency graph generated from the defined `stages` property (similar to how Rush phases define dependency order). Because Heft actions are broken down into stages, this enables scoped Heft executions: for example, `heft build --to compile` would load and run only the stages contained within that scope.
Actions also define `actionPlugins`, which have access to a set of action-scoped hooks. If no `pluginName` is provided for an `actionPlugin` and the plugin package defines only one action plugin, Heft will default to loading that plugin. Otherwise, Heft will throw, since it is unable to determine which plugin should be used.
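The described resolution rule can be sketched as a small helper. The manifest entry shape below is a simplified assumption for illustration, not the real Heft API:

```typescript
// Hypothetical manifest entry; the shape is an assumption for this sketch.
interface IPluginDefinition {
  pluginName: string;
}

// Resolve which plugin to load from a package, per the rule described above:
// an explicit pluginName wins; otherwise a package with exactly one plugin
// is unambiguous; anything else is an error.
function resolvePlugin(
  packageName: string,
  available: IPluginDefinition[],
  requestedName?: string
): IPluginDefinition {
  if (requestedName) {
    const match = available.find((p) => p.pluginName === requestedName);
    if (!match) {
      throw new Error(`Plugin "${requestedName}" not found in "${packageName}"`);
    }
    return match;
  }
  if (available.length === 1) {
    return available[0]; // default to the only defined plugin
  }
  throw new Error(
    `"${packageName}" defines ${available.length} plugins; a pluginName must be specified`
  );
}
```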
Hooks that `actionPlugins` have access to would include:
Metrics must be handled at the action level, since stages will be isolated and not maintain context between stage executions.
Heft stages are the building blocks of Heft actions, and define what gets executed when a stage is run, and in what order.
"stages": {
"sass": {
"stagePlugins": [
{
"packageName": "@rushstack/heft-sass-plugin"
"options": { ... }
}
]
},
"compile": {
"dependsOn": [
"sass"
],
"stageEvents": [
{
"eventKind": "copyFiles",
"eventHook": "run",
"eventId": "copyImages",
"copyOperations": [ … ]
}
],
"stagePlugins": [
{
"packageName": "@rushstack/heft-typescript-plugin",
"pluginName": "TypescriptPlugin"
}
]
}
}
These stages contain most of what heft.json originally contained. Each stage is isolated and can only access hooks and plugins from within the stage that defines them. Hooks available on a stage would be:
Interaction between plugins within a stage, such as the use of plugin-provided sub-stage hooks, would now be done via the existing Heft plugin accessor API. For example:

Plugin01.ts
```typescript
import { SyncHook } from 'tapable';

export interface IPlugin01Accessor {
  hooks: {
    afterRun: SyncHook<string>;
  };
}

export const PluginName: string = 'Plugin01';

export class Plugin01 implements IHeftPlugin {
  public pluginName: string = PluginName;
  private _accessor: IPlugin01Accessor;

  public get accessor(): IPlugin01Accessor {
    return this._accessor;
  }

  public apply(session: HeftSession, configuration: HeftConfiguration): void {
    this._accessor = {
      hooks: {
        afterRun: new SyncHook<string>()
      }
    };
    session.hooks.run.tapPromise(
      this.pluginName,
      async (cxt: IStageContext) => {
        this.accessor.hooks.afterRun.call('some value');
      }
    );
  }
}

export default new Plugin01();
```
Plugin02.ts
```typescript
import { IPlugin01Accessor, PluginName as Plugin01Name } from 'plugin01';

export const PluginName: string = 'Plugin02';

export class Plugin02 implements IHeftPlugin {
  public pluginName: string = PluginName;

  public apply(session: HeftSession, configuration: HeftConfiguration): void {
    // NOTE: It is best to place all implementation within the hooks that
    // are provided by the session or by the plugin accessors. The session is
    // not guaranteed to be fresh in the case of IPC, thus 'apply' may not have
    // been called between multiple invocations.
    session.requestAccessToPluginByName(
      Plugin01Name,
      (accessor: IPlugin01Accessor) => {
        accessor.hooks.afterRun.tap(this.pluginName, (result: string) => {
          // Writes 'some value'
          console.log(result);
        });
      }
    );
  }
}

export default new Plugin02();
```
The use of this style of sub-stage hooks allows us to eliminate the arbitrary stages that Heft currently defines for specific use cases (e.g. `afterConfigureWebpack` in the current Heft build stage).
Heft events are a way to access built-in Heft utility plugins during stage execution.
"stageEvents": [
{
"eventKind": "copyFiles",
"eventHook": "run",
"eventId": "copyImages",
"copyOperations": [ … ]
}
]
`eventActions` have been renamed to `stageEvents` to represent that event execution now belongs to the stage. Likewise, the properties have been renamed to relate to the event itself. Additionally, the event is now triggered during execution of the stage hook specified by the `eventHook` property. This allows developers to choose any of the existing stage hooks to execute the event, rather than the arbitrary collection of stages that could previously be hooked into.
The new heft-plugin.json file will be a required manifest file specified at the root of all external plugin packages.
```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/heft/heft-plugin.schema.json",
  "actionPlugins": [
    {
      "pluginName": "MyActionPlugin",
      "optionsSchema": "../path/to/schema2.json",
      "parameters": [
        {
          "parameterKind": "string",
          "longName": "--my-string",
          "description": "…",
          "argumentName": "ARG_NAME",
          "required": false
        }
      ]
    }
  ],
  "stagePlugins": [
    {
      "pluginName": "MyStagePlugin",
      "optionsSchema": "../path/to/schema1.json",
      "parameters": [
        {
          "parameterKind": "string",
          "longName": "--my-other-string",
          "description": "…",
          "argumentName": "ARG_NAME",
          "required": false
        }
      ]
    }
  ]
}
```
This file provides metadata about the plugins contained within the package. One of the main benefits is that it allows multiple plugins per package, including multiple types of plugins per package. It also allows Heft to obtain plugin-specific information, such as CLI parameters, without needing to `require` and `apply()` the entire plugin. This makes it easy to implement a low-overhead way to provide CLI auto-complete for Heft actions and to obtain `--help` information quickly. An additional benefit of this format is that parameters are defined with the exact same spec that Rush uses for Rush command parameters, making it easy and intuitive for developers who are already familiar with Rush. Lastly, options schemas can now be provided via this metadata file, which allows the `options` passed into a plugin in `heft.json` to be validated, rather than leaving validation up to the plugin.
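As a sketch of the low-overhead parameter discovery this enables, a tool could read only the manifest and collect CLI parameter definitions without loading any plugin code. The interfaces below mirror the manifest example above; the helper function itself is invented for illustration:

```typescript
// Mirrors the "parameters" entries in the heft-plugin.json example above.
interface IParameterDefinition {
  parameterKind: string;
  longName: string;
  description: string;
  argumentName?: string;
  required: boolean;
}

interface IPluginManifestEntry {
  pluginName: string;
  parameters?: IParameterDefinition[];
}

// Collect every parameter declared by every plugin in a manifest, keyed by
// long name, without require()-ing or apply()-ing any plugin implementation.
function collectCliParameters(
  entries: IPluginManifestEntry[]
): Map<string, IParameterDefinition> {
  const byLongName = new Map<string, IParameterDefinition>();
  for (const entry of entries) {
    for (const parameter of entry.parameters ?? []) {
      byLongName.set(parameter.longName, parameter);
    }
  }
  return byLongName;
}
```

Because only JSON is read, this kind of scan stays cheap enough to run on every invocation for auto-complete and `--help` generation.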
What are people's thoughts on this? It is admittedly an extensive redesign of how Heft would work, but I think it's truer to the original idea of a general-purpose, multi-staged build tool that we had originally envisioned. Plus, it provides a significant number of benefits and simplifications for developers who design Heft plugins. It's hard to condense the thought process that led to these design decisions while keeping this easily readable, so I've probably left a good chunk of it out. If you have any questions, of course, let me know!
We will want to provide a command-line tool to migrate the old `heft.json` format to a functionally equivalent configuration in the new `heft.json`. This might be done by defining the original set of default stages and mapping known plugins to the stages where they operate (e.g. webpack runs in a stage called "bundle", etc.).
After some further discussion, the design has morphed a bit; it now simplifies terminology and aligns more closely with Rush phases. I wasn't quite sure how best to lay out this comment, so you may have to bounce around the subheadings, but hopefully it gets the idea across well enough!
The heft.json file is where phases, tasks, plugins, and Heft events are defined, including dependencies between phases and tasks. The `heftPlugins` field can be used to apply lifecycle plugins to Heft execution, or to apply task plugins that provide a task mapping in "heft-plugin.json". The `phasesByName` field can be used to more directly apply task plugins to a specific phase and task within an execution of Heft. Additionally, since plugin packages can now contain multiple plugins (described in "heft-plugin.json"), ambiguity is possible: if no `pluginName` is provided for a Heft plugin specified in "heft.json" and the plugin package defines only one plugin, Heft will default to loading that plugin. Otherwise, Heft will throw, since it is unable to determine which plugin should be used.
Simple plugin specification will use a task-plugin-provided mapping onto the rig used by the build. This mapping would be provided in `heft-plugin.json`, which is supplied by the plugin. By default, all plugins provided by Rushstack will include a mapping intended to match the stock Heft rigs (`@rushstack/heft-node-rig` and `@rushstack/heft-web-rig`); however, the plugin-provided mapping is not strictly bound to these rigs. As long as all specified `phaseDependencies` and `taskDependencies` exist in the rig, the plugin-provided functionality will be slotted into the build graph. For this reason, the `phasesByName` field will be assembled before any of the plugins specified in `heftPlugins` are mapped onto it.
```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/heft/heft.schema.json",
  "extends": "base-project/config/heft.json",
  // "heftPlugins" can accept both task plugins and lifecycle plugins
  "heftPlugins": [
    {
      "pluginPackage": "@rushstack/heft-typescript-plugin"
    },
    {
      "pluginPackage": "@rushstack/heft-lint-plugin",
      "pluginName": "eslint"
    },
    {
      "pluginPackage": "@rushstack/heft-metrics-reporter"
    }
  ]
}
```
If your Heft configuration requires more customized or in-depth tweaking, advanced plugin specification can be used to create a graph of every task plugin used by your build, while still using the `heftPlugins` array for lifecycle plugins.
```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/heft/heft.schema.json",
  "extends": "base-project/config/heft.json",
  // "heftPlugins" can be used alongside "phasesByName", and are still the correct location to
  // specify lifecycle plugins
  "heftPlugins": [
    {
      "pluginPackage": "@rushstack/heft-metrics-reporter"
    }
  ],
  "phasesByName": {
    "generate-typings": {
      "tasksByName": {
        "generate-sass-typings": {
          "taskPlugin": {
            "pluginPackage": "@rushstack/heft-sass-plugin"
          }
        },
        "generate-loc-typings": {
          "taskPlugin": {
            "pluginPackage": "heft-loc-plugin"
          }
        }
      }
    },
    "verify-generate-docs": {
      "phaseDependencies": [
        "generate-typings"
      ],
      "tasksByName": {
        "typecheck": {
          "taskPlugin": {
            "pluginPackage": "@rushstack/heft-typescript-plugin",
            "options": {
              "mode": "typecheckAndEmitDts"
            }
          }
        },
        "lint": {
          "asyncTaskDependencies": [
            "typecheck"
          ],
          "taskPlugin": {
            "pluginPackage": "@rushstack/heft-lint-plugin",
            "pluginName": "eslint"
          }
        },
        "generate-docs": {
          "asyncTaskDependencies": [
            "typecheck"
          ],
          "taskPlugin": {
            "pluginPackage": "@rushstack/heft-api-extractor-plugin"
          }
        }
      }
    },
    "compile": {
      "tasksByName": {
        "emit": {
          "taskPlugin": {
            "pluginPackage": "@rushstack/heft-typescript-plugin",
            "options": {
              "mode": "transpile"
            }
          }
        },
        "copy-assets": {
          "taskEvent": {
            "eventKind": "copyFiles",
            "copyOperations": [
              {
                "sourceFolder": "src/assets",
                "destinationFolders": [
                  "dist/assets"
                ]
              }
            ]
          }
        }
      }
    },
    "bundle": {
      "phaseDependencies": [
        "compile",
        "generate-typings"
      ],
      "tasksByName": {
        "webpack": {
          "taskPlugin": {
            "pluginPackage": "@rushstack/heft-webpack5-plugin"
          }
        }
      }
    },
    "test": {
      "phaseDependencies": [
        "compile"
      ],
      "tasksByName": {
        "jest": {
          "taskPlugin": {
            "pluginPackage": "@rushstack/heft-jest-plugin"
          }
        }
      }
    }
  }
}
```
The `phasesByName` field uses the phase name as the key, and the same is true of `tasksByName`. This allows for clearer specification and more controllable merging. Since we use `@rushstack/heft-config-file` to provide config file `extends` functionality, we can use explicitly defined `inheritanceType` overrides, which were added in PR #3276.
As described above, Heft will now have two separate concepts for plugins: lifecycle plugins and task plugins. These two types of plugins provide different functionality in different contexts, as described below.
Heft lifecycle plugins provide the implementation for certain lifecycle-related hooks. These plugins apply across all Heft phases, and as such should rarely be needed outside of a few specific cases (such as metrics reporting). Additionally, Heft will throw if a lifecycle plugin is loaded in the context of a task plugin (described below). Heft lifecycle plugins provide an `apply` method, where plugins can tap into the following hooks:

- `toolStart`
- `toolStop`
- `recordMetrics`

When Heft spins up, after the lifecycle plugins are loaded, the `toolStart` hook is called. Metrics must be handled at the lifecycle level, since tasks are isolated and do not maintain a shared context between task executions; metrics data reported within a task is passed back to Heft after the task executes. At the end of Heft execution, before teardown, Heft calls the `toolStop` hook. It is here that, for example, collected metrics data would be flushed before Heft exits.
Heft task plugins provide the implementation for Heft tasks. Task plugins provide an `apply` method, where plugins can tap into the following Tapable hooks:

- `clean`
- `run`

The `clean` hook provides a way to clean up files that your plugin creates, and it is the first hook executed by Heft. The clean implementation should target the created files as closely as possible, since "clean" will run after other tasks that may also generate files.

The `run` hook is where most plugin functionality is provided. It can also return an `ITaskResult` object, which can provide information such as `startTime` and `endTime` overrides. These are useful for plugins that use hooks from other plugins, and thus have a different execution timeline than what can be measured within the `run` hook itself.
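Putting the two hooks together, a minimal task plugin might look like the following sketch. The session and result interfaces below are simplified stand-ins assumed for illustration, not the final proposed API:

```typescript
// Simplified stand-ins for the proposed session API; these shapes are
// assumptions for illustration, not the real Heft typings.
interface ITaskResult {
  startTime?: number;
  endTime?: number;
}
type TapFn = (fn: () => Promise<void | ITaskResult>) => void;
interface ITaskSession {
  hooks: {
    clean: { tapPromise: TapFn };
    run: { tapPromise: TapFn };
  };
}

class ExampleTaskPlugin {
  public readonly createdFiles: string[] = [];
  public cleaned: string[] = [];

  public apply(session: ITaskSession): void {
    // "clean" targets only the files this plugin creates, since it may run in
    // a build where other tasks also generate output.
    session.hooks.clean.tapPromise(async () => {
      this.cleaned = [...this.createdFiles];
      this.createdFiles.length = 0;
    });
    // "run" does the actual work, optionally returning timing overrides.
    session.hooks.run.tapPromise(async () => {
      const startTime = Date.now();
      this.createdFiles.push('lib/output.js'); // hypothetical emitted file
      return { startTime, endTime: Date.now() };
    });
  }
}
```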
Plugins can use the `requestAccessToPluginByName` API to access hooks or data provided by other plugins declared as "async dependencies" (more on that in the next section).
The new heft-plugin.json file will be a required manifest file specified at the root of all external plugin packages.
This file provides us metadata about the plugins contained within the package. This allows packages to provide multiple plugins, including multiple types of plugins. It also allows Heft to obtain plugin-specific information, such as CLI parameters, without needing to require and apply() the entire plugin. This makes it easy to implement a low-overhead way to provide CLI auto-complete for Heft actions, and obtain --help information quickly. One additional benefit of this format is that it allows us to use the exact same spec to define parameters as how Rush defines parameters for Rush commands, making it easy and intuitive for developers who are already familiar with Rush.
Options schemas can be provided via this metadata file, which allows for validating the options passed into a plugin in "heft.json", rather than leaving it up to the plugin to validate.
Task plugins can also provide a `defaultPhaseMap`, which is used when a task plugin is specified in the `heftPlugins` field in "heft.json". This allows plugins to map onto an existing phase and create their own custom tasks, with optional dependencies on existing tasks. For now, this is limited to mapping onto an existing phase specified within "heft.json" and taking dependencies on existing tasks, though it could hypothetically be expanded in the future to allow ad-hoc creation of custom phases and dependencies on custom tasks. This field is ignored when the plugin is used directly via `phasesByName`.
```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/heft/heft-plugin.schema.json",
  "lifecyclePlugins": [
    {
      "pluginName": "MyLifecyclePlugin",
      "optionsSchema": "./path/to/schema2.json",
      "parameters": [
        {
          "parameterKind": "string",
          "longName": "--my-string",
          "description": "…",
          "argumentName": "ARG_NAME",
          "required": false
        }
      ]
    }
  ],
  "taskPlugins": [
    {
      "pluginName": "MyTaskPlugin",
      "optionsSchema": "./path/to/schema1.json",
      "parameters": [
        {
          "parameterKind": "string",
          "longName": "--my-other-string",
          "description": "…",
          "argumentName": "ARG_NAME",
          "required": false
        }
      ],
      "defaultPhaseMap": {
        "verify-generate-docs": {
          "tasksByName": {
            "my-custom-task": {
              "asyncTaskDependencies": [
                "typecheck"
              ]
            }
          }
        }
      }
    }
  ]
}
```
Heft phases are collections of tasks. The order in which the tasks run is determined by both the task dependency graph and the phase dependency graph. Heft phases act as a logical grouping of tasks that would reasonably (but not necessarily) map to a Rush phase. Once all tasks within a phase finish executing, dependent phases can begin running.
Using expansion logic similar to Rush's, a scoped set of Heft phases can be executed through the `heft run` action; for example, `heft run --only <phaseName>`. Similarly, running all phases up to and including a specified phase can be done with `heft run --to <phaseName>`, which is equivalent to running `heft <phaseName>`. Note that `heft <phaseName>` implicitly expands all dependency phases into the execution, since this is the main use case for a developer. For example, a developer iterating on tests will likely want a build to run before the test run, to ensure that the Jest plugin operates on the latest output. In this way, `heft test` would implicitly run the `compile` phase as well as the `test` phase (assuming the `test` phase depends on a `compile` phase).
Heft tasks are the smallest unit of work specified in "heft.json". Tasks can either implement a single plugin, or a single Heft event.
Heft tasks can take dependencies on other tasks within the same phase, and all task dependencies must complete execution before dependent tasks can run. This defines the order of execution for the tasks within a phase.
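The ordering rule can be sketched as a simple topological schedule over task dependencies. This is illustrative only; the graph representation is an assumption, not Heft's internal data structure:

```typescript
// Each task lists the tasks (within the same phase) that must finish first.
type TaskGraph = Map<string, string[]>; // task -> dependencies

// Produce an execution order in which every task appears after all of its
// dependencies. Throws on cycles, which would make the phase unrunnable.
function scheduleTasks(graph: TaskGraph): string[] {
  const order: string[] = [];
  const state = new Map<string, 'visiting' | 'done'>();
  const visit = (task: string): void => {
    const s = state.get(task);
    if (s === 'done') return;
    if (s === 'visiting') throw new Error(`Dependency cycle involving "${task}"`);
    state.set(task, 'visiting');
    for (const dep of graph.get(task) ?? []) visit(dep);
    state.set(task, 'done');
    order.push(task);
  };
  for (const task of graph.keys()) visit(task);
  return order;
}
```

In practice independent tasks could also run concurrently; a linear order is used here only to show the dependency constraint.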
Heft tasks can also take async dependencies on other tasks within the same phase. These behave like normal task dependencies in terms of execution order; the main difference is that declaring a task as an async dependency grants access to its task plugin via the `requestAccessToPluginByName` API, since the task plugins for all async dependencies are `apply`'d at the same time. This allows code supplied by your plugin to run asynchronously before its `clean` or `run` hooks execute, or lets your plugin use a hook obtained through `requestAccessToPluginByName` to gather information that it will later use during its `clean` or `run` hooks.
For example, you can use this functionality to kick off an asynchronous process which can later be awaited to obtain a result, while the original plugin has no obligation to wait for this to finish:
```typescript
import { AsyncParallelHook } from 'tapable';

export interface IPlugin01Accessor {
  hooks: {
    afterRun: AsyncParallelHook<string>;
  };
}

export const PluginName: string = 'Plugin01';

export class Plugin01 implements IHeftPlugin {
  public pluginName: string = PluginName;
  private _accessor: IPlugin01Accessor;

  public get accessor(): IPlugin01Accessor {
    return this._accessor;
  }

  public apply(session: HeftSession, configuration: HeftConfiguration): void {
    this._accessor = {
      hooks: {
        afterRun: new AsyncParallelHook<string>()
      }
    };
    session.hooks.run.tapPromise(
      this.pluginName,
      async (cxt: ITaskContext) => {
        // Can be awaited or not
        this.accessor.hooks.afterRun.promise('some value');
      }
    );
  }
}

export default new Plugin01();
```
```typescript
import { IPlugin01Accessor, PluginName as Plugin01Name } from 'plugin01';

export const PluginName: string = 'Plugin02';

export class Plugin02 implements IHeftPlugin {
  public pluginName: string = PluginName;
  private _afterRunPromise: Promise<string> | undefined;

  public apply(session: HeftSession, configuration: HeftConfiguration): void {
    // NOTE: It is best to place all implementation within the async hooks that
    // are provided by the session or by the plugin accessors to optimize
    session.requestAccessToPluginByName(
      Plugin01Name,
      (accessor: IPlugin01Accessor) => {
        accessor.hooks.afterRun.tapPromise(this.pluginName, async (result: string) => {
          this._afterRunPromise = (async () => {
            // Writes 'some value'
            console.log(result);
            return result;
          })();
        });
      }
    );
    session.hooks.run.tapPromise(
      this.pluginName,
      async (cxt: ITaskContext) => {
        if (this._afterRunPromise) {
          const result = await this._afterRunPromise;
          // Do stuff with the result 'some value'...
        } else {
          // Tapped hook was never called, do something else
        }
      }
    );
  }
}

export default new Plugin02();
```
The use of this style of inter-plugin hook allows the elimination of arbitrary hooks/stages that Heft currently defines for specific use cases (e.g. `afterConfigureWebpack` in the current Heft build stage) and lets the plugins themselves provide this functionality. Another good use case would be splitting linting out of the current TypeScript plugin: the TypeScript plugin could provide the TS program via a plugin hook to any task that takes an async dependency on it, allowing the TypeScript task to complete (and its dependents to run) without blocking on linting. The same hook could be used to share the TypeScript program with API Extractor to optimize docs/.d.ts generation.
Heft task events are a way to access built-in Heft utility plugins during task execution. Only a single task plugin or a single task event can be used per task. Heft task events tap into the task hooks themselves, so there is no longer any requirement to specify a particular "event" to target (e.g. `pre-compile`, `compile`, etc.). There is also no longer a need to specify an `actionId`, since the task provides the identifier.
I've been playing with the `0.49.0-rc.2` release in our local monorepo. After reading through most of the Upgrade Guide, I think reconfiguring our existing rigs and Heft setup would be pretty easy, but I'm running into some headaches converting our existing Heft plugins to the new model.
In particular, I haven't found a good example of the old-school "lifecycle" plugin using the new Heft objects... As an example, we have one plugin used by most apps that is used to generate "app manifests", and it hooks into the pre-compile step and into the bundle step, so it can tinker with webpack config.
I'm going to paste the entire existing plugin below, let me know if you have any suggestions on the "best way" you think this would be handled in new Heft:
```javascript
export class AppManifestPlugin implements IHeftPlugin
```
The example provided is actually two plugins that communicate via a file on disk.
@dmichon-msft So your suggestion would be that the old single "Plugin" (that hooked into multiple stages of multiple tasks) could be rewritten as 2 plugins in one plugin package, both of which are IHeftTaskPlugins, and then in heft.json I would specify that the precompile one gets attached to typescript task and the dev server one gets attached to webpack task -- and in theory I should somehow still have access to e.g. the "webpackConfigure" hook in the second one.
The `configureWebpack` hook is now part of the Webpack plugin and can be tapped by requesting access to the plugin via the plugin access API. Plugins can only request access to plugins within the same phase.
Looking at your plugin here, it doesn't actually look like you would need 2 plugins though, unless this app manifest file is used by some other process in your build. Assuming it's only used by Webpack, you could make a single plugin that runs during the phase that Webpack runs in, and the data can be passed directly into Webpack instead of loading via a file.
If you do use that file for other parts of your build, then yes, you would need two separate plugins (though one plugin package can contain multiple plugins, so they could remain in the same package). Following what you said in your most recent comment, you would create one task plugin to write the file during the phase that the TypeScript task is defined in, and a second task plugin, used in the phase where Webpack is defined, to tap into the Webpack plugin's `configureWebpack` hook and provide the data.
One thing to note is that the order of execution for a phase is:

1. Clean the files specified in the `cleanFiles` property for the phase (if requested)
2. For each task that specifies a `taskEvent`, run the event
3. For each task that specifies a `taskPlugin`, run the plugin's `run` hook
Since you're using a plugin access request to tap the `configureWebpack` hook, you wouldn't even need the Webpack task to take a `taskDependency` on the task containing the plugin that taps the hook, since it gets tapped during the access request at the beginning of the phase.
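A sketch of that arrangement, with simplified stand-ins for the session and accessor types (the real Webpack plugin's accessor shape and registered plugin name may differ; `'WebpackPlugin'` and the config tweak are hypothetical):

```typescript
// Simplified stand-ins; the real accessor/session shapes may differ.
interface IWebpackConfig {
  plugins: string[];
}
interface IWebpackPluginAccessor {
  hooks: {
    configureWebpack: {
      tap: (name: string, fn: (c: IWebpackConfig) => IWebpackConfig) => void;
    };
  };
}
interface ISession {
  requestAccessToPluginByName(name: string, cb: (a: IWebpackPluginAccessor) => void): void;
}

class AppManifestWebpackTweakPlugin {
  public readonly pluginName: string = 'AppManifestWebpackTweakPlugin';

  public apply(session: ISession): void {
    // Access requests are resolved at the start of the phase, so no explicit
    // taskDependency on the Webpack task is needed just to tap this hook.
    session.requestAccessToPluginByName('WebpackPlugin', (accessor) => {
      accessor.hooks.configureWebpack.tap(this.pluginName, (config) => {
        config.plugins.push('app-manifest-injector'); // hypothetical tweak
        return config;
      });
    });
  }
}
```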
This is a proposal for aligning the architecture of Heft to be more compatible with Rush "phased" commands in the interests of improving parallelism, customizability for other tools (esbuild, swc, etc.), reducing Heft aggregate boot time, and optimizing multi-project watching.
Goal 1: Increased Parallelism and Configurability
Current state

Today, `heft test` runs a sequence of hardcoded pipeline stages, where the `Build` stage is further subdivided into hardcoded sub-stages. This limits the ability of Rush to exploit task parallelism to running `heft build --clean` and `heft test --no-build` for each project; i.e., if project `B` depends on project `A`, then the `test` phase for `A` can run concurrently with the `build` phase for `B`.

The `heft.json` file provides event actions and plugins to inject build steps at various points within this pipeline, but the pipeline itself is not particularly customizable.

When run from the command line, Heft loads a single `HeftConfiguration` object and creates a `HeftSession` that corresponds to the command-line session.

Desired state

In future build rigs that exploit the `isolatedModules` contract to make transpilation of each and every module from TypeScript to JavaScript an independent operation, we instead have a set of stages, each of which handles cleaning internally. Custom rigs may require more or fewer stages to accommodate other build steps and, importantly, may alter the dependency relationships between the stages. For example, a rig may opt to run its tests on bundled output, and therefore have the "test" stage depend on the "bundle" stage.
Goal 2: Reduce time booting Heft repeatedly in a large Rush monorepo

Current state

The initialization time of a Heft process is currently measured in seconds. In a monorepo with 600 projects, even 1 second of overhead amounts to 10 minutes of CPU time, since for each operation on each project, Rush boots Heft and its CLI parser in a fresh process.

Desired state

Since Heft is designed to scope state to `HeftSession` objects and closures in plugin taps, it should be possible to reuse a single Heft process across multiple operations on multiple projects.

Goal 3: Multi-project watch
Current state
Custom watch-mode commands in Rush rely on the underlying command-line script to support efficient incremental execution and are unable to preserve a running process across build passes. Some tools, such as TypeScript or Webpack 5 have support for this model, but others, such as Jest, do not.
Desired state
Using IPC or stdin/stdout, a Heft (or other compatible tool) process can communicate with Rush to receive a notification of changed inputs and to report the result of the command.
Design Spec

Instead of a hardcoded pipeline definition, `heft.json` gains the ability to define a list of stages, their dependencies on other stages, and the event actions and plugins required to implement the functionality for each.

Heft.json

HeftServer

The `HeftServer` is a new component in Heft that is responsible for handling requests to execute a specific stage in a specific project. Upon receiving a request, it will either locate an existing `HeftSession` that corresponds to a prior issuance of that request, or else create a fresh `HeftSession`, then execute the `clean` (optional), `beforeRun`, `run`, and `afterRun` hooks in order. The request may also contain an input state object and/or a hint that the stage will likely be re-executed in the future (for watch mode). When the `HeftServer` has finished executing the stage, it will report back to the caller with a list of warnings/errors, the success/failure of the stage, and potentially additional metadata. It may also pipe logs.

Heft plugins that need to communicate with other Heft plugins (for example, to customize the webpack configuration used by `@rushstack/heft-webpack4-plugin`) should use the plugin accessor mechanism that has already been implemented.

A separate CLI executable will be defined that creates a `HeftServer` and waits for IPC messages.

Heft CLI
The Heft CLI process reads `heft.json`, identifies the requested action, and uses `HeftServer` instances to execute the relevant stages in topological order. If running in `--debug` mode, or if the stage topology does not contain any parallelism, the Heft CLI will load the `HeftServer` in the current process; otherwise it may boot multiple external `HeftServer` processes, or potentially be instructed to connect to an existing `HeftServer` process.

Edit 2/11/2022:
CLI parsing and custom parameters
In order to support custom parameters defined by plugins, the Heft CLI will introduce a synthetic "CLI Validation" stage at the very beginning of the pipeline for each action. This stage will apply all plugins from all stages used by that action (for optimization, plugins may have a flag in the plugin manifest that indicates that the plugin does not affect the CLI and does not need to be loaded during this stage), then run the CLI parser. No other hooks (clean, pre, run, post) will get run during this synthetic stage. Once the command line has been parsed and validated, Heft will use runtime metadata about which plugins registered each parameter to extract the set of parameters that should be forwarded to each of the defined stages. If multiple plugin instances register the same parameter, as long as the definitions are compatible (exact meaning TBD), Heft will simply forward the parameter to all of them. Each executing stage will receive a scoped command line and run the aggregate parser derived from the plugins for that stage. This avoids global state in the system to keep stage execution compartmentalized and thereby portable.
@rushstack/rush-heft-operation-runner-plugin

The `@rushstack/rush-heft-operation-runner-plugin` is a Rush plugin that provides an implementation of the `IOperationRunner` contract (responsible for executing Rush Operations, i.e. a specific phase in a specific Rush project). It executes each Heft stage in the Operation (usually 1) by checking out a `HeftServer` instance from a pool maintained by the plugin and issuing an IPC request. The pool will maintain an affinity mapping of the last `HeftServer` used by each `Operation` identity, such that watch-mode execution can reuse the same `HeftServer` process for subsequent build passes when the watcher detects changes. The mapping between `Operation` and Heft `stages` should be defined in an extension of the `rush-project.json` file to prevent Rush from needing to load additional files.