evanw / esbuild

An extremely fast bundler for the web
https://esbuild.github.io/

Support the esbuild plug-in system? #111

Closed ycjcl868 closed 1 year ago

ycjcl868 commented 4 years ago

esbuild is small but complete in every detail. Do you have any plans to support a plug-in system to extend web development workflows?

evanw commented 4 years ago

Not right now. I may figure out an extensibility story within esbuild after the project matures, but I may also keep esbuild as a relatively lean bundler with only a certain set of built-in features depending on how the project evolves.

The use case of using esbuild as a library was one I hadn't originally considered. It's interesting to see it start to take off and I want to see where it goes. It could be that most users end up using other bundlers and esbuild is just an implementation detail that brings better performance to those bundlers.

I'm still thinking about how I might add extensibility to esbuild in the back of my mind. Obviously it's made more complicated by the fact that it's written in Go. It would be possible to "shell out" to other processes to delegate the task of transforming input files, but that would almost surely be a huge slowdown because JavaScript process startup overhead costs are really high.

One idea I've been thinking about is to have esbuild start up a set number of JavaScript processes (possibly just one) and then stream commands to it over stdin/stdout. That could potentially amortize some of the JavaScript startup cost (loading and JITing packages from disk). It would be more of a pain to debug and might still be surprisingly slow due to all of the serialization overhead and the single-threaded nature of the JavaScript event loop.
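For illustration, a minimal sketch of that long-lived worker idea might look like this (this is not esbuild's actual protocol; the command format here is invented): a Node process reads newline-delimited JSON commands from stdin and writes one JSON reply per line to stdout, so the JavaScript startup and JIT cost is paid once per build rather than once per file.

// worker.js: a hypothetical long-lived transform worker (not esbuild's real protocol).
const readline = require('readline')

readline.createInterface({ input: process.stdin }).on('line', line => {
  const request = JSON.parse(line)
  // Placeholder "transform": a real worker would run a JS-based compiler here.
  const reply = { id: request.id, contents: String(request.contents).toUpperCase() }
  process.stdout.write(JSON.stringify(reply) + '\n')
})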

Another idea is to turn the esbuild repository into a Go-based "build your own bundler" kit. Then you could write plugins in Go to keep your extensions high-performance. The drawback is that you'd have to build your own bundler, but luckily Go compiles quickly and makes cross-platform builds trivial. That would likely require me to freeze a lot of the Go APIs that are now internal-only, which would prevent me from making major improvements. So this definitely isn't going to happen in the short term since esbuild is still early and under heavy development.

rsms commented 4 years ago

Finding this interesting!

TL;DR: The subprocess approach sounds like a solid idea.

Some thoughts in regard to the conversation above:

It would be possible to "shell out" to other processes to delegate the task of transforming input files, but that would almost surely be a huge slowdown because JavaScript process startup overhead costs are really high.

I think this is a really interesting approach, although not the most portable (e.g. systems like iOS do not support subprocess creation). I'd think about the quality-performance problem here in the same way as with Figma plugins.

One idea I've been thinking about is to have esbuild start up a set number of JavaScript processes (possibly just one) and then stream commands to it over stdin/stdout.

In my experience, what makes Node.js programs slow is I/O rather than CPU. For example, loading gatsby.js causes an incredible number of files and directories to be read. TypeScript is an example of a good citizen doing the best it can by avoiding runtime imports, but starting tsc is still painfully slow (hence their daemon/server model, which is a solution much like your "subprocess" idea!)

Another idea is to turn the esbuild repository into a Go-based "build your own bundler" kit.

Perhaps a nice option for people who are comfortable with Go. It might "pair well" with a subprocess approach: "putting lego blocks together" vs. "building your own lego blocks". Some ideas to consider:

Some thoughts on Go plugins

I've worked with plugins in Go in the past and it's a little bit complicated (for good reasons). Since Go is statically compiled and doesn't have a fully dynamic runtime like, for example, JavaScript, it is a little tricky to load code at runtime and really hard to unload code (replace/update it).

Some things to keep in mind:

Some example code from GHP:

Loading plugins: https://github.com/rsms/ghp/blob/8ab5e52dded3ad7443849bf7a51a82e8e7ee2de2/ghp/servlet.go#L65-L71

Add structure to plugins to allow unloading them (without actually unloading their code.) https://github.com/rsms/ghp/blob/8ab5e52dded3ad7443849bf7a51a82e8e7ee2de2/ghp/servlet.go#L107-L119

A plugin: (called "servlet" in this project) https://github.com/rsms/ghp/blob/8ab5e52dded3ad7443849bf7a51a82e8e7ee2de2/example/pub/servlet/servlet.go

zmitry commented 4 years ago

What about something like https://github.com/hashicorp/go-plugin? An RPC-based plugin system would let anyone write a plugin in any language.

evanw commented 4 years ago

@rsms thanks for writing up your thoughts. It was very interesting to read through them, and helpful to learn from your experience. I haven't seen Go's plugin package used before. I hadn't thought of the Go compiler version problem, but that makes Go plugins much less appealing.

Maintenance cost of an API that can't change much. I.e. the API provided for "build your own...". Perhaps making it extremely minimal with just two-three functions for pre- and post-processing with a string file list would get you most of the upsides with a low API maintenance cost?

Yes, I was thinking of something extremely minimal. Of course that would come at the cost of performance, which isn't great. I'm not sure if there's a great solution to this.

What about something like https://github.com/hashicorp/go-plugin? An RPC-based plugin system would let anyone write a plugin in any language.

I think something like this is promising. This is basically how esbuild's current API works, except over stdin/stdout. The advantage of this over shelling out is that it lets you amortize the startup overhead of node by keeping it running during the build. You could imagine a more complex API where esbuild has hooks for various points and could call out to node and block that goroutine on the reply. That would let you, say, run the CoffeeScript compiler before esbuild processes the source code of a file.

I plan to explore this direction once esbuild is more feature-complete. My research direction is going to be "assuming you have to use a JavaScript plugin, how fast can you make it". If we can figure that out then that's probably the most helpful form of API for the web development community.
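For instance, the CoffeeScript case mentioned above could eventually be written as a loader, roughly like the following hedged sketch. It assumes the "coffeescript" npm package and the plugins-branch API that appears later in this thread (neither was usable from esbuild at the time of this comment).

let coffee = require('coffeescript')
let fs = require('fs')

// Compile .coffee files to JS before esbuild parses them.
let coffeePlugin = plugin => {
  plugin.setName('coffeescript')
  plugin.addLoader({ filter: /\.coffee$/ }, async (args) => {
    let source = await fs.promises.readFile(args.path, 'utf8')
    return { contents: coffee.compile(source), loader: 'js' }
  })
}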

floydspace commented 4 years ago

I would like to vote for this feature.

Currently I'm migrating a small project from webpack to esbuild. It has a GraphQL schema defined in a .graphql file which is bundled using graphql-tag/loader. Of course, I can just change the schema definition to use a different approach, but it would be nice to be able to write an esbuild loader without raising a PR in your repository for every single case.

Thank you
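For what it's worth, a minimal .graphql loader along those lines could look roughly like this once the plugin API described later in this thread lands. This is only a sketch: it imports the file as a plain string, whereas graphql-tag/loader also handles things like #import statements.

let fs = require('fs')

// Import .graphql files as plain strings (a much smaller feature set than graphql-tag/loader).
let graphqlPlugin = plugin => {
  plugin.setName('graphql-loader')
  plugin.addLoader({ filter: /\.graphql$/ }, async (args) => {
    let contents = await fs.promises.readFile(args.path, 'utf8')
    return { contents, loader: 'text' }
  })
}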

chowey commented 4 years ago

If you do implement a plugin system, please consider making it Go-based (or better yet, cross-language as per @zmitry's suggestion).

I think there is a very big opportunity for non-JS-based tooling to transpile JS. As you state in your readme:

I'm hoping that this project serves as an "existence proof" that our JavaScript tooling can be much, much faster.

zmitry commented 4 years ago

I guess for a first iteration it would be nice to have just a Go API for plugins. If you can run arbitrary Go code, you can spawn a subprocess in any language, so even a simple Go API would be enough. I like the RPC-based approach because it would allow plugin hot-swapping or remote plugins, and it's a much cleaner approach.

evanw commented 4 years ago

I have an update!

While the final plugin API might need a few rewrites to work out the kinks, I think I have an initial approach that I feel pretty good about. It reuses the existing stdio IPC channel that the JavaScript API is already using and extends it to work with plugins. Everything is currently on an unstable branch called plugins.

It's still very much a work in progress but I already have loader plugins working. Here's what a loader plugin currently looks like:

let esbuild = require('esbuild')
let YAML = require('js-yaml')
let util = require('util')
let fs = require('fs')

esbuild.build({
  entryPoints: ['example.js'],
  bundle: true,
  outfile: 'out.js',
  plugins: [
    plugin => {
      plugin.setName('yaml-loader')
      plugin.addLoader({ filter: /\.ya?ml$/ }, async (args) => {
        let source = await util.promisify(fs.readFile)(args.path, 'utf8')
        try {
          let contents = JSON.stringify(YAML.safeLoad(source), null, 2)
          return { contents, loader: 'json' }
        } catch (e) {
          return {
            errors: [{
              text: (e && e.reason) || (e && e.message) || e,
              location: e.mark && {
                line: e.mark.line,
                column: e.mark.column,
                lineText: source.split(/\r\n|\r|\n/g)[e.mark.line],
              },
            }],
          }
        }
      })
    },
  ],
}).catch(() => process.exit(1))

Any errors during loading are integrated into the existing log system so they look native. There is a corresponding Go API that looks very similar. In fact the JavaScript plugin API is implemented on top of the Go plugin API. The API consists of function calls with option objects for both arguments and return values so it should hopefully be easy to extend in a backwards-compatible way.

Loader plugins are given a module path and must come up with the contents for that module. I'm going to work on resolver plugins next which determine how an import path maps to a module path. Resolver plugins and loader plugins go closely together and many plugins are probably going to need both a resolver and a loader. The resolver runs for every import in every file while the loader only runs the first time a given resolved path is encountered.

Something that may be different with this plugin API compared to other bundlers is that every operation has a filter regular expression. Calling out to a JavaScript plugin from Go has overhead and the filter lets you write a faster plugin by avoiding unnecessary plugin calls if it can be determined using the regular expression in Go that the JavaScript plugin isn't needed. I haven't done any performance testing yet so I'm not sure how much slower this is, but it seemed like a good idea to start things off that way.

One weird thing about writing plugins is dealing with two forms of paths: file system paths and "virtual paths" to automatically-generated code. I struggled with the design of this for a while. One approach is to just use absolute paths for everything and make up non-existent directories to put virtual modules in. That leads to concise code but seems error-prone. Another approach I considered was to make every path into a tuple of a string and a type. That's how paths are represented internally but it seemed too heavy for writing short plugins. I'm currently strongly considering marking virtual paths with a single null byte at the front, following the Rollup convention. Null bytes make the path invalid and the code for manipulating them is more concise than tuple objects.

I thought it'd be a good idea to post an update now even though it's not quite ready to try out, since getting loaders working seemed like an important milestone.

evanw commented 4 years ago

After more thought, I'm no longer thinking of taking the approach Rollup does with virtual paths using a null byte prefix. Instead I'm going back to the "paths are a tuple" model described above. In the current form, each path has an optional namespace field that defaults to file. By default loaders only see paths in the file namespace, but a loader can be configured to load paths from another namespace instead. This should allow for a clean separation between plugins and doesn't seem as verbose as I thought it would in practice.

Also, I just got resolver plugins working! This lets you intercept certain paths and prevent the default resolver from running. Here's an example of a plugin that uses this to load URL imports from the network:

// import value from 'https://www.google.com'
let https = require('https')
let http = require('http')

let httpLoader = plugin => {
  plugin.setName('http')
  plugin.addResolver({ filter: /^https?:\/\// }, args => {
    return { path: args.path, namespace: 'http' }
  })
  plugin.addLoader({ filter: /^https?:\/\//, namespace: 'http' }, async (args) => {
    let contents = await new Promise((resolve, reject) => {
      let lib = args.path.startsWith('https') ? https : http
      lib.get(args.path, res => {
        let chunks = []
        res.on('data', chunk => chunks.push(chunk))
        res.on('end', () => resolve(Buffer.concat(chunks)))
      }).on('error', reject)
    })
    return { contents, loader: 'text' }
  })
}

The resolver moves the paths to the http namespace so the default resolver ignores them. This means they are "virtual modules" because they don't exist on disk.

Plugins can generate arbitrarily many virtual modules by importing new paths and then intercepting them. Here's a plugin I made to test this feature that implements the Fibonacci sequence using modules:

// import value from 'fib(10)'
let fibonacciLoader = plugin => {
  plugin.setName('fibonacci')
  plugin.addResolver({ filter: /^fib\((\d+)\)/ }, args => {
    return { path: args.path, namespace: 'fibonacci' }
  })
  plugin.addLoader({ filter: /^fib\((\d+)\)/, namespace: 'fibonacci' }, args => {
    let match = /^fib\((\d+)\)/.exec(args.path), n = +match[1]
    let contents = n < 2 ? `export default ${n}` : `
      import n1 from 'fib(${n - 1}) ${args.path}'
      import n2 from 'fib(${n - 2}) ${args.path}'
      export default n1 + n2`
    return { contents }
  })
}

Importing from the path fib(N) generates fib(N) modules that are then all bundled into one.

tooolbox commented 4 years ago

The examples are impressive, and the fib(N) is both amusing and a good illustration of the capabilities.

I'm wondering if you could give a little more context regarding usage? From trying to sort through the plugins branch a little, it seems like you're mainly using esbuild's JS API and passing plugins to the transform call, but I'm curious how it would work if you were using the command line directly, or trying to write a plugin directly with Go.

evanw commented 4 years ago

Yes, good point.

The plugin API is intended to be used with the esbuild API. People have already been creating simple JavaScript "build script" files that just call the esbuild API and exit. This is a more convenient way of specifying a lot of arguments to esbuild than a long command line in a package.json script. From there you can use plugins by just passing an additional plugins array. Here's an example:

const { build } = require('esbuild')

let envPlugin = plugin => {
  plugin.setName('env-plugin')
  plugin.addResolver({ filter: /^env$/ }, args => {
    return { path: 'env', namespace: 'env-plugin' }
  })
  plugin.addLoader({ filter: /^env$/, namespace: 'env-plugin' }, args => {
    return { contents: JSON.stringify(process.env), loader: 'json' }
  })
}

build({
  entryPoints: ['entry.js'],
  bundle: true,
  outfile: 'out.js',
  plugins: [
    envPlugin,
  ],
}).catch(() => process.exit(1))

In reality I assume most of these plugins will be in third-party packages maintained by the community, so you would likely import the plugin using require() instead of pasting it inline like this.
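For example, a build script using a community plugin might look like this (the package name esbuild-plugin-env here is hypothetical, purely to illustrate the pattern):

const { build } = require('esbuild')
// Hypothetical third-party package exporting a plugin like the inline one above.
const envPlugin = require('esbuild-plugin-env')

build({
  entryPoints: ['entry.js'],
  bundle: true,
  outfile: 'out.js',
  plugins: [envPlugin],
}).catch(() => process.exit(1))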

The Go API is extremely similar. Here's the same example using the Go API instead:

package main

import (
  "encoding/json"
  "os"
  "strings"

  "github.com/evanw/esbuild/pkg/api"
)

func main() {
  result := api.Build(api.BuildOptions{
    EntryPoints: []string{"entry.js"},
    Bundle:      true,
    Write:       true,
    LogLevel:    api.LogLevelInfo,
    Outfile:     "out.js",
    Plugins: []func(api.Plugin){
      func(plugin api.Plugin) {
        plugin.SetName("env-plugin")
        plugin.AddResolver(api.ResolverOptions{Filter: "^env$"},
          func(args api.ResolverArgs) (api.ResolverResult, error) {
            return api.ResolverResult{Path: "env", Namespace: "env-plugin"}, nil
          })
        plugin.AddLoader(api.LoaderOptions{Filter: "^env$", Namespace: "env-plugin"},
          func(args api.LoaderArgs) (api.LoaderResult, error) {
            mapping := make(map[string]string)
            for _, item := range os.Environ() {
              if equals := strings.IndexByte(item, '='); equals != -1 {
                mapping[item[:equals]] = item[equals+1:]
              }
            }
            bytes, _ := json.Marshal(mapping)
            contents := string(bytes)
            return api.LoaderResult{Contents: &contents, Loader: api.LoaderJSON}, nil
          })
      },
    },
  })

  if len(result.Errors) > 0 {
    os.Exit(1)
  }
}

Plugins aren't designed to be used on the command line. This is the first case of the full API not being available from the command line, but given that plugins are language-specific I think it makes sense to require you to use the language-specific esbuild API to use plugins.

LarsDenBakker commented 4 years ago

It would be useful to have a transform hook where you gain access to the generated AST, so that you can do fast file transformations in Go.

tooolbox commented 4 years ago

Excellent examples, thanks!

Plugins aren't designed to be used on the command line. This is the first case of the full API not being available from the command line, but given that plugins are language-specific I think it makes sense to require you to use the language-specific esbuild API to use plugins.

Okay got it, yeah this seems wise.

evanw commented 4 years ago

It would be useful to have a transform hook where you gain access to the generated AST, so that you can do fast file transformations in Go.

I totally understand why this would be useful, but I don't want to expose the AST in its current form. It's designed for speed, not ease of use, and there are lots of subtle invariants that need to be upheld (e.g. scope tree, symbol use counts, cross-part dependency tracking, import and export maps, ES6 import/export syntax flags, ordering of lowering and mangling operations, etc.). Exposing this internal AST to plugins would be a good way to destabilize esbuild and cause silent and hard-to-debug correctness issues with the generated code.

I'm also trying to keep the quality of esbuild high, both in terms of the user experience and the developer experience. I don't want to expose the internal AST too early and then be stuck with that interface, since I don't think it's the right interface.

Figuring out a good interface for the AST that is easy to use, doesn't slow things down too much, and is hard to cause code generation bugs with would be a good project to explore. But this is a big undertaking and I don't think now is the right point in this project's timeline to do it. It also makes a lot of other upcoming features harder (e.g. code splitting, other file types such as HTML and CSS) because it freezes the AST interface when it might need to change.

For now, it's best to either serialize the AST to a string before passing it to esbuild or use other tools if you need to do AST manipulation.
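One hedged sketch of that "hand esbuild a string" advice: run an AST-based tool such as Babel inside a loader and return only the printed output. This assumes the @babel/core package and the plugins-branch API shown above; it is not an esbuild-provided integration.

let babel = require('@babel/core')
let fs = require('fs')

let babelPlugin = plugin => {
  plugin.setName('babel-pass')
  plugin.addLoader({ filter: /\.jsx?$/ }, async (args) => {
    let source = await fs.promises.readFile(args.path, 'utf8')
    // All AST manipulation happens inside Babel; esbuild only ever sees strings.
    let result = await babel.transformAsync(source, { filename: args.path })
    return { contents: result.code, loader: 'jsx' }
  })
}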

chowey commented 4 years ago

CSS extraction was actually pretty simple in Go:

package main

import (
    "bytes"
    "io"
    "os"

    "github.com/evanw/esbuild/pkg/api"
)

var cssExport = "export default {};\n"

// CSSExtractor will accumulate CSS into a buffer.
type CSSExtractor struct {
    bytes.Buffer
}

// Plugin can be used in api.BuildOptions.
func (ex *CSSExtractor) Plugin(plugin api.Plugin) {
    plugin.SetName("css-extractor")
    plugin.AddLoader(
        api.LoaderOptions{Filter: `\.css$`},
        func(args api.LoaderArgs) (res api.LoaderResult, err error) {
            f, err := os.Open(args.Path)
            if err != nil {
                return res, err
            }
            defer f.Close()
            if _, err := io.Copy(ex, f); err != nil {
                return res, err
            }

            // CSS is an empty export.
            res.Loader = api.LoaderJS
            res.Contents = &cssExport
            return res, nil
        },
    )
}

This works for me, since I just want to write my CSS to a file.

jakajancar commented 4 years ago

Is my understanding correct that this will then require a "wrapper" around esbuild in either Go or Node (or another language that implements the protocol Node is using), and plugins will have to be written in that language?

I.e. you won't be able to run esbuild --plugin download --plugin somethingelse, and have them be written in whatever language?

In other words, rather than starting an ecosystem of plugins for esbuild, a wrapper tool will likely emerge in both languages, with plugin ecosystems for the wrappers?

evanw commented 4 years ago

I'm expecting all serious usage of esbuild to use the API anyway because specifying a long list of options on the command line isn't very maintainable (e.g. you don't get nice diffs or git blame). You can easily do this without a separate "wrapper" package just by calling esbuild's JavaScript API from a file with a few lines of code:

const { build } = require('esbuild')

build({
  entryPoints: ['./src/main.ts'],
  outfile: './dist/main.js',
  minify: true,
  bundle: true,
}).catch(() => process.exit(1))

From that point, adding plugins is just adding another property to the build call. I'm sure some people will create fancy wrappers but a wrapper isn't necessary to use plugins.

I'm also expecting that the large majority of esbuild plugins will be JavaScript plugins. Virtually all of the plugins in the current bundler community are written in JavaScript and people likely won't rewrite them when porting them to esbuild. So my design for plugins is primarily oriented around JavaScript and its ecosystem, not around Go. In that world most people wouldn't need a wrapper.

As far as non-JavaScript languages, that stuff can get extremely custom and I think exposing a general API like the current Go API is better than trying to guess up front what people would want in a native language wrapper and hard-coding that into esbuild. You should be able to use the Go API to do whatever custom native language bindings you want (local sockets, child processes, RPC with a server, etc.) without any performance overhead over what esbuild would have done itself anyway.

jakajancar commented 4 years ago

You can easily do this without a separate "wrapper" package just by calling esbuild's JavaScript API from a file with a few lines of code

Which is what I meant by "wrapper" :)

Consider this: there will be a bunch of "esbuild plugins" built in Node. And if you don't use Node, maybe because you use Deno instead, a separate ecosystem of plugins will likely emerge in Deno, distinct from Node's.

It's almost less about plugins for esbuild, than it is about making esbuild pluggable/embeddable itself, with the option of hooks. And it doesn't seem like a JavaScript API, but a language-agnostic (and private?) binary API, and then a Node binding/API for it.

Anyways, makes sense, just the wording got me confused a bit.

evanw commented 4 years ago

Which is what I meant by "wrapper" :)

I see. I thought you meant that it would be complex enough to require an additional wrapper package with a significant amount of code.

And it doesn't seem like a JavaScript API, but a language-agnostic (and private?) binary API, and then a Node binding/API for it.

Yes the binary IPC protocol I'm using is currently private. While it is language-agnostic, it's designed with the JavaScript host in mind. If you need to integrate esbuild with another native binary, I personally think the Go API is much more ergonomic and useful than the binary protocol. For example, the binary protocol serializes everything over a single stream which can limit multi-threaded performance. This isn't a problem for a JavaScript host since JavaScript is single-threaded already, but would be a problem if you're trying to reach maximum performance with another multi-threaded native language. The Go API doesn't have this problem because each plugin invocation is in a separate goroutine.

baryla commented 4 years ago

This is looking really good @evanw! I've been waiting for this feature for a while.

I like the way this is going, but I have one suggestion regarding the filter inside the resolvers and loaders. I understand that regexes are super quick and esbuild's aim is to remain lightning fast, but have you considered also allowing a function to be passed with args (same as the callback) in case there was a case that required a little bit more logic? Regexes are by nature difficult to read and debug, so for some people a function may be an "easier" option.

evanw commented 4 years ago

have you considered also allowing a function to be passed with args (same as the callback) in case there was a case that required a little bit more logic?

I'm not sure what you mean. Both addResolver and addLoader take a function as a callback. The filter regex is just there to speed things up, but you can always make it .* if you need to match everything. The function can return null or undefined to pass control on to the next resolver or loader, so the function can also serve as a filter.

That said, I hope people won't do that when they don't need to. A regex to pre-filter for the relevant file extension is short and (I think) still pretty readable. Using .* instead will slow things down unnecessarily. But sometimes you need to match everything so the ability is there if you want to use it.

baryla commented 4 years ago

@evanw

I'm not sure what you mean. Both addResolver and addLoader take a function as a callback. The filter regex is just there to speed things up, but you can always make it .* if you need to match everything. The function can return null or undefined to pass control on to the next resolver or loader, so the function can also serve as a filter.

That said, I hope people won't do that when they don't need to. A regex to pre-filter for the relevant file extension is short and (I think) still pretty readable. Using .* instead will slow things down unnecessarily. But sometimes you need to match everything so the ability is there if you want to use it.

Maybe I slightly misunderstood the goal of a filter, but it does now make sense. I just thought that the filter could also cater for some more "complicated" filtering if needed, such as:

const isAllowed = path => {
  if (!path.endsWith('.css')) {
    return false;
  }

  return ['app', 'component'].some(name => path.startsWith(name));
}

const myPlugin = plugin => {
  plugin.resolver({ filter: isAllowed }, args => {
    // logic to resolve file
  })
}

which would still avoid unnecessary plugin calls if I'm not mistaken?

evanw commented 4 years ago

which would still avoid unnecessary plugin calls if I'm not mistaken?

The goal is to minimize the number of calls from Go into JavaScript since crossing this boundary is pretty slow. If you already have to call into JavaScript, you might as well run the entire plugin. It would be even slower to call into JavaScript twice, once for the filter function and once for the actual resolver.

While that plugin would best be written like this:

const myPlugin = plugin => {
  plugin.addResolver({ filter: /^(app|component).*\.css$/ }, args => {
    // logic to resolve file
  })
}

It could also be written like this:

const isAllowed = path => {
  if (!path.endsWith('.css')) {
    return false;
  }

  return ['app', 'component'].some(name => path.startsWith(name));
}

const myPlugin = plugin => {
  plugin.addResolver({ filter: /.*/ }, args => {
    if (!isAllowed(args.path))
      return;

    // logic to resolve file
  })
}

ggoodman commented 4 years ago

@evanw any thoughts on adding this in a stable release (even if it is clearly marked as unstable; an unstable_plugins: option comes to mind)?

What kind of decisions are you still evaluating?

evanw commented 4 years ago

What kind of decisions are you still evaluating?

I'm also excited about plugins and looking forward to them being released. There are a few reasons why I haven't released the plugin API yet.

One is that code splitting (issue #16) is still very much a work in progress and I feel like I should follow through with that first. It's a pretty foundational feature, people are starting to depend on it, and it deserves to be in a stable spot. I don't want to start too many things at once without finishing what I've started. I also want to make sure I can give the plugin API my full attention when it's released and people start using it.

Another thing I'd like to think more about is how other file types (e.g. HTML and CSS) interact with plugins. I've made progress on thinking through this but have had to put that aside to work on some recent correctness issues around code splitting and source maps. I plan to get back to thinking through alternate file types and plugins after that's in a good spot.

I'm still pushing toward the release of plugins in the meantime. Besides iterating on the API, I have also recently landed the nested source maps feature (issue #211) which is somewhat of a prerequisite for plugins. Source maps for non-JavaScript language plugins won't work without it. I landed it independently because it's a useful feature by itself, but I consider it to have been blocking the plugin API release. This also gives it some time to stabilize first so it's ready when plugins happen.

any thoughts on adding this in a stable release (even if it is clearly marked as unstable; an unstable_plugins: option comes to mind)?

I have considered exposing an unstable plugin API as well but I don't think that solves much. People will still start to depend on it regardless and if the ultimate API ends up undergoing major changes, upgrading to the new plugin API will still be just as much work. So I'd like to keep plugins on a branch for now.

matthewmueller commented 4 years ago

For what it's worth, I've been using the plugins branch the last week or so and it already works great! Two areas that are a bit awkward but totally workable:

  1. It would be nice to be able to feed assets back into the pipeline. I think this is what you mean by what to do about HTML and CSS. This would allow additional asset transformations that take advantage of ESBuild's parallelism, cache, etc. I believe this feature is similar to rollup's this.emit functionality inside its plugins. You can workaround this by discovering these assets in the plugin and passing them to a separate asset pipeline.

  2. Sometimes the same entrypoints are built for different environments. For example, if you're building pages for both Node and the browser. The various build options like Platform, Format and Plugins will depend on the environment, so you end up needing two instances of ESBuild. Not a big deal, just a possible improvement as you're thinking about the API.

Happy to share additional context or code if helpful!

stephen commented 4 years ago

Hi, just wanted to add another datapoint about our experience trying out esbuild. We experimented with building a dev server for our fairly large frontend app (5k+ files, js/ts/css/graphql). We wrote a .graphql loader plugin and a .css loader plugin with the Go API, plus some livereload boilerplate.

We ended up doing something similar to https://github.com/evanw/esbuild/issues/111#issuecomment-636408869, emitting a separate fake filesystem for the dev server. The js emit becomes a small stub that adds a new <link> tag to the document head. I imagine if we wanted to emit/bundle css files for a production build later, we'd have to do it ourselves. This wasn't a problem for the .graphql files, since they don't have a separate compiled output.

As a side note, we ended up implementing css compilation (our css assumes compilation with some postcss plugins) by spinning up a small node worker pool and doing http requests against it.

Our frontend uses webpack-graphql-loader to inline .graphql documents as strings. In our esbuild port, we end up doing additional import resolution, caching and parallelization within the plugin. We didn't end up doing sourcemaps, but it seems like we could with https://github.com/evanw/esbuild/commit/23f0884de58781d04b6300cccec732f4e6ac8eac.

Notably, our css compilation would have suffered the same problem, but postcss is doing @import resolution anyway.
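For illustration, the kind of JS stub described above (a .css loader that returns a snippet appending a <link> tag to the document head) might look roughly like this with the branch plugin API. The /__dev_assets/ URL prefix is made up here; a real dev server would use whatever scheme it serves the fake filesystem under.

let cssDevPlugin = plugin => {
  plugin.setName('css-dev-stub')
  plugin.addLoader({ filter: /\.css$/ }, args => {
    // Point the <link> at the dev server's copy of this file (URL scheme is invented).
    let href = '/__dev_assets/' + encodeURIComponent(args.path)
    let contents = `
      let link = document.createElement('link')
      link.rel = 'stylesheet'
      link.href = ${JSON.stringify(href)}
      document.head.appendChild(link)
      export default {}
    `
    return { contents, loader: 'js' }
  })
}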

zmitry commented 4 years ago

@evanw Any updates on the plugins side?

evanw commented 4 years ago

No updates at the moment. I have some stuff going on in my personal life that's taking priority right now (relocating some family members). My focus will be back on esbuild after that is done.

tinchoz49 commented 4 years ago

Don't worry :) thank you for working on this.

intrnl commented 4 years ago

Coming from the JSX automatic runtime thing, can the transform API use plugins or is it something that's only for builds?

littledivy commented 3 years ago

Importing from a URL was pretty easy using the Go API on the plugins branch :smile:

func URLLoader(plugin api.Plugin) {
    plugin.SetName("url-loader")
    plugin.AddResolver(api.ResolverOptions{Filter: "^https?://"},
        func(args api.ResolverArgs) (api.ResolverResult, error) {
            fmt.Println("Downloading ", args.Path)
            // Get the data
            resp, _ := http.Get(args.Path)
            fileName := buildFileName(args.Path)
            defer resp.Body.Close()
            file, err := ioutil.TempFile("", fileName)
            if err != nil {
                log.Fatal(err)
            }
            io.Copy(file, resp.Body)

            defer file.Close()
            fmt.Println("Downloaded ", file.Name())
            return api.ResolverResult{Path: file.Name(), Namespace: "url-loader"}, nil
        })
    plugin.AddLoader(api.LoaderOptions{Filter: "^", Namespace: "url-loader"},
        func(args api.LoaderArgs) (api.LoaderResult, error) {
            fmt.Println("Loading ", args.Path)
            dat, _ := ioutil.ReadFile(args.Path)
            contents := string(dat)
            return api.LoaderResult{Contents: &contents, Loader: api.LoaderTS}, nil
        })
}

Works quite well for a runtime I'm building.

Cheers @evanw :raised_hands:

ije commented 3 years ago

This is awesome! I just tried it and it works fine. Thanks @evanw! My question is whether it's possible to rewrite the external module path, like:

api.Build(api.BuildOptions{
    EntryPoints: []string{"entry.js"},
    Bundle:      true,
    Write:       true,
    LogLevel:    api.LogLevelInfo,
    Plugins: []func(api.Plugin){
        func(plugin api.Plugin) {
            plugin.SetName("rewrite-external-plugin")
            plugin.AddResolver(
                api.ResolverOptions{Filter: "^(react|react-dom)$"},
                func(args api.ResolverArgs) (api.ResolverResult, error) {
                    newPath := fmt.Sprintf("https://esm.sh/%s", args.Path)
                    return api.ResolverResult{Path: args.Path, AsPath: newPath, External: true, Namespace: "rewrite-external"}, nil
                },
            )
            plugin.AddLoader(
                api.LoaderOptions{Filter: "^(react|react-dom)$", Namespace: "rewrite-external"},
                func(args api.LoaderArgs) (api.LoaderResult, error) {
                    contents := ""
                    return api.LoaderResult{Contents: &contents, Loader: api.LoaderJSON}, nil
                },
            )
        },
    },
})

evanw commented 3 years ago

My question is whether it's possible to rewrite the external module path

Thanks for pointing this out. This was an oversight on my part. I updated the plugins branch and this should now be possible. You would specify Path: newPath and you wouldn't need a loader to do this (you only need a resolver).
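A rough JavaScript version of that corrected approach might look like this. This is a hedged sketch: it assumes the resolver result accepts an external flag as the Go snippet above suggests, with field casing guessed from the earlier JavaScript examples in this thread.

let externalCdnPlugin = plugin => {
  plugin.setName('rewrite-external-plugin')
  plugin.addResolver({ filter: /^(react|react-dom)$/ }, args => {
    // Rewrite the bare import to a URL and mark it external; no loader is needed.
    return { path: `https://esm.sh/${args.path}`, external: true }
  })
}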

ije commented 3 years ago

cool, thanks for the great work!

zmitry commented 3 years ago

I've implemented a plugin that allows you to import SVGs as React components. It's not battle-tested yet: https://github.com/zmitry/esbuild-svgr

zmitry commented 3 years ago

I've also created a setup with create-react-app that allows you to use esbuild with webpack-dev-server and a regular CRA setup: https://github.com/zmitry/esbuild-cra. It's hacky, but it kind of works; you just need to install Go and you are ready to go. I do not recommend using it for production. It's only a POC where you can play around and decide whether you need esbuild and whether it's worth switching.

st-schneider commented 3 years ago

Is there a way to use https://github.com/atlassian-labs/compiled via a plugin for CSS extraction?

zmitry commented 3 years ago

@evanw I want to add some extra feature requests to the plugins API. The first use case is the ability to write a plugin for HTML. This requires the ability to get the name of the processed file inside the HTML: I want to get the path to the index JS in the output folder, or its content in case I want to inline it. I also want substitution to work inside HTML; sometimes I want to replace some variables, as in my use case:

<html>
    <meta http-equiv="Content-Security-Policy" content="{CSP_POLICY}" />
    <link rel="icon" type="image/png" sizes="32x32" href="{PUBLIC_URL}/favicon-32x32.png" />
    <link rel="icon" type="image/png" sizes="16x16" href="{PUBLIC_URL}/favicon-16x16.png" />
<script src="./src/indexj.s"></script>
</html>

Another use case I want is the ability to add arbitrary content to the processing pipeline. For example, I want to implement a simple CSS-in-JS solution using esbuild. The idea is that I want to treat every css("string") call as a CSS module. So what I want to do is take the content of css(""), add it to the processing pipeline, and replace the css call with an object of class names.

const styles = css(`
  .red {
   background: red;
  }
`)
// =>

const styles = { red: 'red-classname' }

evanw commented 3 years ago

I want to get the path to the index JS in the output folder, or its content in case I want to inline it.

Thanks for raising this desire. This isn't currently possible outside of hacky nested build approaches. I'll have to think about this. It probably means there needs to be a way for a plugin to add a new entry point during the bundling process, which should be possible.

The idea that I want to treat all css("string") as css modules.

This should be possible already I think. It sounds like the idea is to replace this:

const styles = css(`...code...`)

with this:

import 'some-path.css'
const styles = { red: 'red-classname' }

where some-path.css is whatever path the loader needs to be able to associate the automatically-generated import statement with that particular CSS string. One solution could be to use some prefix that the loader recognizes and then encode the entire contents of the string using URL encoding, but some form of incrementing identifier combined with a mapping in the plugin should also work.
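As a rough sketch of that plumbing, heavily hedged: the css-in-js: prefix is invented, the regex is deliberately naive (a real implementation would parse the JavaScript), producing the actual class-name object is omitted, and it assumes a css loader is available for the virtual modules.

let fs = require('fs')

let cssInJsPlugin = plugin => {
  plugin.setName('css-in-js')
  let nextId = 0
  let cssById = new Map()

  // Rewrite css(`...`) calls into imports of numbered virtual CSS modules.
  plugin.addLoader({ filter: /\.js$/ }, async (args) => {
    let source = await fs.promises.readFile(args.path, 'utf8')
    let imports = ''
    let contents = source.replace(/css\(`([^`]*)`\)/g, (_, cssText) => {
      let id = nextId++
      cssById.set(String(id), cssText)
      imports += `import 'css-in-js:${id}'\n`
      // Returning the real { className: scopedName } object is the hard part
      // and is left out of this sketch.
      return '{}'
    })
    return { contents: imports + contents, loader: 'js' }
  })

  // Route the generated virtual paths to their stored CSS text.
  plugin.addResolver({ filter: /^css-in-js:/ }, args => {
    return { path: args.path, namespace: 'css-in-js' }
  })
  plugin.addLoader({ filter: /^css-in-js:/, namespace: 'css-in-js' }, args => {
    return { contents: cssById.get(args.path.slice('css-in-js:'.length)), loader: 'css' }
  })
}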

zmitry commented 3 years ago

Could you elaborate on the idea about CSS strings within JS? I didn't quite get how to implement it. For instance, suppose I have code like the following. In this case I need to resolve the classes for the first chunk and only then parse the next chunk.

const stylesA = css("code")
const stylesB = css(`.${stylesA.button}:hover{ background red }`)

evanw commented 3 years ago

Sorry, I'm not sure I understand what you're trying to do anymore. It looks like the CSS can now depend on run-time values because of the use of `${stylesA.button}`. In that case esbuild won't be able to process the CSS at bundle time. However, it should still be possible to write a function called css that injects <style> elements at run-time to make that code snippet work.
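A minimal sketch of such a run-time css helper (no scoping; class names pass through unchanged):

// Inject the CSS at run-time and return an object whose properties echo their
// own names, so stylesA.button evaluates to "button". No scoping is attempted.
function css(code) {
  let style = document.createElement('style')
  style.textContent = code
  document.head.appendChild(style)
  return new Proxy({}, { get: (_, name) => String(name) })
}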

zmitry commented 3 years ago

Sorry, I made a mistake. The idea is that CSS values can depend on the output of other CSS modules (see my previous snippet).

zandaqo commented 3 years ago

I'm curious, is anyone looking into a loader for Node.js native addons? This would allow a fast, simple bundler for Node.js, a replacement for ncc, for example, which relies on webpack and is comparatively slow, although admittedly it does a fair bit more than just bundle native addons.

bep commented 3 years ago

Just wanted to chime in and say that I have a Hugo branch testing the plugins branch out, and it is a perfect fit; it works great. I'm only using the resolver (for now), but given the nature of our module system (a union filesystem, no common root), being able to do custom import resolving has been the missing piece in all of this. Working with this in VS Code in a multi-project setup feels almost a little magical given its speed and all. Great job.

bep commented 3 years ago

@evanw a quick yes/no question:

evanw commented 3 years ago

Is it fair to assume that that branch will eventually land in the main branch with approximately the current feature set intact?

Yup. I'm actually working on this right now, it just doesn't look like it. As part of the plugins release I want esbuild to have a real website with comprehensive documentation. It's been a lot of work but it's actually mostly done. However, this work is currently being done in a private repo. I hope to get the esbuild website up and ship plugins in the next week or two.

I'm not totally certain about the current plugin API. It's the first time I'm writing one of these and I'm not a heavy user of other bundler plugin APIs, so I suspect there are things missing that could fundamentally alter the design. For example, plugins don't really compose right now. There's also no way of invoking esbuild's default behavior from within a plugin. So I may need to change the design in a backwards-compatible way before version 1.0.0 of esbuild.

However, the current direction is promising and seems like it's been able to solve most of the problems thrown at it. It has also withstood the first round of people testing it out (thank you very much everyone who tried it!) so I think now is a good point to release it for wider feedback. I also realize that the JavaScript API is significantly harder than the Go API to try out while it's still on a branch, so releasing it is also important for getting wider feedback about integration with the JavaScript ecosystem.

eigilsagafos commented 3 years ago

Thanks @evanw for the update. I have been testing it this week with Yarn 2 (PnP). Looking great, and much more performant than I had expected it to be! I posted in another thread that the biggest issue I'm having currently is the lack of a way to ignore packages already marked as external; right now I have to implement that as part of the plugin. Maybe you could provide an "ignoreExternal" flag as part of the addResolver settings, or provide external true/false as part of the args provided to the function?

evanw commented 3 years ago

Maybe you could provide an "ignoreExternal" flag as part of the addResolver settings, or provide external true/false as part of the args provided to the function?

Yeah thanks for that feedback. I thought your suggestion on the other thread of an option to include externals was interesting, since then they would be excluded by default. It sounds like excluding them might be the right default behavior. I can't think of a use case for including them off the top of my head.

Actually maybe the API could just switch to always excluding the externals if there's no current use case for including them. This kind of makes sense to me in that the user's configuration should override the plugin's configuration since the user can change their configuration but they can't necessarily change the plugin.

It does mean that esbuild would have to always resolve all paths itself, though, instead of only resolving them when no plugin resolves the path, so there's a performance hit, although I'm guessing it's likely minimal.

lukeed commented 3 years ago

For example, plugins don't really compose right now.

Would it be possible to clarify this? Maybe I'm being too short-sighted, but I think the main need here is that the transformed output from one loader could invoke another load. For example, an HTML loader that parses <script> tags for JS entries, which can/may import other .css (css), .svg (baseurl), and .svelte (plugin) files. And the same cycle repeats for the .svelte contents.

But if "composing plugins" means extending a plugin, or calling one directly from another, I'm not sure how necessary that really is.

the JavaScript API is significantly harder than the Go API to try out while it's still on a branch

😅 Do you plan to release esbuild@next (or similar) on npm at some point for testing? It would be a nice, non-binding way to gather additional feedback.