Hey @FredKSchott!
We should definitely be able to support this, it's been on the back of my mind since the start.
We need to figure out a few things for this. Ideally we can do something similar to how build tools integrate with Karma, and compose the two servers together. Snowpack can serve and pre-process the files, but users could still use some of our plugins.
The esbuild and rollup plugins probably don't make much sense, but the import maps plugin might be useful for mocking ES modules, and perhaps the legacy plugin for testing on older browsers. We will likely have more plugins in the future. It also means our documentation isn't completely invalidated, which might otherwise be confusing to users.
Without any extra flags/configuration/plugins, our dev server does nothing other than serve static files and communicate with the test runner part that runs in the browser. The static file serving part can easily be swapped out without breaking other functionality.
Does the snowpack server handle transforming inline scripts in HTML? This is used quite a bit in the test runner, for example with HTML tests.
Some options to start the discussion:
I think this is a great approach from a user configuration perspective. We have a plugin system which can be used to serve files:
Plugin:
import { getRequestFilePath } from '@web/dev-server-core';

function snowpackPlugin() {
  let config;

  return {
    async serverStart(args) {
      ({ config } = args);
    },

    serve(context) {
      const filePath = getRequestFilePath(context, config.rootDir);
      return serveUsingSnowpack(filePath);
    },
  };
}
User config:
export default {
  plugins: [snowpackPlugin()],
};
Our plugin API is very similar to Rollup's plugin API. There is also a transform hook so that you don't need to look up the file from the file system, but I don't think that will be sufficient for your use case.
You can read more about the plugins here: https://modern-web.dev/docs/dev-server/plugins/writing-plugins/
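For illustration, a minimal sketch of what a transform-based plugin could look like, assuming Snowpack exposed some transformWithSnowpack(code, path) function (that name is hypothetical, not an actual Snowpack API):

function snowpackTransformPlugin() {
  return {
    name: 'snowpack-transform',
    transform(context) {
      // only rewrite JavaScript responses; everything else passes through untouched
      if (context.response.is('js')) {
        // transformWithSnowpack is a placeholder for whatever API Snowpack would expose
        return transformWithSnowpack(context.body, context.path);
      }
    },
  };
}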
If your server is Koa or Express, it could also be installed as middleware. For Express there is an adapter you can use.
export default {
  middleware: [snowpackMiddleware()],
};
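As a rough sketch, such a middleware could look like the following, assuming Snowpack exposed some buildUrl(url) helper that returns built contents plus a content type (both names are assumptions for illustration):

function snowpackMiddleware() {
  return async (ctx, next) => {
    // buildUrl is a placeholder for a Snowpack API that builds the file behind a URL
    const result = await buildUrl(ctx.url);
    if (result) {
      ctx.body = result.contents;
      ctx.type = result.contentType;
      return;
    }
    // not something Snowpack knows about, let the next middleware handle it
    await next();
  };
}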
Another approach is to proxy requests to another server. The downsides here are that we're hogging extra ports, it's harder to get the configuration to users and there might be some overhead involved with proxying requests and running two servers:
export default {
  fileServer: 'https://localhost:9000',
};
Awesome! Some thoughts inline:
Does the snowpack server handle transforming inline scripts in HTML? This is used quite a bit in the test runner, for example with HTML tests.
We do transform imports from bare-module specifiers to paths inside of HTML, but other than that we just support inline JS without transforms.
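For example, an inline script written as

import { html } from 'lit-html';

would be served with the bare specifier rewritten to a path, roughly along these lines (the exact output path depends on the Snowpack version and configuration):

import { html } from '/web_modules/lit-html.js';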
Snowpack as a plugin
Building files is only part of the story for us. The other part is resolving imports, and making sure that files are served out of the location that you'd expect. Our dev server can guarantee this, but I'm not sure how we'd get the same guarantees if we don't own the server.
One thing that could be interesting would be building your site to disk with snowpack, and then testing against that final static built site on disk. A Snowpack-built site would be guaranteed to be just JS, CSS, and HTML with all imports already resolved to full paths and all files built. I’d imagine no special config needed to run the test runner.
That also gets the added benefit of testing against files on disk, so stack traces could be explored on disk, devs can add console logs, etc.
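If we went that route, the test runner config could be as plain as pointing it at the build output, something like the following (the "build" directory name and the glob are assumptions about a typical project layout):

export default {
  // serve files out of Snowpack's static build output instead of source
  rootDir: 'build',
  files: 'build/**/*.test.js',
};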
Snowpack as middleware
We don't use Express or Koa internally, but we could create a really simple middleware that we could then export. All files would be built and imports would be resolved to valid URL specifiers in our response, is that okay?
Snowpack as separate server
Sounds like the other two options are worth investigating first, but good to know that this is a possibility if those don’t pan out.
I’ll create an issue to explore this more from our end. @stramel this could be right up your alley if you’re interested! :)
As a clarification, we don't crawl the dependency graph. We just serve the test files and respond to requests from the browser. Each request, including the initial test HTML and the test files, goes through the serve hook in a plugin. So you could intercept the first test file and apply any logic you need to it, just make sure you produce an ES module which imports its dependencies. Those imports would reach your plugin again, and you can do further logic for each file as needed.
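A minimal sketch of that interception, assuming test files can be recognized by their path and that some buildWithSnowpack(path) helper exists (both assumptions for illustration):

export function interceptTestFilesPlugin() {
  return {
    name: 'intercept-test-files',
    async serve(context) {
      if (context.path.endsWith('.test.js')) {
        // buildWithSnowpack is a placeholder; it should return an ES module as a string
        const body = await buildWithSnowpack(context.path);
        return { body, type: 'js' };
      }
      // returning nothing falls back to the default static file serving
    },
  };
}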
You could also bundle it all into one file if that's something you need. That's what the webpack and rollup plugins in karma did for example. Though I think it would be bad for performance.
Is there some high level overview of what the snowpack dev server does? Is it still based on buildless development, just now with added transforms for CSS and such if people need it? Or is it now a classic development bundling workflow, like webpack dev server?
Is it still based on buildless development, just now with added transforms for CSS and such if people need it
☝️ this!
Okay, sounds like I just need to play around a bit more with the test runner, and see how we can fit together. Will update this thread when I get to it. Thanks for the details!
Okay, I got pretty far with this test runner config:
const got = require('got');

module.exports = {
  plugins: [
    {
      name: 'my-plugin',
      async serve(context) {
        // we do some mapping inside of Snowpack, which we'd need to handle better here
        const url = context.request.url.replace('/src/', '/_dist_/').replace('.jsx', '.js');
        // proxy our requests to the Snowpack dev server
        return { body: (await got.get(`http://localhost:8080${url}`)).body, type: 'js' };
      },
      serverStart() { /* startup Snowpack dev server in "test" mode */ },
    },
  ],
};
I think this is a promising enough path to keep going down, with the goal of shipping a Snowpack plugin for @web/test-runner. Still a couple more things for us to tackle though, mainly:
The test file is imported from /src/, but really we want to import it out of /_dist_/. Is there a better way to transform the top-level import than how I'm doing it in the snippet above?
(I just re-read my last post and saw that I also wanted to explore "middleware", which may be more fitting given that most of the docs on middleware are focused mainly on proxying requests. Seems like it would be a similar idea, just hooking into Koa directly instead of via this custom serve() hook. Any recommendation between the two?)
If you create a separate dev server entirely, using a proxy middleware would indeed be simpler. The serve hook itself is also a middleware; it's there to make it easy to hook into JS APIs and reduce boilerplate.
In a typical setup the request flow looks like this:
The browser is opened on the root URL /. This serves the test index.html:
<html>
  <body>
    <script type="module" src="/path-to-test-framework.js"></script>
  </body>
</html>
The configured test framework module is loaded; this fetches a config from the test runner that tells it which test file to run:
// simplified
const config = await fetch('/wtr/config');
await import(config.testFile);
await fetch('/wtr/finished');
The test file is then imported, for example /test/foo.test.js:
import { expect } from '/node_modules/@esm-bundle/chai/esm/chai.js';

describe('my test', () => {
  it('works', () => {
    expect(true).to.be.false;
  });
});
The HTML page would be served by the test runner, but all the other files exist on disk and can be served by anything that can serve static files. In the process any imports can be rewritten as needed, so that you could rewrite any import that goes to /src/ to /_dist_/.
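To make that concrete, here's a hedged sketch of a proxy middleware that forwards requests for /src/ files to a Snowpack dev server on port 8080 and applies the same path rewriting as the plugin snippet above (the port and the .jsx mapping are assumptions carried over from that snippet):

const got = require('got');

module.exports = {
  middleware: [
    async function snowpackProxy(ctx, next) {
      if (ctx.url.startsWith('/src/')) {
        // mirror the mapping Snowpack applies between source and built output
        const url = ctx.url.replace('/src/', '/_dist_/').replace('.jsx', '.js');
        const response = await got.get(`http://localhost:8080${url}`);
        ctx.body = response.body;
        ctx.type = 'js';
        return;
      }
      await next();
    },
  ],
};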
Perhaps it's easier to get it working for a buildless snowpack project first (ex. CSS imports etc.) and work from there?
I threw together a quick example here: https://github.com/LarsDenBakker/wtr-custom-server
This is a middleware that transforms CSS files into JS modules: https://github.com/LarsDenBakker/wtr-custom-server/blob/master/css-middleware.mjs
That's a really simple transform, but it illustrates the point of handing off transformations to another process.
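The core idea is only a few lines. A rough sketch (not the exact contents of that file) of serving CSS as a JS module from a Koa middleware:

import fs from 'fs/promises';
import path from 'path';

export function cssToJsMiddleware(rootDir = process.cwd()) {
  return async (ctx, next) => {
    if (ctx.path.endsWith('.css')) {
      const css = await fs.readFile(path.join(rootDir, ctx.path), 'utf8');
      // wrap the stylesheet in a JS module so tests can simply import it
      ctx.body = `
        const style = document.createElement('style');
        style.textContent = ${JSON.stringify(css)};
        document.head.appendChild(style);
        export default style;
      `;
      ctx.type = 'js';
      return;
    }
    await next();
  };
}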
Thanks @LarsDenBakker, I ended up taking another stab at this to connect the build pipeline directly into @web/test-runner, and got pretty far there as well. You can see the latest here (warning, still very hacky): https://github.com/pikapkg/snowpack/compare/wip-esmpkg...wip-web-test-runner#diff-052882531b0b6475c9dc35f17d27925b
What's nice about this is that it's basically the same exact work as a Node.js ESM loader hook, so we could potentially support both @web/test-runner and a Node.js test runner (w/ JSDOM) using the same method, and leave it up to the user whichever they prefer.
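For a rough picture of that parallel, a Node.js loader hook doing the equivalent work might look something like this (buildFileWithSnowpack is a hypothetical helper; the load hook signature is the one from recent Node versions):

// run with: node --experimental-loader ./snowpack-loader.mjs ...
export async function load(url, context, nextLoad) {
  if (url.endsWith('.jsx')) {
    // buildFileWithSnowpack is an assumed helper that runs the Snowpack build pipeline
    const source = await buildFileWithSnowpack(new URL(url));
    return { format: 'module', source, shortCircuit: true };
  }
  return nextLoad(url, context);
}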
At this point we have two good options, and there's nothing really needed from the test runner. The hooks system so far has had everything we need. Thanks for taking the time to focus on a good plugin ecosystem from the jump!
(one bit of other feedback, not critical but definitely top of mind for us) One of the reasons we want to move away from Jest is to move away from the 400-800 dependencies that that ecosystem ends up adding to any one project. I put @web/test-runner into this dependency visualizer, and saw that @web/test-runner comes with 346 deps: https://npm.anvaka.com/#/view/2d/%2540web%252Ftest-runner
Seems like there might be some low-hanging fruit there for you. Mocha especially seems to be responsible for ~125 of them. If you're only using Mocha for its browser test-running code, most of those are going unused.
I think it's a good approach to let the user choose.
We've really tried to keep the dependencies down, making certain features opt-in via plugins. But with node it's easy to add a single dependency which adds a lot. Will review this again.
Mocha should actually be a dev dependency, because we bundle it along with the WTR glue code. This way people can use it without the node-resolve flag. Will look into it!
Mocha is now a dev dependency; this cuts down a lot already.
Congrats on the launch!
We use Jest in our Snowpack applications but kind of hate the experience we're delivering with it. It's a ton of extra config/tooling required to essentially re-create what we're already doing for users on Snowpack. If we could remove Jest, we'd get rid of something like 600-800 packages from the final install of a Create Snowpack App template.
I'd love to try out the modern web test runner, but would want to use it to load code directly from the Snowpack dev server so that files are built using the build pipeline Snowpack users have already defined.
Have you given any thought to how that might be done? Is it supported today, and if not how much work would it take?