kentcdodds closed this 2 years ago
Hey, code looks good; my only concern is performance. Has any testing been done on this?
Just did some testing here:
```js
const fs = require("fs-extra");

async function oneFile() {
  console.time("one file");
  await fs.readFile("./files/01.css");
  console.timeEnd("one file");
}

async function hundredFiles() {
  console.time("100 files");
  for (let i = 0; i < 100; i++) {
    await fs.readFile(`./files/01 copy ${i}.css`);
  }
  console.timeEnd("100 files");
}

oneFile();
hundredFiles();
```
Those CSS files are each 111kb of this, copy/pasted over and over:
```css
body {
  color: red;
}
```
My output (on my fast machine) is:
```
one file: 0.557ms
100 files: 15.627ms
```
It's pretty darn fast. Feel free to watch for yourself (I'm live streaming right now): https://www.youtube.com/watch?v=KPKBUBchCVo
So I think this is fine. That said, if you'd like an opt-out option I could add one, but I'd rather not introduce that complexity unless it proves necessary, since this doesn't seem to be much of a problem.
Ping? 😅
Hey, sorry for the slowness here; didn't want to merge until I had spare time to publish a new version.
No worries. I know life is busy 😅 Thanks for merging and releasing this!
Could comparing file sizes first, and only reading/comparing the contents when the sizes match, give a small perf improvement?
I expect it could, but is it really worth the added complexity? For most cases it doesn't even take a full millisecond to just read and compare, so it probably wouldn't make much of a difference...
Closes #320
This also solves a significant annoyance for anyone using postcss with an app that rebuilds on file change: https://github.com/remix-run/remix/issues/714