Closed arikmaor closed 1 year ago
Should work in the browser already
I've created this repo so you can see my problem; perhaps you have a solution. It's a simple project initialized with create-react-app, and I'm trying to use tar-stream in a very simple way:
https://github.com/arikmaor/tar-stream-bug/commit/376276a22e16d156ae9879feb1f1fef83fc2e88a
const pack = tar.pack();
pack.on('data', (data) => console.log(data));
pack.entry({name: 'test.txt'}, 'test content');
pack.finalize();
During build I'm getting this error:
Module not found: Error: Can't resolve 'fs' in '/home/runner/work/tar-stream-bug/tar-stream-bug/node_modules/tar-stream'
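(For completeness: once the bundling error is resolved, the pack stream from the snippet above could be collected into a downloadable Blob. streamToBlob below is my own helper sketch, not part of the tar-stream API.)

```javascript
// Sketch: gather a Node-style readable stream ('data'/'end' events),
// such as the one returned by tar.pack(), into a Blob for download.
function streamToBlob(stream, type = 'application/x-tar') {
  return new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', (chunk) => chunks.push(chunk));
    stream.on('error', reject);
    stream.on('end', () => resolve(new Blob(chunks, { type })));
  });
}

// Usage with the example above:
// const blob = await streamToBlob(pack);
// const url = URL.createObjectURL(blob); // offer as a download link
```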
You should just be able to do something like File.stream().pipeThrough(new CompressionStream('gzip')) in browsers which support the Compression Streams API. To decompress .tar.gz archives in the browser I do this to extract the node executable from the Node.js nightly release and get rid of npm, node_modules, and everything else in the archive except node:
https://github.com/guest271314/download-node-nightly-executable/blob/157899ca904421e8a5eac1d5e677efca82cff095/index.html#L286-L313
let [node_nightly_build] = await (
  await fetch('https://nodejs.org/download/nightly/index.json')
).json();
let { version, files } = node_nightly_build;
let node_nightly_url = `https://nodejs.org/download/nightly/${version}/node-${version}-${osArch}.tar.gz`;
let url = `${cors_api_url}${node_nightly_url}`;
console.log(`Fetching ${node_nightly_url}`);
// Download the gzipped tar file and decompress the gzip layer
const request = (await fetch(url)).body.pipeThrough(
  new DecompressionStream('gzip')
);
// Get the decompressed tar bytes as an ArrayBuffer
const buffer = await new Response(request).arrayBuffer();
// Alternative: decompress gzip with pako, which returns a Uint8Array
// const decompressed = await pako.inflate(buffer);
// Untar; js-untar returns a list of files
// (See https://github.com/InvokIT/js-untar#file-object for details)
const untarFileStream = new UntarFileStream(buffer);
let file;
while (untarFileStream.hasNext()) {
  file = untarFileStream.next();
  if (/\/bin\/node$/.test(file.name)) {
    break;
  }
}
// Write the extracted executable with the File System Access API
const writable = await fileSystemHandle.createWritable();
const writer = writable.getWriter();
await writer.write(file.buffer);
await writer.close();
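As an aside, the gzip round trip can be sketched with nothing but the Compression Streams API (assuming a runtime that exposes CompressionStream and DecompressionStream as globals, as modern browsers and recent Node versions do):

```javascript
// Sketch: gzip and gunzip byte arrays using only the Compression
// Streams API -- no library required.
async function gzip(bytes) {
  const stream = new Blob([bytes]).stream()
    .pipeThrough(new CompressionStream('gzip'));
  return new Uint8Array(await new Response(stream).arrayBuffer());
}

async function gunzip(bytes) {
  const stream = new Blob([bytes]).stream()
    .pipeThrough(new DecompressionStream('gzip'));
  return new Uint8Array(await new Response(stream).arrayBuffer());
}
```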
@guest271314 10x for the response but unfortunately that doesn't help me at all
CompressionStream and DecompressionStream are for gzip, so they are a bit out of scope for this repository and my issue; we're only dealing with the tar side of things. That leaves your suggestion to use UntarFileStream, which is an alternative to this repository, but it only parses existing tar files and does not cover creating them. I need to create a tar file in the browser.
@mafintosh have you had a chance to look at my repo?
@mafintosh: Wouldn't this issue with b4a indicate that this won't work in the browser?
that does not solve creating tar files, only parsing existing files. I need to create a tar file in the browser
I think that is possible using the source code in this repository. If we can parse it, we can write it. We can substitute navigator.storage.getDirectory() or the File API for fs, the Streams API for streamx, and TextEncoder/TextDecoder for encoding. The rest is just writing the algorithm without reliance on Node.js.
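To illustrate that the write side is feasible with web APIs alone, here is a minimal single-file ustar writer sketch of my own, using only TextEncoder and Uint8Array. The field offsets follow the POSIX ustar header layout; long names, directories, and error handling are omitted:

```javascript
// Sketch of a single-file ustar archive writer with only web platform
// APIs -- no Node.js builtins. Offsets follow the POSIX ustar header.
const encoder = new TextEncoder();

function octal(value, length) {
  // Octal ASCII, zero-padded, NUL-terminated (length includes the NUL)
  return value.toString(8).padStart(length - 1, '0') + '\0';
}

function tarHeader(name, size) {
  const header = new Uint8Array(512);
  const put = (text, offset) => header.set(encoder.encode(text), offset);
  put(name, 0);                                        // file name
  put(octal(0o644, 8), 100);                           // mode
  put(octal(0, 8), 108);                               // uid
  put(octal(0, 8), 116);                               // gid
  put(octal(size, 12), 124);                           // size in bytes
  put(octal(Math.floor(Date.now() / 1000), 12), 136);  // mtime
  put('        ', 148);                                // checksum: 8 spaces for now
  put('0', 156);                                       // typeflag: regular file
  put('ustar\x0000', 257);                             // magic + version
  // Checksum: sum of all header bytes with the checksum field as spaces
  const sum = header.reduce((a, b) => a + b, 0);
  put(sum.toString(8).padStart(6, '0') + '\0 ', 148);
  return header;
}

function tarSingleFile(name, content) {
  const body = encoder.encode(content);
  const padded = 512 * Math.ceil(body.length / 512);
  // header + body padded to a 512-byte block + two zero blocks at the end
  const archive = new Uint8Array(512 + padded + 1024);
  archive.set(tarHeader(name, body.length), 0);
  archive.set(body, 512);
  return archive;
}
```

The result can be wrapped in a Blob for download, e.g. new Blob([tarSingleFile('test.txt', 'test content')], { type: 'application/x-tar' }).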
I first tried to bundle with esbuild (I do not have node or npm installed):
$ package/bin/esbuild tar-stream-3.0.0/index.js --bundle --minify --sourcemap
✘ [ERROR] Could not resolve "fs"
tar-stream-3.0.0/pack.js:1:30:
1 │ const { constants } = require('fs')
╵ ~~~~
The package "fs" wasn't found on the file system but is built into node. Are
you trying to bundle for node? You can use "--platform=node" to do that, which
will remove this error.
✘ [ERROR] Could not resolve "streamx"
tar-stream-3.0.0/pack.js:2:39:
2 │ const { Readable, Writable } = require('streamx')
╵ ~~~~~~~~~
You can mark the path "streamx" as external to exclude it from the bundle,
which will remove this error. You can also surround this "require" call with a
try/catch block to handle this failure at run-time instead of bundle-time.
✘ [ERROR] Could not resolve "string_decoder"
tar-stream-3.0.0/pack.js:3:34:
3 │ const { StringDecoder } = require('string_decoder')
╵ ~~~~~~~~~~~~~~~~
The package "string_decoder" wasn't found on the file system but is built into
node. Are you trying to bundle for node? You can use "--platform=node" to do
that, which will remove this error.
✘ [ERROR] Could not resolve "b4a"
tar-stream-3.0.0/pack.js:4:20:
4 │ const b4a = require('b4a')
╵ ~~~~~
You can mark the path "b4a" as external to exclude it from the bundle, which
will remove this error. You can also surround this "require" call with a
try/catch block to handle this failure at run-time instead of bundle-time.
✘ [ERROR] Could not resolve "bl"
tar-stream-3.0.0/extract.js:1:19:
1 │ const bl = require('bl')
╵ ~~~~
You can mark the path "bl" as external to exclude it from the bundle, which
will remove this error. You can also surround this "require" call with a
try/catch block to handle this failure at run-time instead of bundle-time.
✘ [ERROR] Could not resolve "streamx"
tar-stream-3.0.0/extract.js:2:42:
2 │ const { Writable, PassThrough } = require('streamx')
╵ ~~~~~~~~~
You can mark the path "streamx" as external to exclude it from the bundle,
which will remove this error. You can also surround this "require" call with a
try/catch block to handle this failure at run-time instead of bundle-time.
6 of 7 errors shown (disable the message limit with --log-limit=0)
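Following esbuild's own hints in the output above, one untested way forward is to mark the unresolved packages as external and supply browser replacements at run time (e.g. via an import map); this only gets past bundling, it does not itself provide the shims:

```shell
# Mark Node builtins and Node-only packages as external so the bundle builds;
# each external must then be resolvable in the browser at run time.
package/bin/esbuild tar-stream-3.0.0/index.js --bundle --minify --sourcemap \
  --external:fs --external:streamx --external:string_decoder \
  --external:b4a --external:bl
```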
Then I bundled online with bundlejs (https://bundlejs.com/?q=tar-stream) and got this error at run time:
:12 Uncaught Error: Dynamic require of "stream" is not supported
at VM399 output1.js:12:9
at http-url:https://unpkg.com/readable-stream@%5E4.2.0/lib/ours/index.js (VM399 output1.js:5722:18)
at __require2 (VM399 output1.js:15:50)
at http-url:https://unpkg.com/bl@%5E6.0.0/bl.js (VM399 output1.js:6141:24)
at __require2 (VM399 output1.js:15:50)
at http-url:https://unpkg.com/tar-stream@3.0.0/extract (VM399 output1.js:7600:14)
at __require2 (VM399 output1.js:15:50)
at http-url:https://unpkg.com/tar-stream@latest/index.js (VM399 output1.js:8029:23)
at __require2 (VM399 output1.js:15:50)
at VM399 output1.js:8036:35
(anonymous) @ V
Node.js-specific streams are used in this repository, so before diving into substituting the Streams API for Node.js-specific streams and the other modules, I looked around and found https://github.com/ankitrohatgi/tarballjs. It works as described: https://plnkr.co/edit/9hx7n4WJejF4y8dE?open=lib%2Fscript.js&preview.
To use the Streams API with the tarballjs example plnkr linked above, you can do something like this:
let tarWriter = new tarball.TarWriter();
tarWriter.addFolder('myfolder/');
tarWriter.addFile(
  'stream.txt',
  await new Response(
    new ReadableStream({
      async pull(controller) {
        for (const char of 'stream') {
          controller.enqueue(char);
          await new Promise((r) => setTimeout(r, 100));
        }
        controller.close();
      },
    }).pipeThrough(new TextEncoderStream())
  ).blob()
);
// ...
generateTar(0)
  .then((tarReader) => {
    const file = tarReader.getTextFile('stream.txt');
    console.log(file);
  })
  .catch(console.error);
👋 Just want to add my +1 for restoring browser support to this library, we are also running into https://github.com/holepunchto/b4a/issues/6
Fixed in the next release that will drop later today
Should be fixed in 3.1.0
I want to use this awesome package in the browser as well. It seems to me it once worked (looking at your tar-browser repository). Is there any way to bring it back?