KaKi87 opened this issue 5 months ago
Hey @KaKi87, thanks for the feature requests!
> Tauri support

After a quick search, I'm unsure whether Tauri is comparable to the other JavaScript runtimes the library supports (e.g. Node.js, Deno, Bun). However, the linked `fs` modules do appear in line with the goal of normalizing these kinds of APIs. I'll need more time to explore Tauri, so I suggest holding off on this feature until we determine whether the addition is in line with the Project Goals, or whether those goals should be refined.
> - A `readLines` method for reading a file line by line
> - An `exec` method for running a command and returning an output
> - A `spawn` method for running a command and returning a stream

These all sound like great additions to the library!
> - A `writeBinaryFile` method for writing binary files
> - A `readBinaryFile` method for reading binary files
>
> [...] Additionally, it would be nice to have `writeTextFile` & `readTextFile` aliases for `writeFile` & `readFile` for consistency purposes;
I'm not opposed to this; however, another goal of `cross-runtime` is to reduce the number of ways to perform the same operation. I would prefer standardizing on the `encoding` option for `readFile` and `writeFile` to define the return value type. Thoughts?
> use `Bun.file().text()` in `readFile` instead of relying on Bun's Node polyfill.

Agreed!
> If you'd like, I'm willing to make a PR for everything listed here, with JS and JSDoc, but not TS though.

Great to hear! Contributions are very welcome. Feel free to start with `readLines`, `exec`, and `spawn` since we have alignment there. Happy to help with the API designs or discuss any of the feedback further.
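As a rough illustration of the `readLines` idea on Node.js alone, an async-iterator-based sketch might look like the following. The function name and API shape are illustrative assumptions, not the library's final design:

```javascript
import { createInterface } from 'node:readline';
import { createReadStream } from 'node:fs';

// Hypothetical sketch: yield a file's lines one by one.
// readline's Interface is async iterable, so we can delegate to it.
export async function* readLines(path) {
  const rl = createInterface({
    input: createReadStream(path),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });
  yield* rl;
}
```

A cross-runtime version would dispatch to per-runtime implementations behind this same signature.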
> After a quick search, I'm unsure if Tauri is comparable to the other supported JavaScript runtimes in the library (e.g. Node.js, Deno, Bun).

It's not a runtime indeed, but it does match this library's use cases: manipulating files (and soon, processes) through a unified, low-overhead API.
> I would prefer standardizing on using the `encoding` option for `readFile` and `writeFile` for defining the return value type. Thoughts?

Agreed, but then those methods would no longer have `utf8` as the default value, making this change a breaking one. I'm okay with it if you are.
> > I would prefer standardizing on using the `encoding` option for `readFile` and `writeFile` for defining the return value type. Thoughts?
>
> Agreed, but then those methods would no longer have `utf8` as the default value, making this change a breaking one. I'm okay with it if you are.

Yes, I'm okay with the breaking change!
> I would prefer standardizing on using the `encoding` option

I was thinking on this more and wondering if the `encoding` convention can be challenged. My thought process was that if omitting `encoding` or setting it to `undefined` returns a `Uint8Array`, I'd also like an explicit option for defining that return type (similar to Node.js `readFile`'s `encoding: null` option returning a `Buffer`). That said, I think `null` is too abstract, and I would prefer an explicit string identifier (unrelated but similar to Node.js `child_process`'s `encoding: 'buffer'` option). However, in our context, using e.g. `encoding: 'uint8array'` would be conflating the option, since we're basically bypassing decoding and returning the raw binary data.

I'm curious what you think about deprecating `encoding` and introducing an `as` property for defining both the return type and optional encode/decode behavior. For example:
```js
const foo = await readFile('foo.bin')
//    ^? Uint8Array
const bar = await readFile('bar.bin', { as: 'uint8array' })
//    ^? Uint8Array
const baz = await readFile('baz.txt', { as: 'utf8' })
//    ^? string

await writeFile('foo.bin', new TextEncoder().encode('data'))
await writeFile('bar.bin', new TextEncoder().encode('data'), { as: 'uint8array' })
await writeFile('baz.txt', 'data', { as: 'utf8' })
```
I'd rather keep `encoding` for `writeFile` and pass it as-is to the runtimes, letting them handle it (less effort and bug risk for us), and only use `as` for `readFile` to convert outputs to the user's choice.

What do you think?
> I'd rather keep `encoding` for `writeFile` and pass it as-is to the runtimes, letting them handle it (less effort and bug risk for us), and only use `as` for `readFile` to convert outputs to the user's choice.
Okay, I agree. Using `as` in `writeFile` was a bit of a stretch for alignment and consistency with `readFile`. That said, I'd rather not diverge and have two different properties for similar operations. I suggest we keep `encoding` for both APIs, where `readFile` omits it or sets `undefined` to return a `Uint8Array`. For example:
```js
const foo = await readFile('foo.bin')
//    ^? Uint8Array
const bar = await readFile('bar.bin', { encoding: undefined })
//    ^? Uint8Array
const baz = await readFile('baz.txt', { encoding: 'utf-8' })
//    ^? string

await writeFile('foo.bin', new TextEncoder().encode('data'))
await writeFile('bar.bin', new TextEncoder().encode('data'), { encoding: undefined })
// Note: Node.js silently ignores this unused `encoding` option, whereas I lean towards throwing
await writeFile('expect-error.bin', new TextEncoder().encode('data'), { encoding: 'utf-8' })
await writeFile('baz.txt', 'data', { encoding: 'utf-8' })
```
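For illustration, the proposed `readFile` semantics could be sketched for Node.js alone. This is a hypothetical sketch of the behavior described above, not the library's implementation:

```javascript
import * as fs from 'node:fs/promises';

// Hypothetical sketch of the proposed semantics on Node.js:
// no `encoding` (or `undefined`) → raw bytes; an encoding string → decoded text.
async function readFile(path, options) {
  const data = await fs.readFile(path); // always read raw bytes (a Buffer)
  return options?.encoding
    ? new TextDecoder(options.encoding).decode(data)
    : new Uint8Array(data); // normalize Buffer → plain Uint8Array
}
```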
> I suggest we keep `encoding` for both APIs

Agreed.

> where `readFile` omits or sets `undefined` to return a `Uint8Array`.

Why not `ArrayBuffer` though? Isn't that the most used type for file manipulation?
> Note: Node.js silently ignores this unused `encoding` option, whereas I lean towards throwing

Okay.
> > where `readFile` omits or sets `undefined` to return a `Uint8Array`.
>
> Why not `ArrayBuffer` though? Isn't that the most used type for file manipulation?
Good callout. I originally went with `Uint8Array` to avoid diverging too much from Node.js and Deno's `readFile` (note: while Node.js returns a `Buffer`, it is a subclass of `Uint8Array`). That said, I'm happy to move forward with `ArrayBuffer` for its broader utility.
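For context, a quick Node.js demonstration of the `Buffer`/`Uint8Array` relationship mentioned above, and of why producing a standalone `ArrayBuffer` involves a copy (a `Buffer` may be a view into a larger shared allocation):

```javascript
// Buffer is a Uint8Array subclass in Node.js.
const buf = Buffer.from('data');
console.log(buf instanceof Uint8Array); // true

// A standalone ArrayBuffer requires slicing out exactly this view's bytes,
// since buf.buffer may be a shared pool larger than the Buffer itself.
const ab = buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);
console.log(ab.byteLength); // 4
```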
Alright, did you make a decision on Tauri?
Yes, let's proceed with adding support for Tauri.
Alright, here's one last thing: I noticed that you statically import wrappers inside which you dynamically import runtime dependencies:

```
·
└── import { readFile, writeFile } from './index.js'
    ├── export { readFile } from './readFile.js'
    │   ├── { readFile } = import('node:fs/promises')
    │   │   └── readFile()
    │   └── Deno.readTextFile()
    └── export { writeFile } from './writeFile.js'
        ├── { writeFile } = import('node:fs/promises')
        │   └── writeFile()
        └── Deno.writeTextFile()
```
Although dynamic imports are unavoidable for being cross-runtime, I would like to suggest the opposite: dynamically import wrappers inside which you statically import runtime dependencies:

```
·
└── import { readFile, writeFile } from './index.js'
    ├── { readFile, writeFile } = import('./node.js')
    │   └── import { readFile, writeFile } from 'node:fs/promises'
    │       ├── readFile()
    │       └── writeFile()
    └── { readFile, writeFile } = import('./deno.js')
        ├── Deno.readTextFile()
        └── Deno.writeTextFile()
```
This opposite approach would avoid requiring dynamic imports when users know which runtime they're using, letting them import it directly:

```js
import { readFile, writeFile } from './node.js'
```

What do you think?
Great suggestion! I had a similar follow-up task in mind to separate integrations into individual packages under the `@cross-runtime/*` scope, e.g.
```ts
/**
 * @cross-runtime/types
 */
export type ReadFile = (
  path: string | URL,
  options: ReadFileOptions
) => Promise<ReadFileResult>

export type WriteFile = (
  path: string | URL,
  data: WriteFileData,
  options: WriteFileOptions
) => Promise<void>

/**
 * @cross-runtime/node
 */
import type { ReadFile, WriteFile } from '@cross-runtime/types'
import * as fs from 'node:fs/promises'

export const readFile: ReadFile = (path, options) => fs.readFile(...)
export const writeFile: WriteFile = (path, data, options) => fs.writeFile(...)

/**
 * @cross-runtime/deno
 */
import type { ReadFile, WriteFile } from '@cross-runtime/types'

export const readFile: ReadFile = (path, options) => options?.encoding
  ? Deno.readTextFile(...)
  : Deno.readFile(...)
export const writeFile: WriteFile = (path, data, options) => typeof data === 'string'
  ? Deno.writeTextFile(...)
  : Deno.writeFile(...)

/**
 * @cross-runtime/bun
 */
import type { ReadFile, WriteFile } from '@cross-runtime/types'
...

/**
 * cross-runtime
 */
export const { readFile, writeFile } = await (() => {
  if (isNode()) return import('@cross-runtime/node')
  if (isDeno()) return import('@cross-runtime/deno')
  if (isBun()) return import('@cross-runtime/bun')
  // ...
  throw new Error('Unsupported runtime')
})()
```

```js
import { readFile, writeFile } from 'cross-runtime'
// or
import { readFile, writeFile } from '@cross-runtime/node'
```
That said, feel free to implement your suggestions as-is and I can follow up with the restructure later. Otherwise, I'm happy to scaffold these scoped packages now and we can continue with these features after.
I'd prefer not to do that though, because it complicates development unnecessarily, against the KISS principle.

To be honest, I already find it a little frustrating that your project is part of a monorepo containing all your other projects, which I have no interest in yet have to keep on my computer.
Hello,

It would be nice to add the following features to `cross-runtime`:

- Tauri support:
  - `isTauri` using `window.__TAURI__`;
  - `writeFile` using `writeTextFile` from `@tauri-apps/api/fs`;
  - `readFile` using `readTextFile` from `@tauri-apps/api/fs`.
- A `readLines` method for reading a file line by line:
  - `createInterface` from `node:readline` & `createReadStream` from `node:fs`.
- An `exec` method for running a command and returning an output:
  - `Deno.Command().output()`;
  - `exec` from `node:child_process`;
  - `` $`…`.quiet() `` from `bun`;
  - `new Command().execute()` from `@tauri-apps/api/shell`.
- A `spawn` method for running a command and returning a stream:
  - `Deno.Command().spawn()`;
  - `spawn` from `node:child_process`;
  - `Bun.spawn()`;
  - `new Command().spawn()` from `@tauri-apps/api/shell`.
- A `writeBinaryFile` method for writing binary files:
  - `Deno.writeFile()` taking a `Uint8Array`;
  - `encoding` = `null` taking a `Buffer` or any `TypedArray`;
  - `Bun.write()` taking an `ArrayBuffer`, a `Blob` or any `TypedArray`;
  - `writeBinaryFile` from `@tauri-apps/api/fs` taking an `ArrayBuffer`.
- A `readBinaryFile` method for reading binary files:
  - `Deno.readFile()` returning a `Uint8Array`;
  - `encoding` = `null` returning a `Buffer`;
  - `Bun.file().arrayBuffer()` returning an `ArrayBuffer`;
  - `readBinaryFile` from `@tauri-apps/api/fs` returning a `Uint8Array`;
  - `ArrayBuffer` by default.

Additionally, it would be nice to:

- have `writeTextFile` & `readTextFile` aliases for `writeFile` & `readFile` for consistency purposes;
- use `Bun.file().text()` in `readFile` instead of relying on Bun's Node polyfill.

If you'd like, I'm willing to make a PR for everything listed here, with JS and JSDoc, but not TS though.

Thanks
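As a rough sketch of the proposed `exec` on Node.js alone (the function name and return shape are assumptions here, not the library's final API):

```javascript
import { exec as nodeExec } from 'node:child_process';
import { promisify } from 'node:util';

// Hypothetical sketch: run a command in a shell and resolve with its output,
// mirroring what Deno.Command().output() or Bun's $`…`.quiet() provide.
const execAsync = promisify(nodeExec);

export async function exec(command) {
  const { stdout, stderr } = await execAsync(command);
  return { stdout, stderr };
}
```

A cross-runtime version would dispatch per runtime to the primitives listed above behind this signature.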