tyjvazum / arb

◉ Arbitrary Protocols on top of Bitcoin (Bitcoin NFTs/Ordinals, BRC-20, & Bitcoin Identities/Usernames)
GNU General Public License v3.0

Feature: ML-based PNG Compression #15

Open redactedJare opened 1 year ago

redactedJare commented 1 year ago

https://github.com/casey/ord/pull/2055

https://github.com/casey/ord/pull/2103

Notably, other compression methodologies should be used for other media types; brotli, for example, is great at 3D models and text.
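A minimal sketch of that per-media-type dispatch, with hypothetical codec names (not ord's actual API):

```rust
// Illustrative only: map a media type to a codec choice.
// "ml-png" stands in for the ML-based PNG codec proposed in this issue.
fn codec_for(media_type: &str) -> &'static str {
    match media_type {
        "image/png" => "ml-png",
        "model/gltf-binary" | "text/plain" | "text/html" => "brotli",
        _ => "none",
    }
}

fn main() {
    assert_eq!(codec_for("text/plain"), "brotli");
    println!("{}", codec_for("image/png"));
}
```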

Also, the commits don't need to know about their pending reveals beforehand, leading me to think of two products off the hop:

candy machine (entirely in browser)

https://github.com/staccdotsol/ord/tree/features/cmv2

dead man's switch

Think of a merkle tree where an ordinal is a secret, not committed or signed until a given timelock interval goes unpaid. The creator provides the data, et voilà.
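The dead man's switch idea above can be sketched as follows: the leaves are hashes of the secret payloads, only the root is published up front, and the secrets are revealed only if the timelock lapses unpaid. This uses std's `DefaultHasher` for brevity; a real design would need a cryptographic hash like SHA-256.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Toy hash; NOT cryptographically secure, for illustration only.
fn h(data: &[u8]) -> u64 {
    let mut s = DefaultHasher::new();
    data.hash(&mut s);
    s.finish()
}

// Fold a level of leaf hashes up to a single merkle root,
// duplicating the last node when a level has odd length.
fn merkle_root(mut level: Vec<u64>) -> u64 {
    while level.len() > 1 {
        level = level
            .chunks(2)
            .map(|pair| {
                let right = *pair.get(1).unwrap_or(&pair[0]);
                h(&[pair[0].to_le_bytes(), right.to_le_bytes()].concat())
            })
            .collect();
    }
    level[0]
}

fn main() {
    // Hypothetical secret payloads; in the real scheme these would be
    // the undisclosed inscription contents.
    let secrets: Vec<&[u8]> = vec![b"ordinal-1", b"ordinal-2", b"ordinal-3"];
    let leaves: Vec<u64> = secrets.iter().map(|s| h(s)).collect();
    let root = merkle_root(leaves);
    // The creator publishes only `root` now; reveals happen later.
    println!("merkle root: {root:x}");
}
```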

tyjvazum commented 1 year ago

@redactedJare, thank you for opening the issue. I'm sorry to hear about your hospital situation. I hope everything resolves smoothly.

This all sounds interesting. I'll invest some time learning about it. In particular, the ML-based PNG compression sounds awesome and I'm glad you were able to bring it to my attention. I'm wondering how much compute it takes. Ideally it'd be possible without a GPU, even if it's slow in that mode.

redactedJare commented 1 year ago

All good, remedying slowly and surely.

There's better news: this is Rust, so we can multithread out to many (slower) processes and have them run the magic. Whenever I have a machine for a little while I'll try to bench it for you. If the goal is to respect the 400k max, then the cost should be almost negligible.
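A minimal sketch of that fan-out, assuming the input fits the 400k budget mentioned above. The `compress` function here is a trivial run-collapsing placeholder, not the ML codec:

```rust
use std::thread;

// Placeholder codec: drop consecutive duplicate bytes.
// Stands in for the real (ML-based) compressor.
fn compress(chunk: &[u8]) -> Vec<u8> {
    let mut out = Vec::new();
    for &b in chunk {
        if out.last() != Some(&b) {
            out.push(b);
        }
    }
    out
}

// Split the data into roughly equal chunks and compress each
// on its own spawned thread.
fn parallel_compress(data: &[u8], workers: usize) -> Vec<Vec<u8>> {
    let chunk_len = (data.len() + workers - 1) / workers;
    let handles: Vec<_> = data
        .chunks(chunk_len)
        .map(|chunk| {
            let chunk = chunk.to_vec();
            thread::spawn(move || compress(&chunk))
        })
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    let data = vec![7u8; 400_000]; // the 400k budget discussed above
    let parts = parallel_compress(&data, 4);
    let total: usize = parts.iter().map(|p| p.len()).sum();
    println!("compressed to {total} bytes across {} chunks", parts.len());
}
```

A real version would reassemble the chunks with framing so the decompressor knows the boundaries; this only shows the threading shape.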

... I know someone who at some point started wasm work toward multi-machine crunching along these lines.

Mind, it's worth noting that Solana devnet is the ultimate size compressor for any file of any kind. If you have an engineer who can extract the TensorFlow compress/decompress functions, implement slices and some other features in seahorse-lang.org, and then replicate this https://twitter.com/aeyakovenko/status/1655624489332531200, except that it constantly retrains itself on any new on-chain data types, et voilà.

tyjvazum commented 1 year ago

I think a minimal set up that runs on, or at least decompresses on, typical hardware and is packaged in some way to be distributed alongside the Rust binary should be the initial goal. If that can be accomplished then additional work could be integrated later. Just the compression performance for PNG sounds super useful if it can be designed to work well with the existing project structure.

redactedJare commented 1 year ago

I can tell you that in the current implementation:

  1. the compression happens in a spawned thread,
  2. the initial decompression stores <inscription id>.png to /tmp/,
  3. then whatever garbage collection can happen
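The three steps above can be sketched as follows. The codec functions are identity stand-ins and `run_pipeline` is a hypothetical name, not the actual implementation:

```rust
use std::fs;
use std::thread;

// Identity stand-ins for the real compress/decompress pair.
fn compress(data: Vec<u8>) -> Vec<u8> { data }
fn decompress(data: Vec<u8>) -> Vec<u8> { data }

// Runs the three steps for one inscription; returns true if the
// temp file was written before being garbage-collected.
fn run_pipeline(inscription_id: &str, png_bytes: Vec<u8>) -> std::io::Result<bool> {
    // 1. compression happens in a spawned thread
    let compressed = thread::spawn(move || compress(png_bytes))
        .join()
        .expect("compression thread panicked");

    // 2. decompression writes <inscription id>.png to the temp dir
    let path = std::env::temp_dir().join(format!("{inscription_id}.png"));
    fs::write(&path, decompress(compressed))?;
    let existed = path.exists();

    // 3. garbage collection: remove the temp file when done
    fs::remove_file(&path)?;
    Ok(existed)
}

fn main() {
    let png_magic = vec![0x89, 0x50, 0x4E, 0x47]; // stand-in payload
    assert!(run_pipeline("example0000", png_magic).unwrap());
    println!("pipeline ok");
}
```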

I have no computer for a few more days.