chadbrewbaker opened 11 months ago
Perfect ticket for @gdb. Looks like it is based on TinyEMU.
To get this project FedRAMP certified, it might make sense to work with https://github.com/amazonlinux/amazon-linux-2023/issues/102 and offer a FedRAMP-approved Amazon Linux 2023 VM image as an option for those in regulated environments.
The @copy project seems far better maintained than jslinux. Just need a stripped-down Ubuntu 20.04 x86 image to be binary compatible with OpenAI's sandbox. https://github.com/copy/v86
Digging into the guts of jslinux, it seems to open a websocket and use https://relay.widgetry.org to proxy connections to get curl etc. working? It would probably be nice to have a similar simple proxy you can host at a static URI - @timclicks video?
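A minimal sketch of what such a self-hosted proxy's core could look like, assuming the same byte-pumping design (the real relay.widgetry.org service speaks WebSocket; the WebSocket framing layer is omitted here, so this is just the bidirectional TCP relay you would put behind that static URI, not jslinux's actual implementation):

```python
import asyncio


async def pump(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    """Copy bytes from reader to writer until EOF, then close the writer."""
    try:
        while data := await reader.read(4096):
            writer.write(data)
            await writer.drain()
    finally:
        writer.close()


async def relay(client_reader: asyncio.StreamReader,
                client_writer: asyncio.StreamWriter,
                host: str, port: int) -> None:
    """Open an upstream connection and pump bytes both ways concurrently."""
    upstream_reader, upstream_writer = await asyncio.open_connection(host, port)
    await asyncio.gather(
        pump(client_reader, upstream_writer),   # client -> upstream
        pump(upstream_reader, client_writer),   # upstream -> client
    )
```

In a real deployment the guest's network traffic would arrive over the websocket and `host`/`port` would come from the proxied request, with the whitelist checks applied before `open_connection`.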
Will try to get something up.
https://bellard.org/jslinux/ - yes, curl works. Need to re-package so a WASM GPU LLM can talk to jslinux entirely browser-side.
100% should run in the browser.
If you want Code Interpreter parity it should load Ubuntu 20.04 amd64; each LLM session times out after 60 seconds and is run in a new IPython notebook. This is for security against LLM attacks on the sandbox. Lock it down to whitelisted URIs - for now I would whitelist HTTP/HTTPS GET *
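The outbound-request policy above ("HTTP/HTTPS GET *") could be sketched as a simple gate the proxy applies before forwarding anything; the function name and shape here are illustrative, not from any existing sandbox codebase:

```python
from urllib.parse import urlparse

# Policy from the comment above: allow any host, but only GET over http/https.
ALLOWED_SCHEMES = {"http", "https"}
ALLOWED_METHODS = {"GET"}


def request_allowed(method: str, uri: str) -> bool:
    """Return True iff the sandboxed session may issue this request."""
    if method.upper() not in ALLOWED_METHODS:
        return False
    parsed = urlparse(uri)
    # Require a real network location so schemes like file:// are rejected.
    return parsed.scheme.lower() in ALLOWED_SCHEMES and bool(parsed.netloc)
```

Tightening the policy later (per-host whitelist instead of `*`) would just mean adding a host check against `parsed.netloc`.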