emlautarom1 opened 7 months ago
Note that we tried to force the async code to block by wrapping all promises (e.g. `fetch`) in a busy wait loop:
```ts
function block<T>(p: Promise<T>): T {
  let value: T;
  p.then(v => value = v);
  while (value! === undefined) { /* Unlucky busy wait =( */ }
  return value;
}
```
Besides blocking the main thread (thus making the UI unresponsive), this approach produced no output from the compilation process, which makes us think we're doing something wrong.
How about fetching the cached values before running `compile()`, and then your `Cache` just provides those values synchronously?
> Note that we tried to force the async code to block by wrapping all promises (e.g. `fetch`) in a busy wait loop:
Busy waiting like that doesn't work, because your busy loop prevents the next microtask from starting, so the promise is never resolved. Promises in JS can't be made synchronous.
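The microtask starvation is easy to reproduce in isolation. This small sketch (illustrative, not from the thread; runnable under Node.js) shows that a `.then` callback never fires while a synchronous loop is spinning:

```typescript
// A resolved promise's .then callback is queued as a microtask, and
// microtasks only run once the current synchronous call stack unwinds.
let value: number | undefined;
Promise.resolve(42).then(v => { value = v; });

const start = Date.now();
while (value === undefined && Date.now() - start < 50) {
  // busy wait: the event loop never gets a chance to drain the microtask queue
}
console.log(value); // still undefined: the callback has not run yet
```

The `.then` callback only runs after the whole script finishes, which is exactly why no amount of looping can make the promise "resolve early".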
> We are not quite sure if this is even possible
It's possible but annoying 😅 Needs another Pickles refactor
> Busy waiting like that doesn't work, because your busy loop prevents the next microtask from starting, so the promise is never resolved.

I figured this would probably not work, but it was worth a try.
> How about fetching the cached values before running `compile()`, and then your `Cache` just provides those values synchronously?
We're exploring this approach, but the cached files total ~2.1 GB, so I don't think we can store them in a `Map` or similar, which means we need to leverage some kind of storage mechanism. Web Storage (`localStorage` and `sessionStorage`) only supports up to ~10 MB at best. This leaves us with IndexedDB, the Cache API, and the Origin Private File System (OPFS). The first two only provide `Promise`-based APIs, which we've already ruled out, and while OPFS does provide a "sync" API (see: https://developer.mozilla.org/en-US/docs/Web/API/File_System_API/Origin_private_file_system#manipulating_the_opfs_from_a_web_worker), it is intended to be used from a Web Worker, and workers themselves do not expose a "sync" API to the main thread (they rely on message passing).
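For context, the usual way to get sync-over-async across a worker boundary is `Atomics.wait` on a `SharedArrayBuffer`: unlike a busy loop, it suspends the thread without spinning, so the other side can make progress. Browsers only allow `Atomics.wait` inside workers (not on the main thread), so it doesn't lift the main-thread restriction described above, but it is the mechanism that sync OPFS access and similar APIs are built on. A minimal Node.js sketch of the pattern (the worker body here is purely illustrative):

```typescript
import { Worker } from "node:worker_threads";

// flag[0] is the "ready" signal, flag[1] carries the produced value
const shared = new SharedArrayBuffer(8);
const flag = new Int32Array(shared);

// Illustrative worker: produces a value asynchronously, then signals readiness
const worker = new Worker(
  `const { workerData } = require("node:worker_threads");
   const flag = new Int32Array(workerData);
   setTimeout(() => {
     flag[1] = 42;
     Atomics.store(flag, 0, 1);
     Atomics.notify(flag, 0);
   }, 10);`,
  { eval: true, workerData: shared }
);

// Blocks this thread until flag[0] changes from 0; no spinning involved.
// Note: browsers forbid Atomics.wait on the main thread, which is exactly
// the main-thread limitation discussed above.
Atomics.wait(flag, 0, 0);
console.log(flag[1]); // 42
void worker.terminate();
```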
> We're exploring this approach but the cached files are ~2.1 GB so I don't think we can store it in a `Map` or similar
As a first iteration, I'd just store them in memory
> As a first iteration, I'd just store them in memory
To my surprise Firefox does not complain. It seems like this approach works so we'll go with it for the time being.
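For reference, an in-memory variant along these lines can be sketched as follows. The `read`/`write` shape mirrors o1js's `Cache` interface, but the `CacheHeader` type here is reduced to just the fields used below, so treat it as a sketch rather than a drop-in implementation:

```typescript
// Minimal stand-in for the relevant fields of o1js's CacheHeader
// (assumption: the real type carries more metadata).
type CacheHeader = { persistentId: string; uniqueId: string };

class MemoryCache {
  // persistentId -> { uniqueId, data }; everything lives in RAM
  private entries = new Map<string, { uniqueId: string; data: Uint8Array }>();
  public readonly canWrite = true;

  read({ persistentId, uniqueId }: CacheHeader): Uint8Array | undefined {
    const entry = this.entries.get(persistentId);
    if (entry === undefined) return undefined;
    // a stale uniqueId means the artifact belongs to a different circuit version
    if (entry.uniqueId !== uniqueId) return undefined;
    return entry.data;
  }

  write({ persistentId, uniqueId }: CacheHeader, data: Uint8Array): void {
    this.entries.set(persistentId, { uniqueId, data });
  }
}
```

The trade-off is exactly the one raised above: everything (potentially gigabytes) stays resident in memory for the lifetime of the page.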
Thanks @dfstio, we ended up with something quite similar to what is shared in Discord except for the fact that we have 63 files in the cache folder instead of just 10.
The following discussion can also be of interest to you. You can first sign the tx on the web without compiling or proving and then do the compilation, proving, and sending the tx on your server subject to AuroWallet adding a new API method: https://discord.com/channels/484437221055922177/1228326948078489642/1228403957752397905
The changes required in Auro Wallet for it to work are much smaller than Pickles refactoring.
> Thanks @dfstio, we ended up with something quite similar to what is shared in Discord except for the fact that we have 63 files in the cache folder instead of just 10.
You don't need all the files on the web: some files are created very quickly on the fly, so you can download only a subset of the files generated during compilation. https://discord.com/channels/484437221055922177/1171938451193593856/1174766167982886952
> do the compilation, proving, and sending the tx on your server
For our use case we want to keep as many steps as possible in the browser.
Hi there, I made the straightforward implementation below for this issue in my use case: you can use `XMLHttpRequest` in synchronous mode (set the third parameter of `open` to `false`). You can adapt it to your use case; the logger is there because I used it in a worker environment.
```ts
import { CacheHeader } from 'o1js';
import { serializeError } from 'serialize-error';

function loadAssetAsUint8Array(url: string, log: (message: string) => void) {
  try {
    const request = new XMLHttpRequest();
    request.open("GET", url, false); // false makes the request synchronous
    // In a worker, a synchronous XHR may use responseType "arraybuffer",
    // so the response arrives as binary data rather than text
    request.responseType = "arraybuffer";
    request.send();
    if (request.status === 200) {
      return new Uint8Array(request.response);
    } else {
      return undefined;
    }
  } catch (er) {
    log('Error loading asset: ' + JSON.stringify(serializeError(er)));
    return undefined;
  }
}

function loadTextFile(url: string, log: (message: string) => void) {
  try {
    const request = new XMLHttpRequest();
    request.open("GET", url, false); // false makes the request synchronous
    request.overrideMimeType("text/plain"); // ensures the response is treated as plain text
    request.send();
    if (request.status === 200) {
      return request.responseText;
    } else {
      return undefined;
    }
  } catch (er) {
    log('Error loading asset: ' + JSON.stringify(serializeError(er)));
    return undefined;
  }
}

export class FileCache {
  private baseUrl: string;
  private logger: (message: string) => void;
  public readonly canWrite: boolean = false;

  constructor(logger: (message: string) => void, baseUrl: string = '/caches/') {
    if (!baseUrl.endsWith('/')) {
      baseUrl += '/';
    }
    this.baseUrl = baseUrl;
    this.logger = logger;
  }

  public read({ persistentId, uniqueId, dataType }: CacheHeader) {
    const currentId = loadTextFile(`${this.baseUrl}${persistentId}.header`, this.logger);
    this.logger(`cache read: ${persistentId} ${uniqueId} ${dataType} read currentId ${currentId}`);
    if (!currentId) return;
    if (currentId !== uniqueId) return;
    this.logger('uniqueId matched');
    if (dataType === 'string') {
      const string = loadTextFile(`${this.baseUrl}${persistentId}`, this.logger);
      if (!string) return;
      this.logger('cache read string');
      return new TextEncoder().encode(string);
    } else {
      const buffer = loadAssetAsUint8Array(`${this.baseUrl}${persistentId}`, this.logger);
      if (!buffer) {
        this.logger('failed to download buffer');
        return;
      }
      this.logger('cache read buffer');
      return buffer;
    }
  }

  public write() {
    this.logger('cache write not implemented');
  }
}
```
The compilation step is usually very taxing on machine resources, so we would like to avoid it when possible. Currently, a `Cache` interface is defined, and an implementation based on the filesystem is provided when running under Node.js:

https://github.com/o1-labs/o1js/blob/fd7bd4b02f4f7cc1057cd423261d70c22275763a/src/lib/proof-system/cache.ts#L31-L59
Currently, there is no `Cache` implementation that works in the browser (the environment where we expect to have end users), so we defined our own. This implementation relies on a server where compilation of the proofs was already performed and stored in the filesystem. On the browser we make a network request for the compiled artifacts on `read` and reject all `write` operations (we don't want to alter the server filesystem).

Security aside (can we actually trust the server's cached files?), this implementation does not compile due to `Cache` being synchronous, making it impossible to use `async`/`await`:

https://github.com/o1-labs/o1js/blob/fd7bd4b02f4f7cc1057cd423261d70c22275763a/src/lib/proof-system/cache.ts#L37
Note how the current filesystem implementation uses sync operations:
https://github.com/o1-labs/o1js/blob/fd7bd4b02f4f7cc1057cd423261d70c22275763a/src/lib/proof-system/cache.ts#L176
We would like for the `Cache` interface to be async by default. We are not quite sure if this is even possible, considering that this cache is eventually used in a synchronous context:

https://github.com/o1-labs/o1js-bindings/blob/177fb399d85ef4fab10d1ff26670da5a7de59450/crypto/bindings/srs.ts#L100
If this is not possible, we would appreciate any hint on how to use `async` code inside a `Cache` implementation.
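One pattern that sidesteps the problem without changing the `Cache` interface is the one suggested in the comments above: do all of the async work (fetching) up front, then hand `compile()` a cache that only reads from memory. A sketch, with a hypothetical `fetchArtifact` standing in for the real network layer:

```typescript
type CacheHeader = { persistentId: string; uniqueId: string };

// Hypothetical async fetcher; in practice this would be fetch() against the
// server that holds the precompiled artifacts.
type Fetcher = (persistentId: string) => Promise<Uint8Array>;

async function prefetch(ids: CacheHeader[], fetchArtifact: Fetcher) {
  const entries = new Map<string, { uniqueId: string; data: Uint8Array }>();
  await Promise.all(
    ids.map(async h => {
      entries.set(h.persistentId, { uniqueId: h.uniqueId, data: await fetchArtifact(h.persistentId) });
    })
  );
  // The returned object is fully synchronous, so it fits the current Cache shape.
  return {
    canWrite: false,
    read({ persistentId, uniqueId }: CacheHeader): Uint8Array | undefined {
      const e = entries.get(persistentId);
      return e !== undefined && e.uniqueId === uniqueId ? e.data : undefined;
    },
    write(): void {
      // read-only: we never alter the server-provided artifacts
    },
  };
}
```

The async boundary moves entirely into `prefetch`, which runs before compilation starts; the cache consumed inside the synchronous context never awaits anything.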