Closed beautiful-boyyy closed 9 months ago
Hi, I don't have an M2 macOS machine right now. Could you please try adding the option --size-level=1
in https://github.com/web-devkits/Wasmnizer-ts/blob/2f07034c74a0df2d4b3e6c8f069a7dcc33039048/tests/benchmark/run_benchmark.js#L184
and try again?
e.g.
execSync(`${wamrc} --size-level=1 --enable-gc -o ${prefix}.aot ${prefix}.wasm > tmp.txt`);
I added it in run_benchmark.js, but I get the same error. Oddly, "size-level" doesn't appear in the log at all.
I modified run.sh instead; same error:
LLVM ERROR: Only small, tiny and large code models are allowed on AArch64
./run.sh: line 46: 62318 Abort trap: 6 $wamrc --size-level=1 --enable-gc -o $prefix.aot $prefix.wasm > tmp.txt
Read file to buffer failed: open file binarytrees_class.aot failed.
It's strange, the default code model is LLVMCodeModelSmall, which should be allowed. I noticed that --size-level=0 represents LLVMCodeModelLarge; could you try --size-level=0?
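To make the failure mode concrete: the LLVM error above says AArch64 only accepts the small, tiny and large code models, so any size level that maps to a different model aborts compilation. A sketch of the mapping this thread implies follows; only 0 → large and the small default are confirmed here, and the medium/kernel entries for levels 1 and 2 are an assumption about wamrc's internals:

```javascript
// Assumed size-level -> LLVM code model mapping. Only 0 -> large and the
// small default are confirmed in this thread; levels 1 and 2 are guesses.
function codeModelForSizeLevel(level) {
  switch (level) {
    case 0: return 'LLVMCodeModelLarge';
    case 1: return 'LLVMCodeModelMedium'; // assumption: rejected on AArch64
    case 2: return 'LLVMCodeModelKernel'; // assumption: rejected on AArch64
    default: return 'LLVMCodeModelSmall';
  }
}

// Models AArch64 actually accepts, per the LLVM error message above.
const AARCH64_ALLOWED = new Set([
  'LLVMCodeModelSmall',
  'LLVMCodeModelTiny',
  'LLVMCodeModelLarge',
]);
```

Under this mapping, --size-level=1 would select a model AArch64 rejects, which is consistent with the error persisting, while --size-level=0 selects the large model, which is allowed.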
Thanks, that solved it.
sys: macOS 14.2, arm64