jerryscript-project / jerryscript

Ultra-lightweight JavaScript engine for the Internet of Things.
https://jerryscript.net
Apache License 2.0
6.92k stars 669 forks

Is it possible to store the JS program in bytecode as a const in ROM? #1884

Closed cnnblike closed 7 years ago

cnnblike commented 7 years ago

Hi, I'm working on a project which needs a JavaScript engine as a replacement for a configuration file. Since the SRAM on my board is really limited, while there is still some ROM available, I am wondering if I could port only the VM part to my firmware and cut out the parser part. This way, I could cut down the ROM space of the parser part and save the SRAM that was used to store the byte-code.

So, it's basically two questions:

  1. Is it possible to port VM only?
  2. Is it possible to store bytecode as a const?
zherczeg commented 7 years ago

JerryScript is a modular JS engine. Almost any feature can be disabled.

To disable the parser, set: #define JERRY_JS_PARSER 0 or pass --js-parser off to tools/build.py

You can execute byte-code from memory; just align the buffer to 4 bytes. The byte-code can also be run from ROM without loading it into RAM.

cnnblike commented 7 years ago

Thank you. So the byte-code is platform-independent, right? Meaning I could generate it on an x86 platform and use it on ARMv7?

zherczeg commented 7 years ago

As long as the byte order of the two machines is the same (little-endian in this case, so it works) and you use the same compressed pointer size (e.g. 16 bit).

cnnblike commented 7 years ago

@zherczeg As for the ROM usage problem, I just did a simple test and got some interesting results.

Before adding JerryScript: text: 23980 data: 244 bss: 5332
After adding JerryScript (VM module only): text: 150904 data: 348 bss: 28128
After adding JerryScript (VM and parser): text: 176064 data: 348 bss: 28128

Here is my question: so the parser actually doesn't take much ROM, right? Or did I do something wrong? How could I decrease ROM usage further? It's just the opposite of what I expected - I thought the parser would cost much more than the VM does.

This is how I build my VM-only version of libjerryscript:

python tools/build.py --toolchain=cmake/toolchain_mcu_stm32f1.cmake --jerry-cmdline OFF --jerry-libc OFF --jerry-libm OFF --js-parser OFF --mem-heap=20 --jerry-port-default OFF --clean

parser+VM:

python tools/build.py --toolchain=cmake/toolchain_mcu_stm32f1.cmake --jerry-cmdline OFF --jerry-libc OFF --jerry-libm OFF --js-parser ON --mem-heap=20 --jerry-port-default OFF --clean
zherczeg commented 7 years ago

What is your platform? On ARM Thumb-2 the whole engine is around 140K and, as far as I remember, can be reduced to around 80K (please use -Os optimization). You can disable a lot of features; please check jerry-core/profiles/minimal.profile.

cnnblike commented 7 years ago

@zherczeg Thank you a lot for replying! I'm now building for an STM32F103VCT6, a Cortex-M3 with 48 kB SRAM and 256 kB flash. After some troubleshooting, the problem might lie in jerry-ext; I forgot to disable that part.

If I disable js-parser, jerry-ext, jerry-libm, jerry-libc, and jerry-port-default, use the minimal profile, and build with -Os, the results are interesting: the minimal profile really helped a lot, while dropping js-parser didn't. With the minimal profile, the sizes are as follows:

bare HAL: 24k
bare HAL + jerry-vm: 67k
bare HAL + jerry-vm + js-parser: 91k

This is the toolchain file I used for building:

include(CMakeForceCompiler)
set(CMAKE_SYSTEM_NAME MCU)
set(CMAKE_SYSTEM_PROCESSOR armv7l)
set(CMAKE_SYSTEM_VERSION STM32F1)
set(FLAGS_COMMON_ARCH -mlittle-endian -mthumb -mcpu=cortex-m3 -march=armv7-m)
CMAKE_FORCE_C_COMPILER(arm-none-eabi-gcc GNU)
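Following the -Os suggestion above, the size optimization could be requested from the toolchain file as well. This is an illustrative addition, not part of the original thread; it assumes a GCC-style compiler and that the build consumes CMAKE_C_FLAGS:

```cmake
# Illustrative (assumption): optimize for size, as recommended above.
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -Os")
```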

-- Update: the key, just like you said, is to use the minimal profile instead of the es5.1 profile. With jerry-vm + es5.1 profile it's 150k; with jerry-vm + js-parser + es5.1 profile it's 170k.