asu-kim / nntrainer

NNTrainer is a software framework for training neural network models on devices.
Apache License 2.0

Run LLAMA using NNTrainer #1

Open hokeun opened 6 months ago

hokeun commented 5 months ago

Installing ko_KR.EUC-KR

To install the ko_KR locale on Ubuntu, you can try these steps:
sudo apt-get install language-pack-ko
sudo apt-get install language-pack-ko-base
sudo apt-get install localepurge
Then check that ko_KR.EUC-KR is available.
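
For example, a minimal way to verify this (a sketch; on Ubuntu the entry may be listed as ko_KR.euckr):

# list generated locales and look for the Korean EUC-KR entry
locale -a | grep -i ko_KR
# if it is missing, generate it explicitly
sudo locale-gen ko_KR.EUC-KR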
hokeun commented 5 months ago

https://wiki.archlinux.org/title/Localization/Korean

Enabling ko_KR.UTF-8

hokeun commented 5 months ago

Setting up ko_KR.UTF-8 as default

sudo update-locale LANG=ko_KR.UTF-8 LC_ALL=ko_KR.UTF-8

https://askubuntu.com/questions/868483/korean-locale-doesnt-work-on-ubuntu-headless-16-10
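
To confirm the default after logging back in, something like the following should show ko_KR.UTF-8 (a sketch; exact output varies by system):

# active locale settings; LANG and LC_ALL should read ko_KR.UTF-8
locale
# update-locale writes the system-wide defaults here
cat /etc/default/locale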

Deeksha-20-99 commented 5 months ago
  1. File changes made before running the LLaMA model:

    • file path: nntrainer/Applications/LLaMA/PyTorch. Run the llama_weights_converter.py file to generate the ./llama_fp16.bin weight file from the Hugging Face LLaMA model, and save the file in the nntrainer/jni directory.
    • file path: nntrainer/Applications/LLaMA/jni/main.cpp. Add #define ENABLE_ENCODER2 at the beginning of the file.
    • file path: nntrainer/meson.build. Add message('platform: @0@'.format(get_option('platform'))) at line 28 and message('enable-fp16: @0@'.format(get_option('enable-fp16'))) at line 68.
    • file path: nntrainer/meson_options.txt. Enable the fp16 option at line 39: option('enable-fp16', type: 'boolean', value: true)
  2. Run meson build and ninja -C build in the nntrainer directory.

  3. Enter the jni directory inside nntrainer ($ROOT/jni) and run ../build/Applications/LLaMA/jni/nntrainer_llama. A consolidated command sketch follows below.
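
As a consolidated sketch of the steps above (assuming the build directory is named build and that llama_weights_converter.py needs no extra arguments; adjust paths and the converter invocation to your setup):

# 1. Convert the Hugging Face LLaMA weights (assumed to produce ./llama_fp16.bin here)
cd nntrainer/Applications/LLaMA/PyTorch
python3 llama_weights_converter.py
cp llama_fp16.bin ../../../jni/

# 2. Configure and build from the repository root (fp16 already enabled in meson_options.txt)
cd ../../..
meson build
ninja -C build

# 3. Run the LLaMA demo from the jni directory
cd jni
../build/Applications/LLaMA/jni/nntrainer_llama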