mlc-ai / tokenizers-cpp

Universal cross-platform tokenizers binding to HF and sentencepiece
Apache License 2.0
211 stars · 47 forks

An error occurred in the compilation #8

Closed · Liu-xiandong closed this 11 months ago

Liu-xiandong commented 11 months ago

When I compile, I get the following error:

[  6%] Generating release/libtokenizers_c.a
No such file or directory
CMakeFiles/tokenizers_c.dir/build.make:70: recipe for target 'release/libtokenizers_c.a' failed
make[2]: *** [release/libtokenizers_c.a] Error 1
CMakeFiles/Makefile2:173: recipe for target 'CMakeFiles/tokenizers_c.dir/all' failed
make[1]: *** [CMakeFiles/tokenizers_c.dir/all] Error 2
Makefile:155: recipe for target 'all' failed
make: *** [all] Error 2

I checked where the error occurs in the Makefile; it is the following rule:

release/libtokenizers_c.a:
    @$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --blue --bold --progress-dir=workspace/LLM/tokenizers-cpp/build/CMakeFiles --progress-num=$(CMAKE_PROGRESS_1) "Generating release/libtokenizers_c.a"
    cd /workspace/LLM/tokenizers-cpp/rust && /usr/local/lib/python3.8/site-packages/cmake/data/bin/cmake -E env CARGO_TARGET_DIR=/workspace/LLM/tokenizers-cpp/build RUSTFLAGS="" cargo build --release
    cd /workspace/LLM/tokenizers-cpp/rust && /usr/local/lib/python3.8/site-packages/cmake/data/bin/cmake -E copy /workspace/LLM/tokenizers-cpp/build/release/libtokenizers_c.a /workspace/LLM/tokenizers-cpp/build

What else do I need to configure in my environment?

Liu-xiandong commented 11 months ago

Cargo and the related Rust toolchain are needed. The second command in the rule above shells out to `cargo build --release`; if `cargo` is not on PATH, that step fails before `release/libtokenizers_c.a` is ever produced, so the subsequent copy reports "No such file or directory".
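As a quick check, a minimal sketch (assuming a POSIX shell; the rustup one-liner shown is the standard installer, not part of this repository) that tests whether `cargo` is available and prints the usual install commands if it is missing:

```shell
# The Makefile rule fails at `cargo build --release` when cargo is absent,
# so first verify the toolchain is on PATH.
if command -v cargo >/dev/null 2>&1; then
    echo "cargo found: $(cargo --version)"
else
    echo "cargo not found"
    # Standard rustup installer (requires network access):
    echo "  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y"
    echo "  source \"\$HOME/.cargo/env\""
fi
```

After installing Rust, re-run cmake and make from a shell where `cargo --version` succeeds, so the `release/libtokenizers_c.a` step can invoke cargo.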