baaivision / Emu

Emu Series: Generative Multimodal Models from BAAI
https://baaivision.github.io/emu2/
Apache License 2.0

Environment installation failed #88

Closed sugary199 closed 8 months ago

sugary199 commented 8 months ago

Hi, I encountered the following error when executing the command `pip install -r requirements.txt`:

warning: variable does not need to be mutable
         --> tokenizers-lib/src/models/unigram/model.rs:265:21
          |
      265 |                 let mut target_node = &mut best_path_ends_at[key_pos];
          |                     ----^^^^^^^^^^^
          |                     |
          |                     help: remove this `mut`
          |
          = note: `#[warn(unused_mut)]` on by default

      warning: variable does not need to be mutable
         --> tokenizers-lib/src/models/unigram/model.rs:282:21
          |
      282 |                 let mut target_node = &mut best_path_ends_at[starts_at + mblen];
          |                     ----^^^^^^^^^^^
          |                     |
          |                     help: remove this `mut`

      warning: variable does not need to be mutable
         --> tokenizers-lib/src/pre_tokenizers/byte_level.rs:200:59
          |
      200 |     encoding.process_tokens_with_offsets_mut(|(i, (token, mut offsets))| {
          |                                                           ----^^^^^^^
          |                                                           |
          |                                                           help: remove this `mut`

      error: casting `&T` to `&mut T` is undefined behavior, even if the reference is unused, consider instead using an `UnsafeCell`
         --> tokenizers-lib/src/models/bpe/trainer.rs:526:47
          |
      522 |                     let w = &words[*i] as *const _ as *mut _;
          |                             -------------------------------- casting happend here
      ...
      526 |                         let word: &mut Word = &mut (*w);
          |                                               ^^^^^^^^^
          |
          = note: for more information, visit <https://doc.rust-lang.org/book/ch15-05-interior-mutability.html>
          = note: `#[deny(invalid_reference_casting)]` on by default

      warning: `tokenizers` (lib) generated 3 warnings
      error: could not compile `tokenizers` (lib) due to 1 previous error; 3 warnings emitted

      Caused by:
        process didn't exit successfully: `/ML-A100/team/mm/shuyu/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/bin/rustc --crate-name tokenizers --edition=2018 tokenizers-lib/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -C embed-bitcode=no --cfg 'feature="cached-path"' --cfg 'feature="clap"' --cfg 'feature="cli"' --cfg 'feature="default"' --cfg 'feature="dirs"' --cfg 'feature="esaxx_fast"' --cfg 'feature="http"' --cfg 'feature="indicatif"' --cfg 'feature="onig"' --cfg 'feature="progressbar"' --cfg 'feature="reqwest"' -C metadata=8e2752dd4241bb89 -C extra-filename=-8e2752dd4241bb89 --out-dir /ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps -L dependency=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps --extern aho_corasick=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libaho_corasick-f7a038e16c9c3a33.rmeta --extern cached_path=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libcached_path-87dfd851e70eab20.rmeta --extern clap=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libclap-17aee225e7737da1.rmeta --extern derive_builder=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libderive_builder-03a317b5faf23d15.rmeta --extern dirs=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libdirs-3bbbfe171dd5d1fb.rmeta --extern esaxx_rs=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libesaxx_rs-e8f5ee65641cadd4.rmeta --extern 
getrandom=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libgetrandom-a6a1829d45933b10.rmeta --extern indicatif=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libindicatif-fb7f0662fc86811d.rmeta --extern itertools=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libitertools-051f3c77bf3684bc.rmeta --extern lazy_static=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/liblazy_static-df89fd9b4b197d62.rmeta --extern log=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/liblog-db5663930c6645cc.rmeta --extern macro_rules_attribute=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libmacro_rules_attribute-39217440fe533929.rmeta --extern monostate=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libmonostate-8c58aaddd40ad12e.rmeta --extern onig=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libonig-29c4276bf6de47c5.rmeta --extern paste=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libpaste-d49a9dbb21c90081.so --extern rand=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/librand-cc0ca7274e36266b.rmeta --extern rayon=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/librayon-29cb179ffa5164fd.rmeta --extern rayon_cond=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/librayon_cond-1eb0f603caa083b5.rmeta --extern 
regex=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libregex-18688a20a38bb851.rmeta --extern regex_syntax=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libregex_syntax-ace402a25abfd585.rmeta --extern reqwest=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libreqwest-b4f2219f9303d269.rmeta --extern serde=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libserde-e56507730b04aa65.rmeta --extern serde_json=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libserde_json-464ae217253931c4.rmeta --extern spm_precompiled=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libspm_precompiled-cdbc638df613f6cb.rmeta --extern thiserror=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libthiserror-b6f45b4800d2a038.rmeta --extern unicode_normalization_alignments=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libunicode_normalization_alignments-a1711ea2b5cfdc20.rmeta --extern unicode_segmentation=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libunicode_segmentation-0df53fbf44393ad7.rmeta --extern unicode_categories=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/deps/libunicode_categories-7c6fabd07afa2a56.rmeta -L native=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/build/bzip2-sys-6a87a5e2594af3be/out/lib -L 
native=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/build/zstd-sys-0be1ccab0d66684e/out -L native=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/build/esaxx-rs-2a487938ec0bc27f/out -L native=/ML-A100/team/mm/shuyu/tmp/pip-install-e309bck9/tokenizers_3e4131e9d4584f9cbbc890fca6ffbe2c/target/release/build/onig_sys-db2a9288d25c25cb/out` (exit status: 1)
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers

Do you know how to fix it ?

ryanzhangfan commented 8 months ago

There may be a conflict between a newer version of the Rust compiler and the version of `tokenizers` required here. Please refer to this issue to see whether the error can be fixed, or search for other possible solutions.
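One way to test the compiler-version hypothesis: the build aborts on the `invalid_reference_casting` lint, which (to my knowledge) became deny-by-default in newer rustc releases, so pinning an older toolchain may let the legacy `tokenizers` source compile. A sketch, assuming `rustup` is installed (the exact version cutoff is an assumption):

```
# Pin a pre-1.73 toolchain, where invalid_reference_casting was not yet deny-by-default
rustup install 1.72.1
rustup default 1.72.1

# Retry the build with a clean cache so pip recompiles tokenizers
pip install --no-cache-dir -r requirements.txt
```

Alternatively, if the pinned `tokenizers` version can be bumped, newer releases fixed the `&T`-to-`&mut T` cast in `bpe/trainer.rs` and compile cleanly on current rustc.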

sugary199 commented 8 months ago

Thank you. I tried downgrading the Rust version, but that failed. Downgrading Python from 3.12 to 3.11.8 works.
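A plausible reason the Python downgrade works: PyPI ships prebuilt `tokenizers` wheels for Python 3.11, so pip installs a binary wheel and never invokes the Rust compiler, whereas on 3.12 the pinned (older) `tokenizers` has no matching wheel and must build from source. A sketch of reproducing the fix with conda (the environment name is arbitrary):

```
# Create and activate a Python 3.11 environment
conda create -n emu python=3.11.8 -y
conda activate emu

# With 3.11, pip should fetch a prebuilt tokenizers wheel (a .whl file)
# instead of an sdist (.tar.gz) that triggers the Rust build
pip install -r requirements.txt
```

To confirm a binary wheel was used, check the install log for "Downloading tokenizers-…whl" rather than a `cargo rustc` invocation.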