projecthorizon993 opened 3 months ago
+1, Gentoo Linux, same issue: the installation step fails.
Compiling tokenizers v0.13.3 (/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/tokenizers-lib)
Running `rustc --crate-name tokenizers --edition=2018 tokenizers-lib/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -C embed-bitcode=no --cfg 'feature="cached-path"' --cfg 'feature="clap"' --cfg 'feature="cli"' --cfg 'feature="default"' --cfg 'feature="dirs"' --cfg 'feature="esaxx_fast"' --cfg 'feature="http"' --cfg 'feature="indicatif"' --cfg 'feature="onig"' --cfg 'feature="progressbar"' --cfg 'feature="reqwest"' -C metadata=89b09084cb326b58 -C extra-filename=-89b09084cb326b58 --out-dir /tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps -C strip=debuginfo -L dependency=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps --extern aho_corasick=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libaho_corasick-8ba363174647299e.rmeta --extern cached_path=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libcached_path-6bfc0561b4dbd446.rmeta --extern clap=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libclap-540591df4bacc00c.rmeta --extern derive_builder=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libderive_builder-24b0e15fc888a13f.rmeta --extern dirs=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libdirs-9b2b450d525477fc.rmeta --extern esaxx_rs=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libesaxx_rs-e2b589202c958bcf.rmeta --extern getrandom=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libgetrandom-0840ff858fccc57a.rmeta --extern indicatif=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libindicatif-387c5d1912c4bf6a.rmeta --extern itertools=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libitertools-244b09eccdfc1a09.rmeta --extern lazy_static=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/liblazy_static-70f2c43a9ded1614.rmeta --extern log=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/liblog-868b1533bc35336a.rmeta --extern macro_rules_attribute=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libmacro_rules_attribute-7457d1e82e5afc40.rmeta --extern monostate=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libmonostate-02a71e617c2c010f.rmeta --extern onig=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libonig-32f51c4c1e73388c.rmeta --extern paste=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libpaste-e36699270d1a400e.so --extern rand=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/librand-c5a2a4aaba2fadc7.rmeta --extern rayon=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/librayon-d407a8180c10eeb6.rmeta --extern rayon_cond=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/librayon_cond-095b0ed7b7312de4.rmeta --extern regex=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libregex-079ed91e94dc338a.rmeta --extern 
regex_syntax=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libregex_syntax-c3ba2266c864a422.rmeta --extern reqwest=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libreqwest-7e6229659f846647.rmeta --extern serde=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libserde-7be8408fc50a07da.rmeta --extern serde_json=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libserde_json-af21dccbb02e514e.rmeta --extern spm_precompiled=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libspm_precompiled-3dde457deecc9f1c.rmeta --extern thiserror=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libthiserror-b1959683ffc698fa.rmeta --extern unicode_normalization_alignments=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libunicode_normalization_alignments-f1efa90d7fe69706.rmeta --extern unicode_segmentation=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libunicode_segmentation-ad70de3121ea8944.rmeta --extern unicode_categories=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libunicode_categories-436bd1ea0d9dd645.rmeta -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/bzip2-sys-9f0cc99bf07a3f85/out/lib -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/zstd-sys-99695ae48306d6f1/out -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/esaxx-rs-4237ceb012f7dd82/out -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/onig_sys-43450e8faa843f01/out`
warning: variable does not need to be mutable
--> tokenizers-lib/src/models/unigram/model.rs:265:21
|
265 | let mut target_node = &mut best_path_ends_at[key_pos];
| ----^^^^^^^^^^^
| |
| help: remove this `mut`
|
= note: `#[warn(unused_mut)]` on by default
warning: variable does not need to be mutable
--> tokenizers-lib/src/models/unigram/model.rs:282:21
|
282 | let mut target_node = &mut best_path_ends_at[starts_at + mblen];
| ----^^^^^^^^^^^
| |
| help: remove this `mut`
warning: variable does not need to be mutable
--> tokenizers-lib/src/pre_tokenizers/byte_level.rs:200:59
|
200 | encoding.process_tokens_with_offsets_mut(|(i, (token, mut offsets))| {
| ----^^^^^^^
| |
| help: remove this `mut`
error: casting `&T` to `&mut T` is undefined behavior, even if the reference is unused, consider instead using an `UnsafeCell`
--> tokenizers-lib/src/models/bpe/trainer.rs:526:47
|
522 | let w = &words[*i] as *const _ as *mut _;
| -------------------------------- casting happend here
...
526 | let word: &mut Word = &mut (*w);
| ^^^^^^^^^
|
= note: for more information, visit <https://doc.rust-lang.org/book/ch15-05-interior-mutability.html>
= note: `#[deny(invalid_reference_casting)]` on by default
warning: `tokenizers` (lib) generated 3 warnings
error: could not compile `tokenizers` (lib) due to 1 previous error; 3 warnings emitted
Caused by:
process didn't exit successfully: `rustc --crate-name tokenizers --edition=2018 tokenizers-lib/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -C embed-bitcode=no --cfg 'feature="cached-path"' --cfg 'feature="clap"' --cfg 'feature="cli"' --cfg 'feature="default"' --cfg 'feature="dirs"' --cfg 'feature="esaxx_fast"' --cfg 'feature="http"' --cfg 'feature="indicatif"' --cfg 'feature="onig"' --cfg 'feature="progressbar"' --cfg 'feature="reqwest"' -C metadata=89b09084cb326b58 -C extra-filename=-89b09084cb326b58 --out-dir /tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps -C strip=debuginfo -L dependency=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps --extern aho_corasick=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libaho_corasick-8ba363174647299e.rmeta --extern cached_path=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libcached_path-6bfc0561b4dbd446.rmeta --extern clap=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libclap-540591df4bacc00c.rmeta --extern derive_builder=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libderive_builder-24b0e15fc888a13f.rmeta --extern dirs=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libdirs-9b2b450d525477fc.rmeta --extern esaxx_rs=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libesaxx_rs-e2b589202c958bcf.rmeta --extern getrandom=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libgetrandom-0840ff858fccc57a.rmeta --extern indicatif=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libindicatif-387c5d1912c4bf6a.rmeta --extern itertools=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libitertools-244b09eccdfc1a09.rmeta --extern lazy_static=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/liblazy_static-70f2c43a9ded1614.rmeta --extern log=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/liblog-868b1533bc35336a.rmeta --extern macro_rules_attribute=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libmacro_rules_attribute-7457d1e82e5afc40.rmeta --extern monostate=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libmonostate-02a71e617c2c010f.rmeta --extern onig=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libonig-32f51c4c1e73388c.rmeta --extern paste=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libpaste-e36699270d1a400e.so --extern rand=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/librand-c5a2a4aaba2fadc7.rmeta --extern rayon=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/librayon-d407a8180c10eeb6.rmeta --extern rayon_cond=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/librayon_cond-095b0ed7b7312de4.rmeta --extern 
regex=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libregex-079ed91e94dc338a.rmeta --extern regex_syntax=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libregex_syntax-c3ba2266c864a422.rmeta --extern reqwest=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libreqwest-7e6229659f846647.rmeta --extern serde=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libserde-7be8408fc50a07da.rmeta --extern serde_json=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libserde_json-af21dccbb02e514e.rmeta --extern spm_precompiled=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libspm_precompiled-3dde457deecc9f1c.rmeta --extern thiserror=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libthiserror-b1959683ffc698fa.rmeta --extern unicode_normalization_alignments=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libunicode_normalization_alignments-f1efa90d7fe69706.rmeta --extern unicode_segmentation=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libunicode_segmentation-ad70de3121ea8944.rmeta --extern unicode_categories=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/deps/libunicode_categories-436bd1ea0d9dd645.rmeta -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/bzip2-sys-9f0cc99bf07a3f85/out/lib -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/zstd-sys-99695ae48306d6f1/out -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/esaxx-rs-4237ceb012f7dd82/out -L native=/tmp/pip-install-lfnqsm52/tokenizers_4a141a2d2c4a4feca24370febaf9c32b/target/release/build/onig_sys-43450e8faa843f01/out` (exit status: 1)
error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (tokenizers)
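For anyone triaging the same failure: the rustc build only runs because pip found no prebuilt wheel for the interpreter in use, so a quick check is to refuse source builds outright. A minimal sketch (the 0.13.3 pin comes from the log above; the rest is my assumption):
# fail fast if PyPI has no prebuilt tokenizers wheel for this Python/platform,
# instead of falling back to a source (rustc) build
pip install --only-binary=:all: tokenizers==0.13.3
# and confirm which interpreter and toolchain pip would actually use
python --version
rustc --version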
I'm facing the same issue while installing requirements:
error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib -- -C 'link-args=-undefined dynamic_lookup -Wl,-install_name,@rpath/tokenizers.cpython-312-darwin.so'` failed with code 101
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (tokenizers)
This issue "shouldn't" happen if you are using python 3.10, the highest supported version for this project. As much as I wish we had a more up to date version of tokenizers and python compatible-- we don't.
Perhaps this is caused because pypi does not have prebuilt binaries for that old of a version of tokenizers for that new of a version of python?
Repost here if python 3.10.x doesn't fix it!
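For anyone unsure how to force that interpreter, a minimal sketch using a plain venv (assumes python3.10 is already installed and on PATH; the venv directory name is arbitrary):
python3.10 -m venv venv
source venv/bin/activate
python --version                 # should print Python 3.10.x
pip install -r requirements.txt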
Nope, still on 3.10.12 and still hitting that bug; I don't know what to do. Still the tokenizers issue.
Same problem on Arch Linux.
Fixed the tokenizers issue. This one mentally abused me for over a week.
Commit hash: 82a973c04367123ae98bd9abdf80d9eda9b910e2
Launching Web UI with arguments: --listen --theme dark --skip-torch-cuda-test --no-half --use-cpu all --share
Traceback (most recent call last):
File "/workspace/file/stable-diffusion-webui/launch.py", line 48, in
Could this be caused by an incorrect rustc version?
I have an issue with tokenizers as well on my Arch Linux; my error log hints I should compile it with rustc 1.74.0 or later, but I'm using 1.70, which is installed by the rustup package.
We have the same exit code (101), but I can't find a similar rustc message in your error logs.
I'm trying to fix my own issue.
> Could this be caused by an incorrect rustc version? [...]
Could be, but I think it's more likely a bug in the tokenizers code: it was written against an older Rust, so on a newer version it just breaks, maybe.
> Could this be caused by an incorrect rustc version? [...]
I found out 1.65 may work, but it breaks when I reload a new model.
Getting the same problem on my M1 Mac. I tried downgrading to Rust 1.65 and Python 3.10.12:
error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib -- -C 'link-args=-undefined dynamic_lookup -Wl,-install_name,@rpath/tokenizers.cpython-312-darwin.so'` failed with code 101
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (tokenizers)
Any workarounds?
Same problem on my M2 Max Mac. I solved it by creating a conda env with Python 3.10.15.
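Spelled out, a minimal sketch of that conda route (the env name is arbitrary):
conda create -n webui python=3.10.15 -y
conda activate webui
pip install -r requirements.txt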
It looks like the original sources are already fixed (with the unsafe keyword).
So... is there an update of the Python libs? (I can't find which library uses this crate.)
Stuck with the same issue
> Getting the same problem on my M1 Mac. [...] Any workarounds?
On M1 I think you need to have more packages installed; by the way, that's more likely a bug in the ARM emulation.
It looks original sources are already fixed (with
unsafe
keyword)So... is there a update of python libs? (I can't find what library uses this crate.)
No, it's really just a bug from Python itself, I don't know; but I solved this by using an older version of Rust, which ignores that bug.
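A sketch of that older-Rust workaround (the invalid_reference_casting deny seen in the log above appears to have become the default around rustc 1.73, so any earlier toolchain should sidestep it; 1.72.1 is my pick, not the commenter's):
rustup toolchain install 1.72.1
rustup default 1.72.1
# or just for this one install, without changing the global default:
RUSTUP_TOOLCHAIN=1.72.1 pip install -r requirements.txt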
I'm using this setup: I tried rustc 1.65.0, but it does not solve the problem.
Finally, I rewrote requirements_versions.txt and changed the transformers version to 4.34.0 (from 4.30.2):
transformers==4.34.0
Thus, I think it is a bug of neither Python nor rustc.
Sorry, you also need this patch, because of the transformers update:
https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/13245#issuecomment-1766452012
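If you want to apply that same pin without editing the file by hand, a one-liner sketch (GNU sed syntax; on macOS use sed -i '' instead):
sed -i 's/^transformers==4.30.2$/transformers==4.34.0/' requirements_versions.txt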
I hit this issue when I accidentally used the wrong Python version myself.
You absolutely need to be on 3.10.6.
If your build is failing due to unsafe Rust code, then you are likely using the wrong Python runtime. But the issue you're hitting is actually over here: https://github.com/huggingface/tokenizers/issues/1485
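If you're on this webui, the launcher reads which interpreter to use from webui-user.sh; a minimal sketch of pinning it (the variable name is taken from the stock template, so verify against your copy):
# in webui-user.sh: uncomment/set the interpreter the launcher should use
python_cmd="python3.10"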
Time continues to move forward. Libraries that the dependencies in requirements.txt depend on release new versions, and pip wanders into them. If you were using Poetry instead of bare pip, you could at least check in the lock file with the sum total of deps and their deps, which would probably make for a more reproducible build. But when you mix and match major Python versions, all bets are off.
pyenv to the rescue.
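A minimal sketch of that pyenv route (assumes pyenv is already installed; 3.10.6 matches the version recommended above):
pyenv install 3.10.6
pyenv local 3.10.6   # writes .python-version in the repo directory
python --version     # should now report 3.10.6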
You know what? Running this out of a Docker image would be pretty straightforward and might cut down on the finicky Python env sensitivity. Just a thought.
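For completeness, a sketch of that Docker idea (the image tag and mount layout are my assumptions, not a tested recipe):
docker run --rm -it -v "$PWD:/app" -w /app python:3.10.6-slim bash
# then, inside the container:
pip install -r requirements.txt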
Checklist
What happened?
The web UI can't open because tokenizers can't build.
Steps to reproduce the problem
I don't know; it just popped up.
What should have happened?
It should run as usual, but now it's broken.
What browsers do you use to access the UI?
No response
Sysinfo
From my console; I can't open the web UI.
Console logs
Additional information
No response