Closed kettle11 closed 2 years ago
does it work for C/C++ invocations for you? are you using the latest version of zld?
C/C++ as in, clang with -fuse-ld
Yes, I'm using the latest version. This is the result of `zld -v`:

```
BUILD 19:53:21 Mar 5 2021
configured to support archs: i386 x86_64 x86_64h armv6 armv7 armv7s armv7m armv7k arm64
LTO support using: LLVM version 12.0.5, (clang-1205.0.22.6) (static support for 23, runtime is 27)
TAPI support using: Apple TAPI version 12.0.5 (tapi-1205.0.7.1)
```
I'll try it out with C/C++ and report back.
Following up:
A simple 'hello world' works for both C and C++ with `zld`. This is the C++ invocation:

```shell
clang++ hello.cpp -fuse-ld=/Users/ian/.cargo/zld
```
In addition, Rust programs that do not use `build.rs` at all in their dependency tree also work.
gotcha. can you write out a full example of how to trigger this crash here? (unfortunately, i don't own an M1 mac, but maybe i can wrangle someone in to helping, or use an instance in the cloud)
@michaeleisel Here's a repository with the minimum needed to demonstrate the crash: https://github.com/kettle11/rust_zld_crash
I'd help more here, but I haven't dived into what's causing the issue. There's always a chance it's somehow local to my machine, but based on that comment I linked earlier I think that's not the case.
Let me know if you need me to test anything!
I wonder if it's the misaligned reads in `CRCHash`, which arm64 might not like so much. You might try removing everything in that function but this piece:

```cpp
size_t __h = 0;
for ( ; *__s; ++__s)
    __h = 5 * __h + *__s;
return __h;
```

and see if it works. just a longshot if you want to try it
> I wonder if it's the misaligned reads in `CRCHash`, which arm64 might not like so much. You might try removing everything in that function but this piece:
I tried this, but get the same behavior: https://github.com/michaelkirk/zld/commit/a68631f681bff258413b0c2c745d101299343267
```
error: failed to run custom build command for `libc v0.2.94`

Caused by:
  process didn't exit successfully: `/Users/mkirk/src/abs/abstreet/target/debug/build/libc-5f8b74cb6ae41598/build-script-build` (signal: 9, SIGKILL: kill)
warning: build failed, waiting for other jobs to finish...
error: build failed
```
edit: for completeness, I get the same error on the POC app posted above:
```
$ cargo build
   Compiling rust_zld_crash v0.1.0 (/Users/mkirk/src/zld/rust_zld_crash)
error: failed to run custom build command for `rust_zld_crash v0.1.0 (/Users/mkirk/src/zld/rust_zld_crash)`

Caused by:
  process didn't exit successfully: `/Users/mkirk/src/zld/rust_zld_crash/target/debug/build/rust_zld_crash-678aad67ab01c41a/build-script-build` (signal: 9, SIGKILL: kill)
```
Maybe OOM somehow? nm, that just means we're being sigkilled, which we already knew.
```
$ target/debug/build/rust_zld_crash-678aad67ab01c41a/build-script-build
zsh: killed     target/debug/build/rust_zld_crash-678aad67ab01c41a/build-script-build
$ echo $?
137
```
Ok, maybe more interesting, it appears to be an issue with code signing:
(from Console.app)
```
Process:               build-script-build [70344]
Path:                  /Users/USER/*/build-script-build
Identifier:            build-script-build
Version:               ???
Code Type:             ARM-64 (Native)
Parent Process:        cargo [70325]
Responsible:           Terminal [582]
User ID:               502

Date/Time:             2021-05-24 17:33:32.913 -0700
OS Version:            macOS 11.3.1 (20E241)
Report Version:        12
Anonymous UUID:        B0504E5D-F5F6-8FBB-2F3C-B1D675FD58D7
Sleep/Wake UUID:       2A5FD03F-D8B7-4863-A777-A72BA3B284BE

Time Awake Since Boot: 420000 seconds
Time Since Wake:       370000 seconds

System Integrity Protection: enabled

Crashed Thread:        Unknown

Exception Type:        EXC_BAD_ACCESS (Code Signature Invalid)
Exception Codes:       0x0000000000000032, 0x00000001049b8000
Exception Note:        EXC_CORPSE_NOTIFY

Termination Reason:    Namespace CODESIGNING, Code 0x2

kernel messages:

VM Regions Near 0x1049b8000:
--> mapped file    1049b8000-1049f0000 [  224K] r-x/r-x SM=COW  Object_id=37e2c877
    mapped file    1049f0000-1049f8000 [   32K] rw-/rw- SM=COW  Object_id=37e2c877

Backtrace not available

Unknown thread crashed with ARM Thread State (64-bit):
    x0: 0x00000001049b8000   x1: 0x000000016b446ee8   x2: 0x000000016b446e98   x3: 0x0000000104dddf03
    x4: 0x0000000000000070   x5: 0x0000000000000073   x6: 0x000000016b446c40   x7: 0x00000000000005f0
    x8: 0x000000016b446ec8   x9: 0x000000016b446ea8  x10: 0x0000000104e02a78  x11: 0x0000000000000000
   x12: 0x0000000000000000  x13: 0x000000000000002d  x14: 0x0000000000000000  x15: 0x0000000000000000
   x16: 0x0000000104db2088  x17: 0x6ae100016b446e98  x18: 0x0000000000000000  x19: 0x000000016b446ee8
   x20: 0x00000001049b8000  x21: 0x000000016b446e98  x22: 0x00000001049b8000  x23: 0x000000016b447048
   x24: 0x000000016b447278  x25: 0x000000016b446f80  x26: 0x0000000000000000  x27: 0x0000000000000000
   x28: 0x000000016b447028   fp: 0x000000016b446e80   lr: 0xf51d000104db2050
    sp: 0x000000016b446e00   pc: 0x0000000104db4474 cpsr: 0x80000000
   far: 0x00000001049b8000  esr: 0x92000007

Binary images description not available

External Modification Summary:
  Calls made by other processes targeting this process:
    task_for_pid: 0
    thread_create: 0
    thread_set_state: 0
  Calls made by this process:
    task_for_pid: 0
    thread_create: 0
    thread_set_state: 0
  Calls made by all processes on this machine:
    task_for_pid: 323
    thread_create: 0
    thread_set_state: 122

VM Region Summary:
Writable regions: Total=8384K written=0K(0%) resident=0K(0%) swapped_out=0K(0%) unallocated=8384K(100%)

                                VIRTUAL   REGION
REGION TYPE                        SIZE    COUNT (non-coalesced)
===========                     =======  =======
STACK GUARD                       56.0M        1
Stack                             8176K        1
VM_ALLOCATE                        1.0G        2
VM_ALLOCATE (reserved)              16K        1    reserved VM address space (unallocated)
mapped file                       1248K        7
===========                     =======  =======
TOTAL                              1.1G       12
TOTAL, minus reserved VM space     1.1G       12
```
nice, yeah code signing validation fails with `codesign`, i can take a look
Let me know if you'd like me to try anything.
there appears to be a bug due to some interaction between the file system and codesigning validation. for instance, doing `cp binary /tmp/bin && mv /tmp/bin binary` will make `./binary` suddenly work. another fix is apparently disabling mmap file writing and using `write` instead, so that's what i've done. feel free to try out the `signing-fix` branch and let me know how it goes
Thanks for looking into it @michaeleisel.
Using the `signing-fix` branch, I was able to build the demo app and my own app, so that's great.
I didn't do very thorough testing, but there appeared to be no (or only very small) speedups in incremental build times while using zld. Honestly, I was hoping it'd be a bit more.
Do you expect the changes you made to work around the code signing issue might have significantly degraded performance?
## Incremental Builds without zld

```
> touch widgetry/src/input.rs && time cargo build
cargo build  109.56s user 8.53s system 345% cpu 34.158 total
> touch widgetry/src/input.rs && time cargo build
cargo build  108.96s user 8.22s system 345% cpu 33.874 total
> touch widgetry/src/input.rs && time cargo build
cargo build  107.83s user 8.36s system 348% cpu 33.352 total
```
## Incremental Builds w/ zld

```
> touch widgetry/src/input.rs && time cargo build
cargo build  105.43s user 8.74s system 350% cpu 32.579 total
> touch widgetry/src/input.rs && time cargo build
cargo build  106.60s user 8.67s system 357% cpu 32.264 total
> touch widgetry/src/input.rs && time cargo build
cargo build  104.66s user 8.56s system 351% cpu 32.232 total
```
no, that change has little perf effect (in fact, it actually sped it up by ~3% for me). but if you can provide me a way of doing the linking myself, i can investigate
@michaelkirk It is possible that the vast majority of the time is spent inside rustc and not the linker. If you are on nightly you can try `cargo rustc -- -Ztime-passes` to see how much time the linker took. I believe the entry is called something like `link_binary`. You could also do `cargo rustc -- -Csave-temps -Clinker=false` to get an error message with all arguments passed to the linker so you can run it yourself. The `-Csave-temps` ensures that rustc doesn't remove any temporary object files necessary for linking.
Thanks for the debugging instructions @bjorn3. It seems it's probably a separate issue, so I can follow up with a new issue after investigation.
I'm away for a bit, but will investigate further in ~3 days.
Thanks again for working around the code signing issue @michaeleisel.
Just wanted to note that apparently on MacOS 12.0.1 on M1 and M1 Pro (so it's probably not about the chipset), the code signature is reported as invalid, or no signature is found, the second time the build output changes. Setting `incremental = false` didn't seem to influence that.
The reason I had to resort to an alternative linker in the first place was link failures with the system linker in any of my projects that pulled in `curl-sys`.
Now I am resorting to a nix-pkg provided linker by adjusting cargo's configuration like so:

`.cargo/config.toml`:

```toml
[build]
rustc-wrapper = "/Users/byron/dev/github.com/Byron/depot/rustc.nix.sh"
```
That wrapper then launches a nix shell with all kinds of useful dependencies made available, which apparently also causes a different linker to be used.
```shell
#! /usr/bin/env nix-shell
#! nix-shell -i bash -p pkg-config openssl libiconv darwin.apple_sdk.frameworks.Security darwin.apple_sdk.frameworks.SystemConfiguration darwin.apple_sdk.frameworks.Foundation darwin.apple_sdk.frameworks.AppKit curl libgpgerror gpgme
$@
```
This adds 500ms to each rustc invocation at the very least but works for now.
Maybe others find this place with similar experiences so we can figure out an actual solution.
It's probably using the default linker, `ld`. Do you have minimal steps to reproduce?
I think I do, here is my configuration for ZLD overrides, using the latest downloadable x86 version.
```toml
[target.aarch64-apple-darwin]
rustflags = ["-C", "link-arg=-fuse-ld=/usr/local/bin/zld"]
```
Then this should work to reproduce, unfortunately it involves manual steps:
```shell
# on MacOS 12.0.1 with latest XCode, might be important if there is a chance the system linker is at fault.
# It might very well be because it's downright broken and I basically can't link most Rust projects with it anymore.
git clone https://github.com/rust-lang/cargo
cd cargo
cargo test
# the above works
# now edit a test file like required_features.rs by adding a newline in a test or modifying it slightly.
cargo test
```

```
    Running tests/testsuite/main.rs (target/debug/deps/testsuite-e43ded086d2ec039)
error: test failed, to rerun pass '--test testsuite'

Caused by:
  process didn't exit successfully: `/Users/byron/dev/github.com/rust-lang/cargo/target/debug/deps/testsuite-e43ded086d2ec039` (signal: 9, SIGKILL: kill)
```
It might not happen after the first edit, I needed two. Here is what the system log has to say about that:
```
Sending event: com.apple.stability.crash {"exceptionCodes":"0x0000000000000032, 0x0000000104b10000(\n 50,\n 4373676032\n)EXC_BAD_ACCESSSIGKILL (Code Signature Invalid)UNKNOWN_0x32 at 0x0000000104b10000","incidentID":"5FF0B1A6-E850-4E16-8C70-478012B8CE96","logwritten":1,"process":"testsuite-e43ded086d2ec039","terminationReasonExceptionCode":"0x2","terminationReasonNamespace":"CODESIGNING"}
```
It can probably be reproduced in other projects as well, but this one definitely shows the issue.
Thanks a lot for taking a look, I will be happy to help with testing as much as I can. Unfortunately I am unable to build zld myself on ARM64.
I can confirm I have a SIGKILL issue when running build scripts on Mac, and disabling zld is enough to make this issue go away. All binaries seem to be arm64 ones; zld was installed via Homebrew.
Some diagnostics info:
```
$ file $(which rustup)
$HOME/.cargo/bin/rustup: Mach-O 64-bit executable arm64
$ file $(which cargo)
$HOME/.cargo/bin/cargo: Mach-O 64-bit executable arm64
$ file $(which rustc)
$HOME/.cargo/bin/rustc: Mach-O 64-bit executable arm64
$ file $(rustup which cargo)
$HOME/.rustup/toolchains/stable-aarch64-apple-darwin/bin/cargo: Mach-O 64-bit executable arm64
$ file $(rustup which rustc)
$HOME/.rustup/toolchains/stable-aarch64-apple-darwin/bin/rustc: Mach-O 64-bit executable arm64
$ rustc --version
rustc 1.56.1 (59eed8a2a 2021-11-01)
$ /opt/homebrew/bin/zld --help
ld64: For information on command line options please use 'man ld'.
$ file /opt/homebrew/bin/zld
/opt/homebrew/bin/zld: Mach-O 64-bit executable arm64
```
Sample project I've used: any project created with `cargo new` that has a `log` dependency in it, for example:
```
$ cat test-proj/Cargo.toml
[package]
name = "test-proj"
version = "0.1.0"
edition = "2021"

[dependencies]
log = "0.4.14"
```
Then, if I have zld enabled, the compilation fails:
```
$ cat $HOME/.cargo/config
[target.aarch64-apple-darwin]
rustflags = ["-C", "link-arg=-fuse-ld=/opt/homebrew/bin/zld"]

$ cargo clean && cargo build
   Compiling log v0.4.14
   Compiling cfg-if v1.0.0
error: failed to run custom build command for `log v0.4.14`

Caused by:
  process didn't exit successfully: `/tmp/test-proj/target/debug/build/log-ff09eda2e3e0aba6/build-script-build` (signal: 9, SIGKILL: kill)
```
When I disable it and restart the shell, it works:
```
$ cat $HOME/.cargo/config
#[target.aarch64-apple-darwin]
#rustflags = ["-C", "link-arg=-fuse-ld=/opt/homebrew/bin/zld"]

$ cargo clean && cargo build
   Compiling log v0.4.14
   Compiling cfg-if v1.0.0
   Compiling test-proj v0.1.0 (/tmp/test-proj)
    Finished dev [unoptimized + debuginfo] target(s) in 0.68s
```
@Byron @SomeoneToIgnore could this be due to Gatekeeper? When you run `./zld`, does Apple complain that it's an untrusted executable?
I suspect it could be, but alas have no idea how to verify it exactly.
Running the binary directly seems to work:
```
$ /opt/homebrew/bin/zld
ld: warning: platform not specified
ld: warning: -arch not specified
ld: warning: No platform min-version specified on command line
ld: no object files specified
```
@SomeoneToIgnore are you on the latest release, 1.3.3?
Oh, that's a good catch, brew package seems to be outdated:
```
$ brew info zld
michaeleisel/zld/zld: stable 1.3.1
A faster version of ld, Apple's linker
https://github.com/michaeleisel/zld
/opt/homebrew/Cellar/zld/1.3.1 (5 files, 2.7MB) *
  Built from source on 2021-11-17 at 16:46:47
  From: https://github.com/michaeleisel/homebrew-zld/blob/HEAD/Formula/zld.rb
==> Dependencies
Build: cmake ✔

$ brew update
Already up-to-date.

$ brew install michaeleisel/zld/zld
Warning: michaeleisel/zld/zld 1.3.1 is already installed and up-to-date.
To reinstall 1.3.1, run:
  brew reinstall zld
```
So, I guess somebody has to update it in their repo?
I had tried the binary from the releases page and it seems to work (I had to force it to be opened for the first time, as with any other software downloaded from the internet on modern Macs).
Thanks for looking into this. I was using the latest version from the releases page; `zld -v` prints:

```
@(#)PROGRAM:zld PROJECT:zld-
BUILD 13:50:05 Aug 23 2021
configured to support archs: i386 x86_64 x86_64h armv6 armv7 armv7s armv7m armv7k arm64
LTO support using: LLVM version 13.0.0, (clang-1300.0.29.3) (static support for 23, runtime is 27)
TAPI support using: Apple TAPI version 13.0.0 (tapi-1300.0.6.5)
```
To be sure, I just tried again and after a first successful build, the next one caused invalid binaries to be created.
```
➜  open-rs git:(main) ✗ cargo run .
   Compiling open v2.0.1 (/Users/byron/dev/github.com/Byron/open-rs)
    Finished dev [unoptimized + debuginfo] target(s) in 0.31s
     Running `target/debug/open .`
[1]    24503 killed     cargo run .
```
To me, nothing changed, unfortunately.
I will keep trying as an update of the Xcode command-line utilities is coming in, maybe that changes something.
Just to recap:

1. `/usr/bin/ld` fails with an assertion error during linking.
2. `LC_DYLD_INFO_ONLY` apparently has disappeared in favor of a new format, somehow.

So I have to go back to the only fix that worked for me: a nix wrapper, which unfortunately slows down every command invocation by at least 0.5s.
I'm still seeing the SIGKILL caused by an invalid code signature in zld 1.3.3 installed from Homebrew on an M1 Mac.
It seems to be pretty reproducible using https://github.com/LukeMathWalker/zero-to-production/tree/7ed962996b5defb74c7b37d58c9a2d9d8591ccb2, although for a Homebrew-installed version of zld you need to fix the path in `.cargo/config.toml` to use `/opt/homebrew/bin` rather than `/usr/local/bin`. I've not yet narrowed down exactly when it starts breaking, but if you use `cargo watch -x check -x test -x run` as suggested by the book the repo is for, and make a couple of file saves in quick succession, either the test binary or the actual binary will start being SIGKILLed.
I'm also getting a SIGKILL for a package w/ a build.rs (in my case, serde-derive) on version 1.3.3 installed by Homebrew. M1 Max MacBook Pro, macOS 12.2.1.
Error snippet below:
```
error: failed to run custom build command for `serde_derive v1.0.136`

Caused by:
  process didn't exit successfully: `project/target/debug/build/serde_derive-010920a940d7335a/build-script-build` (signal: 9, SIGKILL: kill)
warning: build failed, waiting for other jobs to finish...
error: build failed
cargo build  0.44s user 0.19s system 14% cpu 4.222 total
```
I've read through the thread and don't have any ideas yet. Tried and failed to dtrace it before finding the thread.
could people try doing `cp zld /path/somewhere/else && rm zld && cp /path/somewhere/else zld`?

next, if that doesn't work, could people show me the result of `codesign -dvvvvv binary` when built by zld vs. built by ld64?
> could people try doing `cp zld /path/somewhere/else && rm zld && cp /path/somewhere/else zld`?
Yeah, here was the result:
```
error: could not compile `actix-service`

Caused by:
  process didn't exit successfully: `rustc --crate-name actix_service --edition=2018 /Users/chrisa/.cargo/registry/src/github.com-1ecc6299db9ec823/actix-service-2.0.2/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C embed-bitcode=no -C split-debuginfo=unpacked -C debuginfo=2 -C metadata=d02005f19cc9ef31 -C extra-filename=-d02005f19cc9ef31 --out-dir /Users/chrisa/.cargo/cache/debug/deps -L dependency=/Users/chrisa/.cargo/cache/debug/deps --extern futures_core=/Users/chrisa/.cargo/cache/debug/deps/libfutures_core-d91717efba0b424b.rmeta --extern paste=/Users/chrisa/.cargo/cache/debug/deps/libpaste-52f8c6a153201d87.dylib --extern pin_project_lite=/Users/chrisa/.cargo/cache/debug/deps/libpin_project_lite-bbb6b622f0b47178.rmeta --cap-lints allow -C link-arg=-fuse-ld=/opt/homebrew/bin/zld` (signal: 9, SIGKILL: kill)
warning: build failed, waiting for other jobs to finish...
error: build failed
make: *** [build] Error 101
```
> next, if that doesn't work, could people show me the result of `codesign -dvvvvv binary` when built by zld vs. built by ld64?
Not sure what binary you mean here, I can't check the code-signing of the resultant binary if it doesn't build with zld. Here's zld though:
```
╰─ codesign -dvvvvv `which zld`
Executable=/opt/homebrew/bin/zld
Identifier=zld
Format=Mach-O thin (arm64)
CodeDirectory v=20400 size=22044 flags=0x20002(adhoc,linker-signed) hashes=686+0 location=embedded
VersionPlatform=1
VersionMin=720896
VersionSDK=786688
Hash type=sha256 size=32
CandidateCDHash sha256=741c44d63c1cf505e640fad24cf41546cd10b438
CandidateCDHashFull sha256=741c44d63c1cf505e640fad24cf41546cd10b4383e415ed8eca0df6c305d9d4a
Hash choices=sha256
CMSDigest=741c44d63c1cf505e640fad24cf41546cd10b4383e415ed8eca0df6c305d9d4a
CMSDigestType=2
Executable Segment base=0
Executable Segment limit=1671168
Executable Segment flags=0x1
Page size=4096
CDHash=741c44d63c1cf505e640fad24cf41546cd10b438
Signature=adhoc
Info.plist=not bound
TeamIdentifier=not set
Sealed Resources=none
Internal requirements=none
```
Here's the codesign output for a binary built without zld:
```
Executable=/Users/chrisa/.cargo/cache/debug/project-cli
Identifier=project_cli-b6c354d0025666ed
Format=Mach-O thin (arm64)
CodeDirectory v=20400 size=566456 flags=0x20002(adhoc,linker-signed) hashes=17698+0 location=embedded
VersionPlatform=1
VersionMin=786432
VersionSDK=786688
Hash type=sha256 size=32
CandidateCDHash sha256=6fe9e9d497a613a22f1504cd9e54933bccef466d
CandidateCDHashFull sha256=6fe9e9d497a613a22f1504cd9e54933bccef466dabf37e62ec329ad603d1b1be
Hash choices=sha256
CMSDigest=6fe9e9d497a613a22f1504cd9e54933bccef466dabf37e62ec329ad603d1b1be
CMSDigestType=2
Executable Segment base=0
Executable Segment limit=25739264
Executable Segment flags=0x1
Page size=4096
CDHash=6fe9e9d497a613a22f1504cd9e54933bccef466d
Signature=adhoc
Info.plist=not bound
TeamIdentifier=not set
Sealed Resources=none
Internal requirements=none
```
`project-cli` is a redaction btw.
> could people try doing `cp zld /path/somewhere/else && rm zld && cp /path/somewhere/else zld`?
>
> next, if that doesn't work, could people show me the result of `codesign -dvvvvv binary` when built by zld vs. built by ld64?
```
❯ codesign -dvvvvv target/debug/deps/health_check-d0dd8af889f054b1     ✘ KILL
Executable=/Users/jonathan.grahl/projects/zero2prod/target/debug/deps/health_check-d0dd8af889f054b1
Identifier=health_check-d0dd8af889f054b1
Format=Mach-O thin (arm64)
CodeDirectory v=20400 size=348342 flags=0x20002(adhoc,linker-signed) hashes=10882+0 location=embedded
VersionPlatform=1
VersionMin=786432
VersionSDK=786432
Hash type=sha256 size=32
CandidateCDHash sha256=dc95a83c2cc9ead5199cbd2970144a62e35fcc79
CandidateCDHashFull sha256=dc95a83c2cc9ead5199cbd2970144a62e35fcc795027ff2e985e2d94e8d72e1c
Hash choices=sha256
CMSDigest=dc95a83c2cc9ead5199cbd2970144a62e35fcc795027ff2e985e2d94e8d72e1c
CMSDigestType=2
Page size=4096
CDHash=dc95a83c2cc9ead5199cbd2970144a62e35fcc79
Signature=adhoc
Info.plist=not bound
TeamIdentifier=not set
Sealed Resources=none
Internal requirements=none
```
```
❯ codesign -dvvvvv target/debug/deps/health_check-d0dd8af889f054b1
Executable=/Users/jonathan.grahl/projects/zero2prod/target/debug/deps/health_check-d0dd8af889f054b1
Identifier=health_check-d0dd8af889f054b1
Format=Mach-O thin (arm64)
CodeDirectory v=20400 size=348214 flags=0x20002(adhoc,linker-signed) hashes=10878+0 location=embedded
VersionPlatform=1
VersionMin=786432
VersionSDK=786432
Hash type=sha256 size=32
CandidateCDHash sha256=85af78ac3b288b9cd78d0c4d54b849991afd87ce
CandidateCDHashFull sha256=85af78ac3b288b9cd78d0c4d54b849991afd87ce83643645d8e3fc3cc78de16a
Hash choices=sha256
CMSDigest=85af78ac3b288b9cd78d0c4d54b849991afd87ce83643645d8e3fc3cc78de16a
CMSDigestType=2
Executable Segment base=0
Executable Segment limit=15990784
Executable Segment flags=0x1
Page size=4096
CDHash=85af78ac3b288b9cd78d0c4d54b849991afd87ce
Signature=adhoc
Info.plist=not bound
TeamIdentifier=not set
Sealed Resources=none
Internal requirements=none
```
So @Byron, to your point: if ld64 fails on something, as you say with the assertion failure, it's not a priority for zld to support either.
@JammyStuff that project seems cumbersome to build, requiring a running postgresql server and docker for the build step itself. To be honest, that seems like a rare and undesirable way of building things, and I'm less concerned about supporting it. Do you have a more orthodox example?
@bitemyapp I made a test project with `serde_derive` and was able to repro it the first time, but then it worked on all subsequent runs.
Is everyone using zld 1.3.3 (e.g., from the release page)? Does anyone have a project that can reliably reproduce this? And can people help me understand how much this is affecting their workflow, so I know how to prioritize it?
@n8henrie has provided me in https://github.com/michaeleisel/zld/issues/110 with an excellent reproducible example, and i have merged a fix to master that works for me. basically there's a bug somewhere in the filesystem or in the codesigning validation, and we have to keep trying different ways of jiggling around the output file until apple likes it. in this case, i tried using `clonefile` to make a clone, and it works for me. please try it (it's merged into master) and let me know if it works for you, and i'll cut a release.
built version: zld.zip
Seems to be working great. No longer crashing in the `nix` environment; I'll keep testing with the project from yesterday, but that's a really good sign. For others not as familiar with the codesigning dance, you should expect to get an error like this when you try to run the downloaded zld:
Two workarounds:
```shell
$ cd ~/.local/bin
$ unzip ~/Downloads/zld.zip -d .
$ ./zld                          # note that you get a warning popup if you try to execute
$ xattr -l ./zld                 # note the quarantine flag
com.apple.quarantine: 0081;6272a68e;Firefox;F529D2ED-995E-4369-B9BA-8E84F261D88E
$ xattr -d com.apple.quarantine ./zld
$ ./zld                          # now runs without issue
ld: warning: platform not specified
ld: warning: -arch not specified
ld: warning: No platform min-version specified on command line
ld: no object files specified
```
To undo the codesigning "approval" (if you ever wish to do so):

```shell
$ spctl --list | grep zld        # note the exception is there
$ spctl --remove ~/.local/bin/zld
```
Released in 1.3.4
I'd love to help release that to homebrew, but I'm not sure where the https://github.com/michaeleisel/zld/archive/refs/heads/homebrew-fixes-1.3.3.zip archive comes from, or where its 1.3.4 version is.
I was trying to use `zld` on an M1 Mac for those sweet link-time improvements, but I'm running into this crash while building. It seems any crate with a `build.rs` file hits this while building.
@benmkw mentioned this issue as well: https://github.com/michaeleisel/zld/issues/73#issuecomment-781385342
Unfortunately this makes zld unusable for most projects with Rust / M1 Macs.