spotify / XCRemoteCache


M1: `The file “x86_64” couldn’t be opened because there is no such file.` (both consumer and producer mode, building for simulator) #33

Closed imWildCat closed 2 years ago

imWildCat commented 2 years ago

Hello, I'm seeing this error while building a demo project on M1 Mac.

Anything I can do to fix it?

Demo project: https://gitlab.com/imWildCat/demoxcoderemotecache2

Error Group

```
Postbuild step failed Error Domain=NSCocoaErrorDomain Code=260 "The file “x86_64” couldn’t be opened because there is no such file." UserInfo={NSURL=file:///Users/wildcat/Library/Developer/Xcode/DerivedData/DemoXcodeRemoteCache2-etmnuhcygwlwkyctqaxtawrbenyo/Build/Intermediates.noindex/DemoXcodeRemoteCache2.build/Debug-iphonesimulator/DemoXcodeRemoteCache2.build/Objects-normal/x86_64, NSFilePath=/Users/wildcat/Library/Developer/Xcode/DerivedData/DemoXcodeRemoteCache2-etmnuhcygwlwkyctqaxtawrbenyo/Build/Intermediates.noindex/DemoXcodeRemoteCache2.build/Debug-iphonesimulator/DemoXcodeRemoteCache2.build/Objects-normal/x86_64, NSUnderlyingError=0x120e39610 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}}
```
polac24 commented 2 years ago

Is it possible that your producer was built on an Intel machine? In that case the artifact would contain only the x86_64 architecture.
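On a Mac, `lipo -info <binary>` answers which architectures an artifact contains. As a cross-platform illustration of what that check reads, here is a minimal sketch that parses only the 32-bit fat (universal) Mach-O header; it does not handle thin binaries or the 64-bit fat format:

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian magic of a 32-bit fat (universal) header
CPU_NAMES = {
    0x01000007: "x86_64",  # CPU_TYPE_X86 | CPU_ARCH_ABI64
    0x0100000C: "arm64",   # CPU_TYPE_ARM | CPU_ARCH_ABI64
}

def fat_archs(data: bytes) -> list:
    """List architecture names found in a fat Mach-O blob (32-bit header only)."""
    magic, nfat = struct.unpack(">II", data[:8])
    if magic != FAT_MAGIC:
        raise ValueError("not a 32-bit fat binary")
    archs = []
    for i in range(nfat):
        # Each fat_arch entry is 5 big-endian uint32s:
        # cputype, cpusubtype, offset, size, align
        off = 8 + i * 20
        (cputype,) = struct.unpack(">I", data[off:off + 4])
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Synthetic header with two slices, as a producer built with
# "Build Active Architecture Only" = No would emit:
blob = (struct.pack(">II", FAT_MAGIC, 2)
        + struct.pack(">IIIII", 0x01000007, 3, 0, 0, 0)
        + struct.pack(">IIIII", 0x0100000C, 0, 0, 0, 0))
print(fat_archs(blob))  # ['x86_64', 'arm64']
```

An artifact produced on an Intel machine would list only `x86_64` here.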

I scanned our repo and we don't reference x86 anywhere outside of tests, so that string probably comes from the project's configuration.

I would first inspect the meta files on the remote server - maybe it is the same problem as reported in #36?

imWildCat commented 2 years ago

@polac24 I don't think it is related to existing cache files, because I cannot build the project even in producer mode. It is the same error: `The file “x86_64” couldn’t be opened because there is no such file.`

I switched to another clean bucket (MinIO) but the issue still exists:

```yaml
---
cache_addresses:
- http://192.168.3.187:9000/test2
```
imWildCat commented 2 years ago

Update: when I select my physical iPhone as the run destination, the project builds successfully:

https://user-images.githubusercontent.com/2396817/145596629-c8e425a3-d2d8-4964-b806-eaf9a2eec92c.mov

But when I switched back to the simulator, the error came up again.

Maybe it is related to a hard-coded instruction set for the simulator?

Update 2:

(screenshot)

```
po archs
▿ 1 element
  - 0 : "x86_64-apple-ios-simulator"
```

On an M1 Mac, shouldn't it have more values? At the very least something like `arm64-apple-ios-simulator`?

polac24 commented 2 years ago

BTW, the thinning plugin is experimental and not fully supported. Did you turn it on with `thinning_enabled: true`? By default, it should be off.

The second screenshot is taken from the consumer side and suggests that the producer was run on an Intel machine. The simplest way to avoid mixing Intel and Apple Silicon artifacts is to add `PLATFORM_PREFERRED_ARCH` to `custom_fingerprint_envs` in your `.rcinfo`, like:

```yaml
custom_fingerprint_envs:
  - PLATFORM_PREFERRED_ARCH
```

Please add that on both the consumer and the producer side, so adding it to the shared `.rcinfo` is the best solution.
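For reference, a minimal shared `.rcinfo` sketch. The `primary_repo` key is assumed from XCRemoteCache's README; the repo URL and cache address reuse the values from earlier in this thread:

```yaml
# Shared .rcinfo, committed at the repo root
primary_repo: https://gitlab.com/imWildCat/demoxcoderemotecache2
cache_addresses:
- http://192.168.3.187:9000/test2
custom_fingerprint_envs:
  - PLATFORM_PREFERRED_ARCH
```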

imWildCat commented 2 years ago

I didn't enable `thinning_enabled`...

After adding custom_fingerprint_envs, the same error still exists: https://gitlab.com/imWildCat/demoxcoderemotecache2/-/commit/88ea15c5eb7bd6f974b7c4f644178bb3ba47cf88

polac24 commented 2 years ago

OK, I finally managed to get access to an M1 machine and indeed, the default flow is broken. At least I found the reason. As a workaround, I can suggest disabling "Build Active Architecture Only": set it to "No" for both Debug and Release on the producer. This generates an artifact containing both arm64 and x86_64 binaries, which is actually the recommended setup when your consumers run on both Intel and M1.

FYI: The problem is that we use `PLATFORM_PREFERRED_ARCH` to find a "default" architecture (either arm64 or x86). On M1 it still resolves to `x86_64`. We can instead look at `NATIVE_ARCH`, which resolves to `arm64` on M1 and `x86_64` on Intel. The fix should be easy (search and replace the env name everywhere in the repo).
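The effect of that swap can be shown with a toy fingerprint (illustrative only; this is not XCRemoteCache's actual hashing code). Mixing in an env var whose value is identical on both machines cannot separate their artifacts, while `NATIVE_ARCH` can:

```python
import hashlib

def fingerprint(base: str, env_names, env: dict) -> str:
    """Toy cache fingerprint: hash the inputs plus selected env values,
    so artifacts produced under different values land under different keys."""
    h = hashlib.sha256(base.encode())
    for name in env_names:
        h.update(f"{name}={env.get(name, '')}".encode())
    return h.hexdigest()[:16]

# On M1, PLATFORM_PREFERRED_ARCH still resolves to x86_64, so it cannot
# distinguish the two machines; NATIVE_ARCH differs and can.
intel = {"PLATFORM_PREFERRED_ARCH": "x86_64", "NATIVE_ARCH": "x86_64"}
m1    = {"PLATFORM_PREFERRED_ARCH": "x86_64", "NATIVE_ARCH": "arm64"}

assert fingerprint("sources", ["PLATFORM_PREFERRED_ARCH"], intel) == \
       fingerprint("sources", ["PLATFORM_PREFERRED_ARCH"], m1)   # collision
assert fingerprint("sources", ["NATIVE_ARCH"], intel) != \
       fingerprint("sources", ["NATIVE_ARCH"], m1)               # separated
```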

imWildCat commented 2 years ago

Thank you so much for the help! Looking forward to the fix!