electron-userland / electron-builder

A complete solution to package and build a ready for distribution Electron app with “auto update” support out of the box
https://www.electron.build
MIT License

MacOS x86 trying to build for M1 Arm, ENOENT zip error #6821

Closed varun-dc closed 2 years ago

varun-dc commented 2 years ago

I'm trying to build an arm64 Electron application on an x86 CircleCI macOS build VM. I can see that it does manage to compile the application:

$ ls -l /Users/distiller/project/release/mac-arm64
Deep Channel.app

But it consistently fails (I've tried multiple times) afterwards with an ENOENT error. I can confirm that zip is indeed present on the VM at /usr/bin/zip. At one point I also saw a similar ENOENT error for ditto; I was experimenting with various settings at the time, so I don't know which combination of settings produced the ditto error.

$ node_modules/.bin/electron-builder --publish always --mac --config configs/electron-builder/prod.m1.json
  • electron-builder  version=22.14.13 os=20.6.0
  • loaded configuration  file=/Users/distiller/project/configs/electron-builder/prod.m1.json
  • rebuilding native dependencies  dependencies=leveldown@5.6.0, node-pty@0.10.0-beta18, sqlite3@4.2.0 platform=darwin arch=arm64
  • packaging       platform=darwin arch=arm64 electron=17.4.0 appOutDir=release/mac-arm64
  • signing         file=release/mac-arm64/My App.app identityName=Developer ID Application: My Company, Inc. (7JR23HRN74) identityHash=A21F390789D3B5AEBC7F8A517BE44A0B21709536 provisioningProfile=none
Error: spawn zip ENOENT
    at Process.ChildProcess._handle.onexit (node:internal/child_process:282:19)
    at onErrorNT (node:internal/child_process:477:16)
    at processTicksAndRejections (node:internal/process/task_queues:83:21)

The configs/electron-builder/prod.m1.json looks like this,

{
  "productName": "My App",
  "copyright": "Copyright © 2022 My Company, Inc.",
  "appId": "com.myCompany.myApp",
  "files": [
    "resources/",
    "app/renderer/dist/",
    "app/main/ui/views/prod.html",
    "app/main/ui/views/preload.js",
    "app/main/dist/",
    "package.json"
  ],
  "asarUnpack": [
    "resources/icon.png"
  ],
  "mac": {
    "entitlements": "./build/entitlements.mac.plist",
    "entitlementsInherit": "./build/entitlements.mac.plist",
    "gatekeeperAssess": false,
    "hardenedRuntime": true,
    "target": {
      "target": "default",
      "arch": ["arm64"]
    }
  },
  "dmg": {
    "sign": false,
    "contents": [
      {
        "x": 130,
        "y": 220
      },
      {
        "x": 410,
        "y": 220,
        "type": "link",
        "path": "/Applications"
      }
    ]
  },
  "directories": {
    "buildResources": "resources",
    "output": "release"
  },
  "publish": {
    "provider": "s3",
    "bucket": "my-bucket",
    "path": "PATH",
    "acl": null
  },
  "afterSign": "internals/scripts/notarize.js"
}
varun-dc commented 2 years ago

I solved the issue. In my case, the following things were relevant.

I'm trying to build on CircleCI for context.

If you're using a self-hosted runner, you need to make sure you add:

  <key>SessionCreate</key>
  <true/>

to your launchd plist configuration. Without this, I was seeing errSecInternalComponent when I ran my build headlessly. Yet if I VNC/RDP into the machine, or run the build on my local physical Mac (in both cases there is a graphical session, not a headless command line), codesign signs the application without the errSecInternalComponent error.
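For context, here is a minimal sketch of a launchd job with that key in place. The label, program path, and other keys are hypothetical placeholders — adapt them to however your runner is actually launched (e.g. a LaunchAgent plist under ~/Library/LaunchAgents):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.example.ci-runner</string>
  <key>ProgramArguments</key>
  <array>
    <string>/usr/local/bin/ci-runner-launch.sh</string>
  </array>
  <!-- Gives the job its own security session so codesign can
       talk to the keychain without a GUI login -->
  <key>SessionCreate</key>
  <true/>
  <key>RunAtLoad</key>
  <true/>
</dict>
</plist>
```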


My application was also being notarized using,

"afterSign": "./internals/scripts/notarize.js"

in my electron builder config json

I was building for both architectures, arm64 and x64,

    "target": {
      "target": "default",
      "arch": ["x64", "arm64"]
    }

So what was happening is that the script would get called twice during a build: once when the x64 architecture was built, and again when arm64 was built. I'm not sure in which order the two occur, so I added some safeguards to make my script order-independent. But the script was originally written assuming it would be called only once, so it expected both arm64 and x64 to already be built. What ended up happening is that internally, in

const { notarize } = require("electron-notarize");

...

  await notarize({
    appBundleId: "...",
    appPath: "...",
    appleId: process.env.APPLE_ID,
    appleIdPassword: process.env.APPLE_ID_PASS,
  });

it would attempt to access the appPath for x64/arm64, but the bundle wasn't there yet, resulting in the weird ENOENT zip / ENOENT ditto errors. It had nothing to do with zip or ditto being missing from the system.

The solution was to check, each time the script was called, whether the architecture currently being built had actually produced its .app, and to notarize only the one that was built and ready. I also added checks to ensure I didn't re-notarize the previous one a second time (by writing a file to disk to keep track of things).


The last thing I can mention is to make sure you've got <key>com.apple.security.cs.allow-jit</key> in your entitlements XML; without it, my application crashed at runtime, most likely because of the "hardenedRuntime": true setting in the electron-builder config JSON.
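For reference, a minimal entitlements.mac.plist along those lines could look like the sketch below. Only the allow-jit key comes from my setup above; the second key is a common addition for Electron apps under the hardened runtime, not something from this report, so verify whether your app needs it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Required for V8's JIT under the hardened runtime -->
  <key>com.apple.security.cs.allow-jit</key>
  <true/>
  <!-- Often needed by Electron apps as well (assumption, not from
       the original report) -->
  <key>com.apple.security.cs.allow-unsigned-executable-memory</key>
  <true/>
</dict>
</plist>
```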


I didn't find most of this information in one coherent place, so it took a lot of piecing together disparate bits of information. I hope this helps someone in the future.