saucelabs / node-saucelabs

A wrapper around the Sauce Labs API
Apache License 2.0

Periodic ENOENT running in CI #86

Open nathancahill opened 4 years ago

nathancahill commented 4 years ago

This is an error that happens more than 50% of the time on CircleCI. No changes are made between one run and the next: one run might pass, and the next will fail with this error. All dependency versions are pinned in yarn.lock.

12 06 2020 08:29:13.851:ERROR [SaucelabsLauncher]: Error: spawn /home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:268:19)
    at onErrorNT (internal/child_process.js:468:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)

{
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  path: '/home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  spawnargs: [ '--version' ],
  killed: false,
  stdout: '',
  stderr: '',
  failed: true,
  signal: null,
  cmd: '/home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc --version',
  timedOut: false
}
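For context: spawn reports ENOENT when the executable at the given path cannot be found at the moment of the call. A minimal diagnostic sketch (not part of node-saucelabs; the path is taken from the error above) that checks for the binary immediately before spawning it:

    // Hypothetical diagnostic, not node-saucelabs code: check whether the sc
    // binary exists and is executable right before spawning it.
    import { access, constants } from 'fs';
    import { spawn } from 'child_process';

    const scPath =
      '/home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc';

    access(scPath, constants.X_OK, (err) => {
      if (err) {
        // ENOENT: file is missing; EACCES: present but not executable
        console.error('sc binary check failed:', err.code);
        return;
      }
      const child = spawn(scPath, ['--version']);
      child.stdout?.pipe(process.stdout);
      child.on('error', (e) => console.error('spawn still failed:', e));
    });

Note that even a passing check does not rule out a race: the file can still disappear between the access() call and the spawn.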
christian-bromann commented 4 years ago

@nathancahill can you try running the pipeline without the cache? I suspect that a cache containing the Sauce Connect binary is being reused across different environments.

christian-bromann commented 4 years ago

ping @nathancahill

nathancahill commented 4 years ago

Thanks, I think removing the cache solved the issue. I'll reopen in the future if it reappears.

nathancahill commented 4 years ago

Even with the cache disabled we're still getting periodic errors.

Previously, with the cache, the error rate was probably 50%. Without the cache, it's around 10%.

Sam55555 commented 4 years ago

The error also occurred on every run after the GitLab runner restored its cache, on GitLab CI using the Docker executor. There was no error during the first run, with a clean cache.

enriquegh commented 4 years ago

I haven't been able to reproduce this on a private GitLab instance with a Docker runner so far.

I was able to reproduce it, however, when using the Node Alpine image:

 $ NODE_OPTIONS=--trace-warnings ts-node index.ts
Error: spawn /builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:268:19)
    at onErrorNT (internal/child_process.js:468:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  path: '/builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  spawnargs: [ '--version' ],
  killed: false,
  stdout: '',
  stderr: '',
  failed: true,
  signal: null,
  cmd: '/builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc --version',
  timedOut: false
}

This gave an error 100% of the time, though, rather than intermittently.
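One detail that may explain a consistent failure on Alpine specifically: the kernel also returns ENOENT when a dynamically linked binary's ELF interpreter is missing, which is what happens when a glibc-linked binary runs on a musl-based image. In that case the file exists, but spawn still fails. A sketch to tell the two cases apart (the path is illustrative):

    // Hypothetical diagnostic: distinguish "binary file missing" from
    // "binary present but its dynamic loader is missing" (a common
    // glibc-on-musl failure mode). Both surface as spawn ENOENT.
    import { existsSync } from 'fs';
    import { spawnSync } from 'child_process';

    const scPath = 'node_modules/saucelabs/build/.sc-v4.5.4/bin/sc';
    const result = spawnSync(scPath, ['--version']);
    const err = result.error as NodeJS.ErrnoException | undefined;

    if (err && err.code === 'ENOENT') {
      console.log(
        existsSync(scPath)
          ? 'sc exists but cannot be executed: likely a missing ELF loader'
          : 'sc binary itself is missing from disk'
      );
    }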

enriquegh commented 4 years ago

From @joventuraz:

It looks like the node module is not available. If there happened to be more than one concurrent build, and one is finishing and cleaning up, it might affect the other build if they are using the same workspace, which it seems they might be, judging from the path in the error.
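As an illustration, the failure mode being described would be a classic check-then-use race (a hypothetical sketch, not actual bin-wrapper code; the path is illustrative):

    // Hypothetical sketch of the suspected race: an existence check passes,
    // a concurrent build's cleanup then deletes the shared workspace, and
    // the spawn that follows fails with ENOENT despite the earlier check.
    import { existsSync } from 'fs';
    import { spawn } from 'child_process';

    const scPath = 'node_modules/saucelabs/build/.sc-v4.5.4/bin/sc';

    if (existsSync(scPath)) {
      // ...window in which another job can remove node_modules...
      const child = spawn(scPath, ['--version']);
      child.on('error', (err) => console.error('spawn failed:', err)); // ENOENT lands here
    }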

Adding to this, I think the issue lies somewhere in bin-wrapper and how it does its checks for the file/folder.

It seems like bin-wrapper is a collection of small modules, so it was a bit hard to pinpoint exactly what is missing.

There are a few things we could try here.

christian-bromann commented 4 years ago

I proposed a change in the Sauce Connect Launcher to retry starting Sauce Connect if it fails: https://github.com/karma-runner/karma-sauce-launcher/pull/219

Let's see if we can fix this issue with that.
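The shape of such a retry is roughly the following (a minimal sketch, not the actual PR; withRetries, startSauceConnect, and the back-off values are hypothetical):

    // Minimal retry sketch; names and back-off values are illustrative.
    async function withRetries<T>(
      fn: () => Promise<T>,
      retries = 3,
      delayMs = 1000
    ): Promise<T> {
      try {
        return await fn();
      } catch (err) {
        if (retries <= 0) throw err; // out of attempts, propagate the error
        await new Promise((resolve) => setTimeout(resolve, delayMs));
        return withRetries(fn, retries - 1, delayMs * 2); // exponential back-off
      }
    }

    // e.g.: await withRetries(() => startSauceConnect(options));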

christian-bromann commented 3 years ago

@nathancahill have you experienced the same issue after updating the package?

Seamoo13 commented 3 years ago

@christian-bromann Are there any updates on this? I am also having this issue.

markcellus commented 3 years ago

~I'm having this issue too using Node 14 Alpine (14.16.1). But it is no longer an issue after downgrading Node back to 12.~ Just verified that this is not the case.

Seamoo13 commented 3 years ago

The issue went away for my project for a few months, then suddenly returned and is occurring roughly 50% of the time (Travis CI).


"sauce-connect-launcher": "1.3.2", "@wdio/sauce-service": "7.5.7", "@wdio/cli": "7.5.7",

LeParadoxHD commented 2 years ago

Any update on this?

enriquegh commented 2 years ago

@wswebcreation would 7.1.3 fix this as bin-wrapper is now replaced?

wswebcreation commented 2 years ago

Hi @enriquegh

I'm not 100% sure; we've simplified the download now, so it's worth a try.

enriquegh commented 2 years ago

I'm going to close this ticket since bin-wrapper is no longer used and no new reports have been made. If someone still runs into the issue, we can re-open it.

Wolftousen commented 2 years ago

Not sure if a new issue should be opened for this, but we are experiencing this issue as well. Here is the wdio/saucelabs info from our package.json:

"@wdio/cli": "^7.19.7",
"@wdio/config": "^7.16.13",
"@wdio/cucumber-framework": "^7.19.7",
"@wdio/local-runner": "^7.19.7",
"@wdio/sauce-service": "^7.20.2",
"@wdio/selenium-standalone-service": "^7.16.13",
"@wdio/spec-reporter": "^7.19.7",
"@wdio/static-server-service": "^7.16.13",
"@wdio/types": "^7.16.13",

And here is the output we are getting with CircleCI:

Error: spawn /project_path/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:274:19)
    at onErrorNT (internal/child_process.js:469:16)
    at processTicksAndRejections (internal/process/task_queues.js:82:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /project_path/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  path: '/project_path/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  spawnargs: [
    '--no-autodetect',
    '--tunnel-name=team_tunnel_name',
    '--user=saucelabs_user',
    '--api-key=saucelabs_key',
    '--region=aws-region'
  ]
}

I added some logging in the pipeline after looking at my local node_modules and found that, in the pipeline, node_modules/saucelabs/build/sc-loader does not exist at all. However, node_modules/saucelabs/sc-loader/.sc-v4.8.0/bin does contain sc. So I'm not sure what the purpose of having it downloaded twice is.
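A quick way to confirm which copy is present in a given environment (a small diagnostic sketch using the two paths mentioned above):

    // Hypothetical diagnostic: report which of the two sc locations exists.
    import { existsSync } from 'fs';

    const candidates = [
      'node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
      'node_modules/saucelabs/sc-loader/.sc-v4.8.0/bin/sc',
    ];

    for (const p of candidates) {
      console.log(`${p}: ${existsSync(p) ? 'present' : 'missing'}`);
    }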

I was able to add these commands into my test run job to trick wdio/saucelabs into using the sc that exists in saucelabs/sc-loader instead of saucelabs/build/sc-loader:

mkdir -p node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin
ln -s node_modules/saucelabs/sc-loader/.sc-v4.8.0/bin/sc node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc

That caused the ENOENT error to go away, but now I get the error referenced in this issue: https://github.com/webdriverio/webdriverio/issues/5900

enriquegh commented 2 years ago

I've re-opened this as it's essentially the same issue as before. It looks like we replaced bin-wrapper with the download module (link).

We have something that, in theory, checks that the binary is there, so I'm not sure why this is happening.

kleinbs commented 1 year ago

We are running into the same issue here, running in an Alpine Docker container as part of a GitLab CI job. Is this still being worked on, or is there a good workaround?

Execution of 2 workers started at 2023-01-20T17:59:03.464Z
2023-01-20T17:59:03.526Z DEBUG @wdio/utils:initialiseServices: initialise service "sauce" as NPM package
2023-01-20T17:59:04.939Z INFO @wdio/cli:launcher: Run onPrepare hook
2023-01-20T17:59:04.942Z INFO @wdio/sauce-service: Starting Sauce Connect Tunnel
Error: spawn /builds/consumer-identity/OneID/infra-eng/infra-source-bundles/my-account/e2e/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (node:internal/child_process:283:19)
    at onErrorNT (node:internal/child_process:478:16)
    at processTicksAndRejections (node:internal/process/task_queues:83:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /builds/consumer-identity/OneID/infra-eng/infra-source-bundles/my-account/e2e/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  path: '/builds/consumer-identity/OneID/infra-eng/infra-source-bundles/my-account/e2e/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  spawnargs: [
    '--verbose',
    '--logfile=-',
    '--no-autodetect',
    '--tunnel-name=SC-tunnel-40602318395085324',
    '--no-ssl-bump-domains=127.0.0.1,localhost,172.17.0.3',
    '--user=MyAccountTestRunner',
    '--api-key=7ee95499-d176-4eed-ace0-0c83a4ec6e74',
    '--region=us-west-1'
  ]
}
naruaway commented 6 months ago

I think this is probably one of the root causes: https://github.com/saucelabs/node-saucelabs/pull/241