[BUG] - PR pytests are failing, possibly because repo env vars are not available #2783

tylergraff commented 1 day ago

Describe the bug

Pytest CI jobs are failing for at least several PRs. Logs indicate that many tests raise an exception during setup because various environment variables are not set:

==================================== ERRORS ====================================
_ ERROR at setup of test_set_config_from_environment_variables[nebari_config_options1] _
Traceback (most recent call last):
  File "/usr/share/miniconda/envs/nebari-dev/lib/python3.10/site-packages/_pytest/runner.py", line 342, in from_call

  <snip>

    File "/usr/share/miniconda/envs/nebari-dev/lib/python3.10/os.py", line 680, in __getitem__
    raise KeyError(key) from None
KeyError: 'AWS_ACCESS_KEY_ID'

See full logs here:
https://github.com/nebari-dev/nebari/actions/runs/11386526924/job/31851899812?pr=2752
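
For reference, the error boils down to an unguarded environment lookup during test setup. A minimal illustration of the same mechanism (not Nebari's actual code):

import os

# Simulate the CI job where the repo/org secret is not exported.
os.environ.pop("AWS_ACCESS_KEY_ID", None)

# A direct indexed lookup fails exactly as in the logs above:
try:
    os.environ["AWS_ACCESS_KEY_ID"]
except KeyError as exc:
    print(f"KeyError: {exc}")  # KeyError: 'AWS_ACCESS_KEY_ID'

# A lookup with a default would not abort test setup, though the test
# would then have to handle the placeholder value:
key_id = os.environ.get("AWS_ACCESS_KEY_ID", "")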

Expected behavior

PR pytest jobs should pass.

OS and architecture in which you are running Nebari

This is an issue with Nebari's existing GitHub test pipeline.

How to Reproduce the problem?

Example failing PRs:
https://github.com/nebari-dev/nebari/pull/2752
https://github.com/nebari-dev/nebari/pull/2730

Command output

No response

Versions and dependencies used.

No response

Compute environment

None

Integrations

No response

Anything else?

No response

viniciusdc commented 1 day ago

Thanks for raising the issue, @tylergraff. Indeed, we have been aware of this since last week, and we are not entirely sure why it started happening now. My main guess is that it is related to our recent change of branches and some internal org-level environment variables in the nebari-dev org.

This is quite troublesome since it blocks development on any upstream PR that touches files under src. Ideally, these failing tests (likely affecting all providers, at least AWS and GCP so far) should not need credentials at all; they only do because the unit tests exercise the init step.

The best-case scenario would be to decouple such checks completely from these tests, since we already have a separate workflow for them. However, a more likely fix is to mock the cloud API requests that require credentials using pytest's mocking facilities (unittest.mock / the monkeypatch fixture).
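
A minimal sketch of that mocking approach, using pytest's monkeypatch plus unittest.mock and assuming the AWS code path ultimately goes through boto3; the fixture name, patch target, and test below are illustrative, not Nebari's actual suite:

from unittest import mock

import pytest


@pytest.fixture
def fake_aws(monkeypatch):
    """Provide placeholder credentials and patch out real AWS API calls."""
    # Placeholder values so os.environ["AWS_ACCESS_KEY_ID"] stops raising KeyError.
    monkeypatch.setenv("AWS_ACCESS_KEY_ID", "testing")
    monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "testing")

    # Assumption: the provider talks to AWS via boto3; patching the client
    # factory keeps any request from leaving the test process.
    with mock.patch("boto3.client") as client:
        client.return_value.describe_regions.return_value = {
            "Regions": [{"RegionName": "us-east-1"}]
        }
        yield client


def test_config_from_env_without_real_credentials(fake_aws):
    import os

    # The code under test can now read credentials and "call" AWS safely.
    assert os.environ["AWS_ACCESS_KEY_ID"] == "testing"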

Another quick fix for any affected PR right now would be to copy the vault block we have in our cloud provider tests workflow and add a similar step to these failing test jobs to populate the env vars with the missing credentials.
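
That vault block lives in the cloud provider workflow YAML, so the concrete quick fix is a CI change rather than Python. As a rough test-level stop-gap, a conftest fixture could back-fill placeholder values whenever the CI job did not export a secret; the variable names and values below are illustrative only, and unlike the vault step this does not supply working credentials:

# conftest.py (illustrative stop-gap; the proposed fix is a workflow step)
import os

import pytest

PLACEHOLDER_CREDENTIALS = {
    "AWS_ACCESS_KEY_ID": "placeholder",
    "AWS_SECRET_ACCESS_KEY": "placeholder",
    "GOOGLE_CREDENTIALS": "{}",
    "PROJECT_ID": "placeholder",
}


@pytest.fixture(autouse=True)
def backfill_cloud_env(monkeypatch):
    """Set dummy values only for credentials the CI job did not provide."""
    for name, value in PLACEHOLDER_CREDENTIALS.items():
        if name not in os.environ:
            monkeypatch.setenv(name, value)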

tylergraff commented 1 day ago

@viniciusdc regarding the vault-block fix: it seems this should be a separate PR that gets merged to fix the tests. Do you agree?

joneszc commented 4 hours ago

@tylergraff @viniciusdc I created a new fork from nebari-dev's main branch, as opposed to using MetroStar/nebari, which was forked from the previous default develop branch. Unfortunately, my new test PR #2788, created from main, still fails the same tests.