tswast opened 3 months ago
I'm not sure if it's a good idea, but we could add `lower-bound-checker update` to owlbot.py to ensure the constraints-3.x.txt files are updated for each commit. WDYT? @tswast
That could work. I would expect owlbot to generate the constraints and the noxfile to check them.
I'm not sure how well the generation works, but if it does, let's use it.
I walked through the code in that PR. There is a lot of code across multiple files (close to 700 lines).

I am presuming we are considering importing this `google-cloud-testutils` library during our CI/CD checks rather than adding all that code to our repo? If yes, we should also mirror that same process in the other repos that have lower bounds in their constraints files.
As for doing the checks... These folks have two separate nox sessions in their noxfile: `check_lower_bounds` and `update_lower_bounds`.

The `check_lower_bounds` session runs every time. The `update` session is run locally by the developer to update the lower bounds when needed, the same way we might run `blacken` or `lint` locally before pushing a commit.
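For reference, a minimal sketch of what those two sessions could look like in our own noxfile.py, assuming the `lower-bound-checker` CLI that `google-cloud-testutils` ships (with `check` and `update` subcommands); the package name and constraints path here are placeholders, not something we have wired up yet:

```python
# Sketch only: assumes the `lower-bound-checker` CLI installed by
# google-cloud-testutils; PACKAGE_NAME and CONSTRAINTS are placeholders.
import pathlib

import nox

CURRENT_DIRECTORY = pathlib.Path(__file__).parent.absolute()
PACKAGE_NAME = "google-cloud-bigquery"  # placeholder
CONSTRAINTS = str(CURRENT_DIRECTORY / "testing" / "constraints-3.9.txt")  # placeholder


@nox.session(python="3.9")
def check_lower_bounds(session):
    """Verify that setup.py lower bounds match the constraints file (CI check)."""
    session.install("google-cloud-testutils")
    session.install(".")
    session.run(
        "lower-bound-checker",
        "check",
        "--package-name",
        PACKAGE_NAME,
        "--constraints-file",
        CONSTRAINTS,
    )


@nox.session(python="3.9")
def update_lower_bounds(session):
    """Regenerate the constraints file from setup.py (run locally or via owlbot)."""
    session.install("google-cloud-testutils")
    session.install(".")
    session.run(
        "lower-bound-checker",
        "update",
        "--package-name",
        PACKAGE_NAME,
        "--constraints-file",
        CONSTRAINTS,
    )
```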
> I am presuming we are considering importing this `google-cloud-testutils` library during our CI/CD checks rather than adding all that code to our repo?
We already install `google-cloud-testutils` in most of our nox sessions if I remember correctly, so that would be the correct approach.
> The `check_lower_bounds` session runs every time.
Yes, I think the lint for the setup.py file would be a good spot for that.
> The `update` session is run locally by the developer
I was thinking owlbot could do this too, just like we run our `blacken` session: https://github.com/googleapis/python-bigquery/blob/4383cfe7c7571ffaa3efd5e45ca33c6ff978f274/owlbot.py#L104
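For concreteness, something like the following in owlbot.py might do it, modeled on the existing `blacken` call linked above. This is only a sketch and assumes the `update_lower_bounds` nox session from the earlier noxfile sketch exists:

```python
# Sketch only: mirrors how owlbot.py already shells out to nox for `blacken`.
# Assumes an `update_lower_bounds` nox session exists (see the noxfile sketch
# earlier in this thread).
import synthtool as s

s.shell.run(["nox", "-s", "blacken"], hide_output=False)
s.shell.run(["nox", "-s", "update_lower_bounds"], hide_output=False)
```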
The way the current code functions, lines such as this one in setup.py break the processing:

```
"protobuf>=3.20.2,<6.0.0dev,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4",
```

A line with just the two endpoints is fine; it is the extra `!=` exclusions that make it fail:

```
"protobuf>=3.20.2,<6.0.0dev"
```

Order does not matter: putting the `4.*` exclusions between the two endpoints does not help, and adding spaces after each comma makes no difference either.

```
"protobuf>=3.20.2,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,<6.0.0dev",
```
@tswast do you have strong feelings about this Issue?
Some more context: their `_lower_bound()` function has this in the docstring:

```
The lower bound can be determined for a requirement only if it is one of these
formats:

    foo==1.2.0
    foo>=1.2.0
    foo>=1.2.0, <2.0.0dev
    foo<2.0.0dev, >=1.2.0
```
It may be possible to open a PR (or request this feature) to extend their code to handle more complex requirement strings. That would need to account for edge cases (whether the `!=` exclusions come at the beginning, in the middle, or at the end of the specifier, etc.) and would need to include tests.
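As a rough illustration only (this is not their implementation, and I have not checked it against their test suite), the `packaging` library can already parse specifiers containing `!=` clauses, so the lower bound could be recovered by simply ignoring those clauses:

```python
# Sketch only: not the google-cloud-testutils implementation, just an
# illustration using the `packaging` library to find a declared lower bound
# while ignoring `!=` exclusions. Wildcards such as `==1.2.*` would still
# need special handling.
from typing import Optional

from packaging.requirements import Requirement
from packaging.version import Version


def lower_bound(requirement_line: str) -> Optional[Version]:
    """Return the lower bound declared by a requirement string, if any."""
    req = Requirement(requirement_line)
    bounds = [
        Version(spec.version)
        for spec in req.specifier
        if spec.operator in (">=", ">", "==", "~=")
    ]
    return min(bounds) if bounds else None


print(lower_bound("protobuf>=3.20.2,<6.0.0dev,!=4.21.0,!=4.21.1"))  # 3.20.2
```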
Another option is to avoid requirements that are that complex, but sometimes we can't get around it (especially when there is a broken dependency release that we need to weed out). In a check of the other googleapis repos, a declaration excluding a specific version shows up ~450 times, typically (but not exclusively) for `google-api-core`, `google-auth`, and `protobuf`.
> do you have strong feelings about this Issue?
I think we've had a couple of times in the last year where setup.py and the constraints files got out of sync. These cause real customer issues: users end up with incompatible versions because we didn't test with our minimum advertised dependency versions.

IMO some automation is a high priority to prevent such errors in the future. A contribution to google-cloud-testutils seems like the right approach, but there are certainly alternatives, like using the uv package manager to install instead of pip, since uv has a flag to install minimum supported versions instead of the latest.
See: https://github.com/googleapis/python-test-utils/pull/8 and https://github.com/googleapis/python-bigquery/pull/1972#issuecomment-2221243127
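On the uv alternative mentioned above, a rough sketch of what a minimum-versions test session could look like. This assumes uv's `--resolution lowest-direct` strategy and uses a placeholder test path; it is not something we run today:

```python
# Sketch only: an alternative to the constraints-file approach, using uv's
# resolution strategy to install the lowest versions allowed by setup.py for
# our direct dependencies before running the unit tests.
import nox


@nox.session(python="3.9")
def unit_lowest(session):
    """Run unit tests with direct dependencies resolved to their lowest bounds."""
    session.install("uv", "pytest")
    # Point uv at this session's virtualenv so it installs into the right place.
    session.run(
        "uv",
        "pip",
        "install",
        "--resolution",
        "lowest-direct",
        ".",
        env={"VIRTUAL_ENV": session.virtualenv.location},
    )
    session.run("pytest", "tests/unit")  # placeholder test path
```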