AcademySoftwareFoundation / rez

An integrated package configuration, build and deployment system for software
https://rez.readthedocs.io
Apache License 2.0

[Feature] Overrule version #637

Closed mottosso closed 5 years ago

mottosso commented 5 years ago

Goal

Improve beta-testing experience of packages in production.

Related

Motivation

Oftentimes, a package is being developed but isn't yet ready for studio-wide consumption. At the moment, you could utilise package_filter to exclude anything with a certain suffix, like *.beta, and then allow the end-user to patch an environment without this filter.

package_a/package.py

name = "package_a"
version = "1.0.0"
requires = ["package_b-1"]

package_b/package.py

name = "package_b"
version = "1.1.0.beta"
requires = []
$ rez env package_a-1 --exclude "*.beta"
# package_a-1.0.0
# package_b-1.0.0
> $ rez env --patch packageA-1.1
# package_a-1.0.0
# package_b-1.1.0.beta

This works well, is explicit, and is safe against accidentally picking up a beta version during the initial resolve.
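For reference, the exclude filter itself would live in the Rez config; a minimal sketch, assuming the excludes/glob rule format documented for package_filter (the file path is illustrative):

# ~/.rezconfig.py (a hypothetical user config)
# Hide any version ending in ".beta" from normal resolves; beta testers can
# resolve with this filter disabled.
package_filter = {
    "excludes": ["glob(*.beta)"],
}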

But what about major versions? To support those, one would need to not mention any version at all, and that isn't specific enough for the majority of cases, where a specific version is intended.

requires = ["package_b"]

This then also applies to incrementing major versions of applications like Maya.

$ rez env spiderman maya
# maya-2018
> $ rez env --patch maya-2019
ERROR: Sorry, can't do it

Implementation

Enable a user to overrule Rez on which version of a package is used.

Before

$ rez env package_a-1
# package_a-1.0.0
# package_b-1.15.0

After

$ rez env package_a-1 +package_b-2.1.2
# package_a-1.0.0
# package_b-2.1.2
> $ rez env --patch +package_b-3
# package_b-3.1.2
>> $

Technically, any +package would not take part in the solve, but would be unconditionally added/replaced post-solve.

E.g.

from rez.resolved_context import ResolvedContext
from rez.packages_ import iter_packages

context = ResolvedContext(["package_a-1"])
package_b = next(iter_packages("package_b", "2"))

# Unconditionally overrule
context.resolved_packages["package_b"] = package_b
instinct-vfx commented 5 years ago

I am not sure I fully understand the rationale behind this. Rez gives you a configured environment after you tell it which packages you want. Rez does not answer the question of which packages you want. And this feels a bit like an aim to make Rez do both.

i.e. the example you posted above: rez env spiderman maya

This request tells Rez that you don't care about the maya version at all, but you do (as you then want to switch to Maya 2019):

rez env --patch maya-2019

I am maybe just missing the point here, but taking packages out of the solve and then unconditionally adding/replacing them sounds like asking for trouble to me.

As a side note, how we handle package lifecycle is that we have dedicated repositories for dev, staging and production, so getting dev packages is a matter of modifying your packages_path (which also gives us more rigid control, preventing the production floor from using non-stable releases by simply not giving them read access to the repo). But I can see how that may be overkill for a lot of users.

mottosso commented 5 years ago

Thanks for chiming in @instinct-vfx. The spiderman example assumes e.g. a requires = ["~maya-2018"] on the part of spiderman/package.py, in which case Maya is only included when asked for explicitly. Sometimes, I'd like to test the pipeline using Maya 2019, using all of the same packages as for 2018. Likewise, when I've got Maya 2018 but have a new (major) version of some library, I'd like to kindly overrule whatever Rez decided to give me.

It's a debugging mechanism, something to avoid having to circumvent Rez in other ways, such as adding to the PYTHONPATH yourself post-solve, and it keeps everything nice and tidy.

In my specific use case, artists have grown accustomed (via a tool other than Rez) to being able to get a resolved environment, but then overrule any version as they please. To use the resolver as a way to get most of the way there, and then get out of the way.

Without this feature, and in this particular instance, Rez is being overly protective of these artists and forces developers to get involved to change Rez paths or update packages.

instinct-vfx commented 5 years ago

Just to help me get it: spiderman is a "dummy" package that holds a show's desired packages? Because if any actual package specifies a requires, then that should be a technical requirement, and patching that would be kinda pointless, no?

mottosso commented 5 years ago

I think it's referred to as "bundle" in Rez-speak, but yes, a kind of dummy package with nothing but requirements.

But I think the same applies to any package, even those that have particular build requirements. Have you got a package requiring qt-4 but want to see whether it builds with qt-5?

$ rez env my_qt4_package
> $ cmake .
# Success!
> $ exit
$ rez env my_qt4_package +qt-5
> $ cmake .
# Maybe!
instinct-vfx commented 5 years ago

That does not reflect the way I would handle that use case. I am very interested to hear other people's opinions and approaches! If I am working on a package and want to update its dependencies, I would simply do just that in package.py and run rez-build.

If there is no real blocker for a specific version of a dependency, I would also not set it to require qt-4; I'd set it to qt>4 and then specify the version needed while building.

Furthermore, for something like qt (or a DCC host), I would rather have it as a variant than a requirement, because I often want the package version to support both qt4 and 5 (or Max 2019 and 2020). In that case, to test building against a new qt version, I'd simply add that variant and build it.
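A minimal sketch of what such a variant-based package could look like (the name, version and requirements here are made up for illustration):

# package.py (hypothetical)
name = "my_qt_tool"
version = "1.0.0"

# One variant per Qt major version, so a single release of this package can
# resolve against either; the solver picks the variant that overlaps with the
# rest of the request.
variants = [
    ["qt-4"],
    ["qt-5"],
]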

mottosso commented 5 years ago

That's fair. This feature isn't really intended for package authors, but for users of Rez. In particular, artists who aren't interested in, or able to, edit a package.py. For them to have some way of explicitly adding a package.

Another use case is this.

And so the package author can carry on polishing up 2.1 and eventually release it to the crew.

mottosso commented 5 years ago

To complete the example..

Before

Hours to days later

Hours to days later..

You get the gist. :)

instinct-vfx commented 5 years ago

This would only occur if you require a specific version in a package, which I would only do if there is a technical limitation (which in turn would mean that reverting would not work anyway).

I would put the version of the dependent package in the request, not the package. That way the end-user can simply adapt their request rather than requiring package changes.

Our approach is that packages define technical dependencies and requirements and the request handles what packages an end-user wants.

mottosso commented 5 years ago

Our approach is that packages define technical dependencies and requirements and the request handles what packages an end-user wants.

That's a fine approach, nothing wrong with that.

This feature is primarily for when you aren't managing those user-requests separately, but alongside other packages in your repository, like a project or asset.

nerdvegas commented 5 years ago

"""Our approach is that packages define technical dependencies and requirements and the request handles what packages an end-user wants."""

This really does encapsulate what rez is all about - distinguishing between the two is a strong design decision in rez, and has been from day one. Providing built-in support for circumventing its main design goal - giving the user a configured env that meets the technical requirements defined by the packages in the request - doesn't sit well for me. Furthermore, it will add non-trivial complexity - it will affect the solver and its graph generation for starters.

Wrt example given above:

$ rez env my_qt4_package
> $ cmake .
# Success!
> $ exit
$ rez env my_qt4_package +qt-5
> $ cmake .
# Maybe!

I agree also that this is not a typical workflow. For starters, it doesn't make sense to simply run cmake within a rez-env like that. The env that rez-build configures, and the one that rez-env configures, are not necessarily equivalent. Not only are there different sets of package requirements that are used (build_requires and private_build_requires), but packages can also change their requirements depending on whether a build is taking place (via late-bound requires). The typical workflow for the case you describe is simply to get a local copy of the package in question, change its requirements to qt-5 in its package definition, and rebuild.

I do understand the motivation behind this request though, don't get me wrong. But that has to be weighed up against two things - (a) the added complexity necessary to achieve it, and (b) making it even possible for rez to produce a configured env that may not function as intended. One thing rez is great at currently, is giving devs and end-users a guarantee about the resolved env - if there's a problem with it, it's because of a faulty package definition (whether that be sloppy requirements, bad commands or whatever). A feature like this could easily be abused and end up in production.

mottosso commented 5 years ago

Providing built-in support for circumventing its main design goal - giving the user a configured env that meets the technical requirements defined by the packages in the request - doesn't sit well for me.

I hear you. I can think of no other comment than this being an expansion of what Rez is able to do and be used for. It may not fit the original design, and I would understand if that was reason enough not to include it.

Having said that, here's a working version of this feature that doesn't affect solver or graph generation.

from rez import resolved_context, packages_

context = resolved_context.ResolvedContext(["myPackage"])

original = context.resolved_packages[0]  # e.g. python-2.7
replacement = next(packages_.iter_packages(original.name))  # e.g. python-3.6

context.resolved_packages.remove(original)
context.resolved_packages.append(replacement)

context.execute_shell(command="python")  # launch Python 3.6

At least this helps eliminate any technical reason to not consider it.

bpabel commented 5 years ago

That will replace the one package, but if the dependencies are different between the versions, those won't get changed.

mottosso commented 5 years ago

Yes, that's exactly the point!

Sorry if this wasn't clearer; this feature is to overrule the version of one specific package in an already-resolved list of packages.

nerdvegas commented 5 years ago

This working version is fundamentally broken. A context's resolved_packages is not a list of Package objects, it's a list of Variant objects. Packages are collections of variants. An attempt to source the below context will fail, probably with an "object Package has no attribute 'root'" error. Which makes sense, because only variants have installed payload locations, not packages. If this example did indeed work, it's pretty much a fluke, and will only have done so with a package that doesn't do anything useful (in other words, doesn't refer to its payload in any way).

Brendan has already echoed my initial confusion - your request sounded as though you wanted a resolve to act as though every request for mypkg-A was turned into mypkg-B, which would have caused a lot of issues. The approach illustrated here is just replacing a specific variant, and in effect ignoring that variant's dependencies.

There are still multiple problems with this however:

  1. The suggested syntax doesn't give enough control.
  2. Introducing this overrule syntax would break that design decision.
  3. The graph representation of the altered Context is now incorrect.
  4. It's specifying a package, not a variant. And since it's circumventing the solve, there's no way to select the most appropriate variant, because there isn't one.

Really though the big issue is that there's no way to sensibly select a variant to replace with. This feature might work for simple cases (packages with one variant), but it also has to work with other packages - and the majority have variants.
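To make the variant point concrete, here is a small sketch using the same API as the examples above ("python" standing in for any multi-variant package):

from rez.packages_ import iter_packages

# Even after choosing a package to swap in, you would still have to choose
# one of its variants by hand - and each variant carries its own requirements.
package = next(iter_packages("python"))
for variant in package.iter_variants():
    print(variant)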

A


mottosso commented 5 years ago

Now we're talking!

A context's resolved_packages is not a list of Package objects

Yes, you are right. I mistakenly did not include this line. Here's the updated version.

from rez import resolved_context, packages_

context = resolved_context.ResolvedContext(["myPackage"])

original = context.resolved_packages[0]  # e.g. python-2.7
replacement = next(packages_.iter_packages(original.name))  # e.g. python-3.6
replacement = next(replacement.iter_variants())  # e.g. windows-10/python-3.6

context.resolved_packages.remove(original)
context.resolved_packages.append(replacement)

context.execute_shell(command="python")  # launch Python 3.6

your request sounded as though you wanted a resolve to act as though every request for mypkg-A was turned into mypkg-B, which would have caused a lot of issues.

Yes, this is exactly right. I actually don't see any issues; this is an action explicitly performed by the user of Rez, picking an exact version. If anything were to break (which it very well may) the reason is clear, and that's the idea of testing it in the first place. "Does it break?"

What issues did you have in mind?

The suggested syntax doesn't give enough control.

Yes and no, doesn't it provide the same level of control as any other request?

$ rez env python  # What variant?

Introducing this overrule syntax would break that design decision.

Perhaps adjusting syntax isn't the way to go. Considering this is only relevant post-solve, perhaps it's better suited for an in-context type of command.

$ rez env python-2.7
> $ rez env --add python-3.6

Which, like --patch, adds python-3.6 to the list of packages regardless of what already exists - replacing it if need be - but unlike --patch does so disregarding any constraints. Essentially a post-solve modifier, like the above example implementation.

The graph representation of the altered Context is now incorrect.

Are you referring to the visual representation? I'm sure it's within our ability to merely search-and-replace python-2.7 for python-3.6, and perhaps colour it magenta or the like to highlight that it stands out.

It's specifying a package, not a variant. And since it's circumventing the solve, there's no way to select the most appropriate variant, because there isn't one.

This is a real issue however. To pick an example, if I had a python-3.6 package for both Windows and Linux, it's possible the example solution would add a Linux package to a Windows environment.

Spontaneously, I'd imagine something along these lines to resolve (pun!) that.

from rez import resolved_context, packages_

context = resolved_context.ResolvedContext(["myPackage"])

original = context.resolved_packages[0]  # e.g. python-2.7
replacement = resolved_context.ResolvedContext(["python-3.6"])
replacement = replacement.resolved_packages[0]

context.resolved_packages.remove(original)
context.resolved_packages.append(replacement)

context.execute_shell(command="python")  # launch Python 3.6

Now the request goes through the same scrutiny as a resolve, without actually being constrained by a prior request.

Finally, I'd like to bring your attention back to what problems this feature is meant to solve. It's possible there are other means of addressing those that overrule (double pun!) this feature; that would be more relevant than any technical means of implementing this particular approach. Let me know what you think!

bpabel commented 5 years ago

Applying any changes like this post-solve is just so unlikely to actually work in all but the simplest of cases, where packages have no dependencies and few variants. There's no point switching out package-1 for package-2 if I'm not updating any of its dependencies.

I really only see this being useful if it could be applied pre-solve. Something that would function more akin to the CSS !important tag, where it gets applied regardless of any other requests for that package. That would be a much larger change though -- to the Solver, the Requirement, and all the code that deals with them.

mottosso commented 5 years ago

Ok, to ensure we all have the same picture, and considering the little example solution helped clarify the goal, here's another step towards a more complete picture.

https://user-images.githubusercontent.com/2152766/58946617-3cf79b80-877e-11e9-8887-df9a92cb1851.gif

In this example, I'm leveraging Rez to help resolve a context, and then taking control over what packages I actually end up using. Now Rez is just a tool, like any other, helping a user and developer along the way to a desired context.

nerdvegas commented 5 years ago

I agree with Brendan that this approach - replacing a variant post-resolve - is unlikely to work in anything but the simplest of cases.

The example given only takes into account the simplest of situations, where a package has variants per platform. This is a very narrow case - it is common for packages to have many more variants than this, based on a multitude of their requirements.

Right now we do not have the ability to explicitly specify a variant in a package as part of a request. This is actually something we do need, for reasons unrelated to this issue. Consider the following package:

name = 'foo'
version = '1.0.0'
variants = [
  [],
  ["maya-2017"],
  ["houdini-18"]
]

These variants are not mutually exclusive. There are rules about which one will be chosen, but it quickly becomes virtually impossible to intuitively know which variant you're going to end up with. There are cases where, for eg, maya is present in a resolve, but you don't end up with the maya-based variant of a package. What is needed (and demonstrably needed, today) is a syntax for narrowing down requests for varianted packages. For example, perhaps the request "rez-env foo/maya" would force selection of a foo variant that overlaps with the request for "maya" - in other words, the ["maya-2017"] variant.

I see the feature explained above as a prerequisite for (and so a blocker to) the feature you're requesting here, because without the ability to specify the variant you want to replace with, this would only be useful in certain narrow cases. Because of this, it would be misleading to have first-class support for it in the API/cli tools.

Furthermore, aside from the reasons given above, this is also only useful if the new package getting tested doesn't have differing dependencies to the version getting replaced. Because if it does, then the dependent package versions in the resolve would be wrong, and the package isn't going to work anyway.

If you haven't already, you should read: https://github.com/nerdvegas/rez/wiki/Variants#mutual-exclusivity

I would also suggest that if you have an interest in this kind of thing, then the variant selection feature described above would be an excellent addition to the project.

A


mottosso commented 5 years ago

Thanks guys! I spot three lines of thought so far.

  1. It won't work because of variants
  2. It won't work because of dependencies
  3. It doesn't cover the majority case

(1) and (2) are already solved by the previous example, I think? By using another ResolvedContext, you'll take into account any dependencies for the replaced package, alongside variants. We could expand on it to include not only one of the resolved packages, but all of them.

from rez import resolved_context, packages_

context = resolved_context.ResolvedContext(["myPackage"])

original = resolved_context.ResolvedContext(["python-2.7"])
original = original.resolved_packages
replacement = resolved_context.ResolvedContext(["python-3.6"])
replacement = replacement.resolved_packages

for package in original:
  context.resolved_packages.remove(package)

for package in replacement:
  context.resolved_packages.append(package)

context.execute_shell(command="python")  # launch Python 3.6

And when the day comes where you're able to request variants too, you'd simply include it.

original = resolved_context.ResolvedContext(["python-2.7/maya"])

However, one issue with this - both now and with a variant request - is that it obfuscates the final result. You now can't know for sure which packages are actually added or removed, which brings me to the final point.

(3) could be turned on its head. Because of the simplicity of the feature, it's also intuitive. The user knows what to expect. And perhaps most importantly, if anything does break - either due to differing dependencies or variants - the reason is apparent; it's because you overruled the almighty Rez and are now paying the price.

To reiterate, the idea of this feature is to overrule Rez. To take manual control. To fill in for parts where Rez needs to be a means to an end, not judge and jury.

nerdvegas commented 5 years ago

(1) and (2) are already solved by the previous example, I think? By using another ResolvedContext, you'll take into account any dependencies for the replaced package, alongside variants. We could expand on it to include not only one of the resolved packages, but all of them.

No, not at all. The only way to guarantee that a resolve contains a package's requirements is to include that package in the request, and perform a resolve. Any attempt to replace a variant post-solve will not guarantee that its requirements are met. It's literally the job of the solver to do this. The only way you could try, as you allude to, is to perform a second resolve, adding the resolve list of the first request to that solve. But that won't work either, because there's every chance that other packages in the resolve list are just going to pull in the previous version of the package you're trying to replace again, and a collision results! Ie - the second resolve is just going to pull in python-2.7 again, and will fail.

"""you'll take into account any dependencies for the replaced package, alongside variants""" This doesn't make any sense.


There are a number of reasons why this won't work. One is that you're assuming the dependencies of the variant you're replacing aren't shared by other packages in the resolve. That is not a safe assumption, at all. Furthermore, doing an isolated resolve of the replacement package could easily pick up completely the wrong variant, and then inject any number of packages into the first solve, that now cause conflicts.

This does not work, and there's no way to intuitively understand in what way the resulting environment could be broken. There is simply too much information to take into account.


mottosso commented 5 years ago

the second resolve is just going to pull in python-2.7 again, and will fail.

Oh, no it won't? The second resolve is independent of the first I think, or at least it should be. By asking for just a single package, there is no possibility of collision or failure - so long as the package itself exists. Am I reading this right?

This doesn't make any sense.

Sorry, I'll try and give an example. This was in response to Brendan's:

There's no point switching out package-1 for package-2 if I'm not updating any of its dependencies.

Which I understood as, if package-1 has the following dependencies.

PyQt4-4.7
maya-2017

And package-2 had:

PySide2-5.11
maya-2019

That there wouldn't be any point in updating to package-2 since it would likely break anyway. This would have been true if we only swapped in package-2 itself, but if we swap in every package from a newly resolved context, then those dependencies are included too.

Before

replacement = resolved_packages[0]

After

replacement = resolved_packages

Did I misinterpret the problem?

There are a number of reasons why this won't work. One is that you're assuming the dependencies of the variant you're replacing aren't shared by other packages in the resolve. That is not a safe assumption, at all.

Yes, but again, manual control. Safety is great, but it's safe (pun!) to skip when overruling. I might even say it was expected. Auto-pilot = Off.

doing an isolated resolve of the replacement package could easily pick up completely the wrong variant, and then inject any number of packages into the first solve, that now cause conflicts.

Yes, I agree, overruling should only ever affect a single package, not any of the replacement or original dependencies.

This does not work, and there's no way to intuitively understand in what way the resulting environment could be broken. There is simply too much information to take into account.

The gif example I gave is the common case for us, if not a little extreme. A more common case would be updating a version past its top-constraint, e.g. from 1.1 to 1.2.

It's likely true that there are far more complex examples where this would not work, but I'm not sure why they need to. If a package is irreplaceable, then so be it, and a package author will need to update and re-release a new version. If anything, it would encourage the package author to design his packages simply.

Technicalities aside, how would you address the original problem? How would you address an artist having an issue with version 1.1 that's been fixed in 1.2, without affecting surrounding artists? With this feature, it's a chat message away. What's the alternative?

instinct-vfx commented 5 years ago

I may be wrong, but I think the main problem with your approach in the gif is that you seem to be using Rez to manage both the request and the requirements/dependencies (by wrapping your request in another package). That means if you want to change your request, you have to change a package (which an artist typically cannot, or should not, do). And because updating a package is non-trivial, this feature would step in to reduce the pain, at quite some cost.

Instead you should only manage the technical limitations and dependencies in the packages (and Rez) and manage the request outside of Rez. In such a case the package may depend on python rather than python-2.7. Then the update is also just a chat message away: Will you please update your request to include python-3.6 and see if that fixes your problem?

mottosso commented 5 years ago

In such a case the package may depend on python rather than python-2.7. Then the update is also just a chat message away: Will you please update your request to include python-3.6 and see if that fixes your problem?

You've hit the nail on the head. This is exactly what I'd like to do, and what isn't possible without this feature.

Consider this.

A

# myPackage/package.py
name = "myPackage"
version = "1.0"
requires = [
  "python-2.7"
]

With rez env myPackage, I would get python-2.7.

Now consider your proposed solution.

B

# myPackage/package.py
name = "myPackage"
version = "1.0"
requires = [
  "python"
]

Assuming there was a python-3.6, then rez env myPackage would result in python-3.6 for everyone.

What I'd like to have happen is for A to yield python-2.7, and for me to be able to overrule that decision. Post-solve.

instinct-vfx commented 5 years ago

Assuming there was a python-3.6, then rez env myPackage would result in python-3.6 for everyone.

No it wouldn't. You need to manage the request. The request would first be rez-env myPackage-1.0 python-2.7. Everyone gets python-2.7. Want to test with 3.6? Message them to use rez-env myPackage-1.0 python-3 for example.

This could also be done using the --patch option of rez-env. So you may resolve a baked (.rxt) context and then patch only the versions that you update. --patch changes the request and re-resolves, if I am not mistaken.
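A rough sketch of that workflow through the Python API (package names and the .rxt path are placeholders, and the manual re-request at the end only approximates what --patch does):

from rez.resolved_context import ResolvedContext

# Resolve once and bake the context to disk for everyone to share.
context = ResolvedContext(["myPackage-1.0", "python-2.7"])
context.save("/shared/contexts/show.rxt")

# Later, anyone can load the baked context and re-enter exactly that env.
baked = ResolvedContext.load("/shared/contexts/show.rxt")

# To test python-3.6, re-resolve with only that part of the request changed -
# roughly what running `rez env --patch python-3.6` from inside the context does.
patched = ResolvedContext(["myPackage-1.0", "python-3.6"])
patched.execute_shell(command="python")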

mottosso commented 5 years ago

You need to manage the request.

In this small example, that would of course be trivial, and makes sense for e.g. Python and Maya etc. But that's just an example; any package could get overruled.

From a small request, like rez env myPackage python-3.6, there will be many packages resolved, and those are the ones I'm interested in overruling.

This was a slippery feature to describe! So to make sure we're not dismissing anything due to misunderstanding, here's another example.

$ rez env alita maya
resolved by me@here, on Fri Jun 07 11:08:44 2019, using Rez v2.30.0b1

requested packages:
alita
maya
~platform==windows       (implicit)
~arch==AMD64             (implicit)
~os==windows-10.0.18362  (implicit)

resolved packages:
alita-0.3.17          c:\users\marcus\packages\alita\0.3.17              (local)
base-1.2.1            c:\packages\base\1.2.1
core_pipeline-2.1.0   c:\packages\core_pipeline\2.1.0
ftrack-1.0.0          c:\packages\ftrack\1.0.0
gitlab-1.2.0          c:\packages\gitlab\1.2.0
maya-2018.0.2         c:\users\marcus\packages\maya\2018.0.2             (local)
maya_base-1.0.8.beta  c:\packages\maya_base\1.0.8.beta
mgear-2.4.1           c:\packages\mgear\2.4.1
python-2.7            c:\packages\python\2.7
welcome-1.0           c:\packages\welcome\1.0

> $

All good so far. Now I'd like to try this context using a newer version of base.

> $ rez env --add base-2.0
requested packages:
alita
maya
+base-2.0
~platform==windows       (implicit)
~arch==AMD64             (implicit)
~os==windows-10.0.18362  (implicit)

resolved packages:
alita-0.3.17          c:\users\marcus\packages\alita\0.3.17              (local)
base-2.0.3            c:\packages\base\2.0.3                             (overruled)
core_pipeline-2.1.0   c:\packages\core_pipeline\2.1.0
ftrack-1.0.0          c:\packages\ftrack\1.0.0
gitlab-1.2.0          c:\packages\gitlab\1.2.0
maya-2018.0.2         c:\users\marcus\packages\maya\2018.0.2             (local)
maya_base-1.0.8.beta  c:\packages\maya_base\1.0.8.beta
mgear-2.4.1           c:\packages\mgear\2.4.1
python-2.7            c:\packages\python\2.7
welcome-1.0           c:\packages\welcome\1.0

>> $ 

Again, I'm all ears as to how this can be achieved in any other way. The key aspect is that the user is able to do this overruling themselves, so it can't require going through a package author.

instinct-vfx commented 5 years ago

The key aspect is that the user is able to do this overruling themselves, so it can't require going through a package author.

You do so by removing that information from the package. Where is base-1.2.1 defined as a dependency? I assume that's in the alita package, which is defining a show? And that's what I mean: you are managing your request as a package.

Instead make the request:

base-1.2.1 core_pipeline-2.1.0 ftrack-1.0.0 gitlab-1.2.0 maya-2018.0.2 maya_base-1.0.8.beta mgear-2.4.1 python-2.7 welcome-1.0 

And to version up base:

base-2.0 core_pipeline-2.1.0 ftrack-1.0.0 gitlab-1.2.0 maya-2018.0.2 maya_base-1.0.8.beta mgear-2.4.1 python-2.7 welcome-1.0 

This will fail only if any of the packages explicitly requires base to be <2 or similar. And if there is a technical reason for such a restriction, then it should fail.

mottosso commented 5 years ago

There are a few too many assertions there.

  1. base-1.2.1 is a requirement of mgear-2.4.1
  2. mgear-2.4.1 is a requirement of core_pipeline-2.1.0

What now?

Perhaps more importantly, users won't be happy having to type that combination of packages out. That is probably not what you expect either; instead, you expect a second system to be built that manages the requirements for the two original packages - alita and maya - to which my response would be "a system like Rez?", to which you'd reply "Yes, precisely!", and then we get stuck in a loop, CPUs run hot until eventually the power cuts. Fade to black.

And if there is a technical limitation to do so, then it should fail.

Sigh. Am I really not getting through here? Overruling should not be overruled by Rez, that's the point.

It's one thing to disqualify an idea based on merit. But I think it's in each of our interests to make sure we disqualify the same idea.

instinct-vfx commented 5 years ago

Sigh. Am I really not getting through here?

That's a mutual feeling ;)

The system managing the request is, in my mind, very different from Rez. It is solving (pun intended) a different problem, I think. I'd be interested in feedback from people that have such a system.

I think you are missing my point in regards to "should fail". A technical limitation would be a python module that is incompatible with python-3, in which case overruling does not make sense. If it IS compatible, it should not require python-2.

Why is base-1.2.1 a requirement of mgear-2.4.1 - why is it not just base? If it is really expected not to be compatible, then overruling is pointless as it will break; if it is not, then the dependency is too specific.

I don't really understand how using a different request is different from the proposed syntax in the original post. You can also use pre-resolved contexts with the patch feature, or make it a custom launcher, or use the rez-gui.

instinct-vfx commented 5 years ago

P.S. I can see that there are cases where you may have a lot of dependencies that need updating, and how that can be a hassle in dev, but for production I don't really think so.

bpabel commented 5 years ago

Perhaps we need a new way for defining "requests" that has first-class support in rez.

In the past, I've generally stored "requests" as rez packages (what I've heard referred to as "bundles" -- a package that only has requirements), since that was the easiest way to manage them and simplify the command line usage with rez, and I could use the same build, test, and release workflow that I used for all the other packages. The downside is that it requires some boilerplate to manage, and it's more of a developer interface, not really one for end-users.

More recently, I've begun defining the launch requests in Shotgun (for use in Shotgun Desktop), but even that's not really ideal, since it only works with the launcher.

@instinct-vfx It seems like you think this should all be managed in a separate system, but specifying a set of dependencies for an environment is not that different than specifying them for a specific software package. And again, it's just another thing that every studio is working on separately.

The rezgui project seemed like an attempt at this (though I admit I haven't used it or looked into it very much).

nerdvegas commented 5 years ago

How requests are defined and stored is outside of the scope of rez. Rez's job is to give you the correct env given a request for a list of packages. The only storing of requests that occurs is in Context objects, which makes sense because a Context bakes out all info about how a specific resolve was solved.

You can store requests as packages in and of themselves, these so-called "bundles", but I don't think this is a good approach, for reasons that I've gone into at length in a separate thread. But you are free to do this if you want, because it's a valid package definition.

Rez-gui is not an attempt at storing requests. It's just a UI for resolving environments. It's particularly useful for debugging resolve issues - cases where a resolve doesn't give you the version of a package you expected (not because of a bug in rez, but because of an interaction between packages that wasn't taken into account).

"""specifying a set of dependencies for an environment is not that different than specifying them for a specific software package"""

They are completely different, and intermingling the two is a bad idea. The very reason rez was written is because of experience I had with two separate systems, at two separate studios (one of them major), where the concept of production management (what packages should we use in this shot?) and package management (if I require maya-2019 and anim_utils 2 or greater, how do I construct that env?) were closely coupled.

Specifying a set of dependencies for an environment is defining what you desire to have on a given show or shot, and this tends to change over time. Specifying a set of dependencies for a package is defining what that package needs in order to function correctly, and this is immutable.

Back to this thread. I'm just going to say it again - any attempt to add or replace variants post-resolve will not work. The only way to construct a resolve where the stated technical needs of every package (ie its requirements) are met, is to actually perform a resolve, because that's what the solver does. Wanting to do this to make it easy to test and update to a new major version (for eg) is pointless, because you've now created a situation where any of three different problems could cause something not to work: the wrong variant may have been selected, the replaced variant's requirements may no longer be met, or the new version itself may be broken.

How is anyone supposed to meaningfully test a new package in this scenario? There is too much uncertainty to know whether an issue is because of the new version, or because of a misconfigured env. Rez's entire reason for being is to give a guarantee that it constructs envs that meet the stated technical requirements of the packages requested. Breaking that guarantee is not something I am interested in supporting. Both systems I mentioned, prior to rez, suffered this problem - you never knew if a crash was due to a bug, or a misconfigured env. Removing even the possibility of the latter changed the game completely, and for the better.

The only way you could meaningfully support this sort of behaviour is to introduce a feature where you can dynamically redefine a package's requirements, before the solve. Basically, you would want to say, "for every request for 'maya-*' that you see in a package, replace it with 'maya-2019'". This way, you aren't trying to munge something into a resolve that's already happened - instead, you're changing package definitions before they're ingested into the solve. But this approach is problematic also - you're giving people the ability to alter the stated technical needs of other packages, when they may know nothing about those other packages. The guy who specified python-2.7 in his package may have done so for good reason. He might know that in python-3, even if his code is compatible, there's going to be some issue introduced that has not yet been taken into account.

With a major version change, standard practice is to change affected packages' requirements locally, rebuild, and test. How you distribute that test version is up to you, rez gives you a lot of options. You could rez-build --prefix to some test package path, and tell your artists to add that to REZ_PACKAGES_PATH, re-resolve, and test. You could do what Method does - we release *.dev package versions, and we have a default package filter that ignores these. To test, artists can turn that filter off via a simple cli arg. If you're using suites, you could just build a local suite with your new test versions, and point your artists at that.
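As a sketch of the "test package path" option (the paths below are placeholders; packages_path and REZ_PACKAGES_PATH are the standard settings, the values are made up):

# rezconfig.py for a tester
packages_path = [
    "/tmp/test_packages",   # e.g. populated via `rez-build --prefix /tmp/test_packages`
    "/studio/packages",     # the normal release repository
]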

A


bpabel commented 5 years ago

How requests are defined and stored is outside of the scope of rez.

Is it though? Is this not what rez-suite was created to do?

nerdvegas commented 5 years ago

Rez-suite is a way to store a collection of contexts and expose all their tools under a single bin/ dir as wrappers. It's a way to give end-users a homogeneous list of tools that each run within their own correctly configured env. Fairly analogous to multiple activated virtualenvs.

I mean look, sure we could perhaps introduce some new file format that contains just a request, like .rrq (Rez ReQuest) or something. But it'd just be a json/YAML containing the request list, and optionally extra config settings like package filters, timestamping and so on. There may be cases where this would be useful.

But anything fancier than that, like being able to override the rrq, would be immediately going outside of the scope of rez. That would be going down a road where you're implementing a production management system. Rez is already big enough, but big as it is, it's still staying true to its defined goal. Production management is expressly not its goal. It would be fantastic to have a separate project that does this though, and uses rez under the hood - in fact this is exactly what many studios have their own version of, Method included.

A


smaragden commented 5 years ago

I haven't read every word of this thread but have tried to follow it. I want to chime in on how we handle package requests at Important Looking Pirate.

We handle package requests based on different contexts (e.g. filesystem location, user, department). We are doing this in several ways (we are Linux based).

Filesystem

If you type maya (or any command) in the terminal at any location, we use the command_not_found_handler in bash, which defaults to a defined function called __execute. What __execute does is investigate the context the command was called in, by default $PWD. It will check the file tree upwards until it finds a .ilp directory which contains context info. The context info contains configurations of what a resolve for maya means in that context: it could mean maya-2018, mtoa-3 if the user calling is a lighter, or blender if you try to be funny. And if it can't find a context, it will fall back to rez-env maya -c maya.

On farm

On the farm we usually store the resolve the DCC was launched in, with a timestamp, and use it to resolve the job submitted to the farm. But we can also use the __execute function and pass it a path, like __execute --path /path/to/project/shots/shot1 maya "$@", to launch a command in a filesystem-specific context.

Configuration

We configure these "contexts" in a way that always requires a package-major.minor version, but doesn't allow setting the patch version, as we need to be able to pick up fixes. The show supervisor can configure the context packages and override them on a user or department basis (it's a python script, so you can do whatever you want). An example would be something like (pseudocode):

if argv[0] =="maya" and user.department == "light" and shot == "testshot02":
    packages.replace("mtoa", "mota-beta-4")

or

if user.name == "some_testuser":
    packages.replace("maya_base", "maya_base-master")

We can also use the command_not_found_handle to create context-aware alias requests like (very pseudocode):

if argv[0] == "maketx":
    packages = ("oiio",)
    return rez.resolved_context(packages).execute_shell(*argv[1:])
elif argv[0] == "die":
    return system("reboot now -h")

Package versioning

We also have rules for how we name our rez packages to make sure we have control over resolves. A problem in the past could be that you had a package called clarisse-3.0-beta and then you got clarisse-3.0. The problem would be that if you requested clarisse-3 you would get clarisse-3.0-beta. Now we always prefix the version, and you need to explicitly select such a "special version".

My Conclusion

So the conclusion here is that we don't mess with the rez resolve; we mess with the request. And I think that is what is argued in this thread: whether to mess with the request or mess with the resolve. For us, messing with the request has worked great for years, and I have never felt the need to mess with the resolve. But you need to set up a system that handles the requests, and not let the users do that by default.

We do make it easy for users to change the request though, by printing a copy-paste friendly line (rez-env pkg1 pk2 pkg3 ...) with every resolve, in case the user wants to change the request - but then they are on their own.

And I do remember I had frustrations before, when the resolve didn't work with me. But it all got resolved (pun intended) by managing the requests.

Just my two cents /Fredrik Brännbacka

smaragden commented 5 years ago

I should add that we also make sure we don't overrestrict the requirements in our packages.

we often do something like:

build_requires = [
    "arnold-5.0.0|5.1.0|5.2.0|5.3.0"
]
variants = [
    ["platform-linux", "arnold-5.0"],
    ["platform-linux", "arnold-5.1"],
    ["platform-linux", "arnold-5.2"],
    ["platform-linux", "arnold-5.3"],
]

This is to build against the lowest binary compatible version but allow any version to resolve the package.

We also never add the patch version in our requirements, to give room for patch releases. But we enforce the major, and most often the minor, version in requirements.

nerdvegas commented 5 years ago

Nice use of command_not_found_handler! I didn't think to make use of this in our system.

That aside, what ILP are doing isn't dissimilar to what Method do. We have something similar but, by the sounds of it, fancier. We have a hierarchy of request definitions (called "profiles") that get merged together, so a show, for eg, can selectively redefine package versions. These don't reside on disk (they used to), but in a new service-based system that replicates across all our locations. It also keeps version history on all profiles, and that allows us to do some nifty stuff, like locking down profile definitions so that changes at the global level don't affect shows that are nearing completion (it also means we can roll back to any previous state). These features mean that our system for defining package requests for a given show/shot/app/etc does not bear any resemblance to packages.
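Purely as an illustration of that "hierarchy of profiles" idea (this is not Method's actual system; the levels and packages below are invented):

# Each level overrides the one above it; the merged result becomes the
# request that is handed to the resolver.
site_profile = {"maya": "maya-2018", "mtoa": "mtoa-3.1"}
show_profile = {"mtoa": "mtoa-3.2"}         # the show pins a newer mtoa
shot_profile = {"maya": "maya-2018.6"}      # the shot locks a specific maya

merged = {}
for level in (site_profile, show_profile, shot_profile):
    merged.update(level)                    # later (more specific) levels win

request = sorted(merged.values())           # ['maya-2018.6', 'mtoa-3.2']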

Thx A


mottosso commented 5 years ago

Thanks guys! It sounds like we're all up to speed with regard to what this feature is about, and that it isn't something of interest for Rez moving forward.

zachlewis commented 5 years ago

So... where exactly is all the love for Rez-suites? I’ve been vaguely following this thread, and I’d have thought by now there’d be more discussion re: suite implementation and use, because, imho, they serve as rez’s built-in support for many of the use-cases being discussed here.

That being said, Rez suites are an abstraction, logically and physically, from purely “run-time”, interactive / dynamic resolves — i.e., there’s some overhead in management — but not that much overhead. I feel like something like “dotenv” or similar could provide a quick-n-dirty means to hierarchically redirect $PATH to the desired production suite.

I guess my question is — are suites still supported? Are they just not used / well understood? Are more examples / expanded documentation needed, or is it not really worth the effort?

nerdvegas commented 5 years ago

Suites are totally still supported, so much so that they have their own documentation (ooh, fancy): https://github.com/nerdvegas/rez/wiki/Suites

Suites have their place, and we actually use them at Method in conjunction with our fancy dynamic system. Specifically, we use suites to configure global tools that we need to work at all times, even outside of a setshot-like environment. These cases don't require updates as often, and stability is more important. Suites are perfect for this.

There is also no reason you can't design a dynamic system that uses suites and is push-based. In other words, have a set of configs that define your package requests (ie, what we call "profiles"). Then, a change in any of these profiles would cause all affected suites to be re-created (ie, their contexts re-resolved). The upside to this is that all resolves are done ahead of time, so no artist will ever cop the hit of waiting for a fresh resolve when they run an app. The downside is that there are now other problems to solve - namely, how do you force the re-resolve of possibly hundreds of suites, due to a change to a global-level profile? And, given that rez caches resolves anyway - how is this better? You may as well have a dynamic system, which then has a 'warming' daemon that goes around forcing re-resolves in affected areas, so the resolve cache gets warmed.


mottosso commented 5 years ago

I had not considered suites for this, and I can't quite see how they apply, but will admit my experience with them is minor. @zachlewis Would you be able to provide an example? Are we still talking about users overruling a version?