maxvt opened this issue 8 years ago
@segiddins, thank you for the pointer. I have tried the fix in #45 and it does not completely resolve the issue; now I get a different conflict, reached in fewer steps than with the previous code:
- `ohai (= 1.1.12)` required by `user-specified dependency`
- `ohai (>= 0.0.0)` required by `sysctl-0.7.0`
- `ohai (~> 4.0)` required by `sysctl-0.7.0`
The last one is a phantom dependency; it does not exist in the input data. At first glance, this can be verified at https://supermarket.chef.io/cookbooks/sysctl/versions/0.7.0#dependencies
@segiddins, the problem seems to be that the possibility being swapped out has different requirements from the possibility being swapped in, but those requirement lists are simply combined.
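To make that concrete, here is a standalone sketch (using a hypothetical `Requirement` struct, not Molinillo's real objects) of how combining, rather than replacing, the requirement lists leaves a phantom constraint behind:

```ruby
# Hypothetical, simplified stand-in for a requirement; not Molinillo's API.
Requirement = Struct.new(:name, :constraint)

old_possibility_deps = [Requirement.new('ohai', '~> 4.0')]   # from sysctl-0.8.0
new_possibility_deps = [Requirement.new('ohai', '>= 0.0.0')] # from sysctl-0.7.0

# Buggy swap: the two requirement lists are simply combined...
combined = old_possibility_deps + new_possibility_deps
# ...so the phantom "ohai ~> 4.0" survives even though sysctl-0.7.0
# never asked for it.
phantom_present = combined.any? { |r| r.constraint == '~> 4.0' }

# Correct swap: the swapped-out possibility's requirements are dropped,
# leaving only what the swapped-in possibility actually declares.
replaced = new_possibility_deps
```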
If `fixup_swapped_children` is the correct place to do that processing, it seems to me there are two issues with the current code:
```
Creating possibility state for sysctl (>= 0.0.0) (24 remaining)
Attempting to activate sysctl-0.8.0
Activated sysctl at sysctl-0.8.0
Requiring nested dependencies (ohai (~> 4.0))
...
Creating possibility state for sysctl (= 0.7.0) (1 remaining)
Attempting to activate sysctl-0.7.0
Found existing spec (sysctl-0.8.0)
Fixing up swapped children for (sysctl)
```
@segiddins, what do you think of the following? This fixes my problem, but I'm not sure if there are unintended side effects. Based on your change #45.
```ruby
def fixup_swapped_children(vertex)
  payload = vertex.payload
  deps = dependencies_for(payload)
  debug(depth) { "Fixing up swapped children for (#{vertex.name})" }
  vertex.outgoing_edges.each do |outgoing_edge|
    @parent_of[outgoing_edge.requirement] = states.size - 1
    succ = outgoing_edge.destination
    # No space before the parenthesis here: `include? (a) && b` would
    # parse as `include?((a) && b)` and silently change the condition.
    if !deps.include?(outgoing_edge.requirement) && !succ.root? && succ.predecessors.to_a == [vertex]
      debug(depth) { "Removing orphaned spec #{succ.name} after swapping #{name}" }
      succ.requirements.each { |r| @parent_of.delete(r) }
      activated.detach_vertex_named(succ.name)

      all_successor_names = succ.recursive_successors.map(&:name)
      requirements.delete_if do |requirement|
        requirement_name = name_for(requirement)
        (requirement_name == succ.name) || all_successor_names.include?(requirement_name)
      end
    end
  end
end
```
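The orphan check in that method (`!succ.root? && succ.predecessors.to_a == [vertex]`) is easiest to see on a toy graph. The following is a hypothetical model for illustration, not Molinillo's DependencyGraph, and the edges are invented:

```ruby
# Toy predecessor map: package name => names of packages that depend on it.
# Invented for illustration; not Molinillo's DependencyGraph.
predecessors = {
  'ohai' => ['sysctl'],                 # only sysctl depends on ohai
  'yum'  => ['sysctl', 'packagecloud'], # yum has a second dependent
}

# A successor may be detached after swapping `vertex` only when `vertex`
# was its sole predecessor; otherwise another package still needs it.
def orphaned?(predecessors, succ, vertex)
  predecessors.fetch(succ, []) == [vertex]
end
```

So when `sysctl` is swapped, `ohai` can be detached, but `yum` must be kept as long as another package still depends on it.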
With that change applied, the log looks like this:

```
Requirements satisfied, combining and pushing state (packagecloud-0.2.5)
Creating possibility state for sysctl (>= 0.0.0) (24 remaining)
Attempting to activate sysctl-0.8.0
Found existing spec (sysctl-0.8.0)
Requirements satisfied, combining and pushing state (sysctl-0.8.0)
Creating possibility state for sysctl (= 0.7.0) (1 remaining)
Attempting to activate sysctl-0.7.0
Found existing spec (sysctl-0.8.0)
Dependencies for current vertex sysctl.0.8.0:
ohai ~> 4.0
Dependencies for possibility sysctl.0.7.0:
ohai >= 0.0.0
Fixing up swapped children for (sysctl)
Removing orphaned spec ohai after swapping sysctl
Activated sysctl at sysctl-0.7.0
Requiring nested dependencies (ohai (>= 0.0.0))
```
I found this chunk of debug code to be very helpful in seeing what's going on:
```ruby
def attempt_to_swap_possibility
  activated.tag(:swap)
  vertex = activated.vertex_named(name)
  debug(depth) { "Dependencies for current vertex #{name}.#{vertex.payload.version}:" }
  vertex.payload.dependencies.each do |d|
    debug(depth) { "#{d.name} #{d.constraint}" }
  end
  debug(depth) { "Dependencies for possibility #{name}.#{possibility.version}:" }
  possibility.dependencies.each do |d|
    debug(depth) { "#{d.name} #{d.constraint}" }
  end
  # ... rest of the method unchanged
end
```
@maxvt can you please make a PR with a test case so we can ensure we don't regress? Thanks!
```diff
diff --git a/lib/molinillo/resolution.rb b/lib/molinillo/resolution.rb
index d92b09d..1120cc7 100644
--- a/lib/molinillo/resolution.rb
+++ b/lib/molinillo/resolution.rb
@@ -356,11 +356,12 @@ module Molinillo
       # @return [void]
       def fixup_swapped_children(vertex)
         payload = vertex.payload
-        dep_names = dependencies_for(payload).map(&method(:name_for))
+        deps = dependencies_for(payload).group_by(&method(:name_for))
         vertex.outgoing_edges.each do |outgoing_edge|
           @parent_of[outgoing_edge.requirement] = states.size - 1
           succ = outgoing_edge.destination
-          if !dep_names.include?(succ.name) && !succ.root? && succ.predecessors.to_a == [vertex]
+          matching_deps = Array(deps[succ.name])
+          if matching_deps.empty? && !succ.root? && succ.predecessors.to_a == [vertex]
             debug(depth) { "Removing orphaned spec #{succ.name} after swapping #{name}" }
             succ.requirements.each { |r| @parent_of.delete(r) }
             activated.detach_vertex_named(succ.name)
@@ -371,7 +372,10 @@ module Molinillo
               requirement_name = name_for(requirement)
               (requirement_name == succ.name) || all_successor_names.include?(requirement_name)
             end
+          elsif !matching_deps.include?(outgoing_edge.requirement)
+            outgoing_edge.requirement = matching_deps.first
           end
+          matching_deps.delete(outgoing_edge.requirement)
         end
       end
```
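The heart of that diff is replacing the flat name list with a name-indexed grouping, so each outgoing edge can be matched against the new possibility's actual dependency objects and rewired when stale. A rough standalone illustration of the matching step (using a hypothetical `Dep` struct, not Molinillo's requirement objects):

```ruby
# Hypothetical stand-ins for Molinillo's requirement objects.
Dep = Struct.new(:name, :constraint)

# Dependencies declared by the swapped-in possibility.
new_deps = [Dep.new('ohai', '>= 0.0.0'), Dep.new('yum', '~> 3.0')]

# group_by(&:name) plays the role of group_by(&method(:name_for)) above.
deps_by_name = new_deps.group_by(&:name)

# An edge still carrying the swapped-out possibility's requirement...
old_requirement = Dep.new('ohai', '~> 4.0')

# ...is kept only if the new possibility declares that same requirement;
# otherwise it is rewired to the matching new requirement.
matching = Array(deps_by_name[old_requirement.name])
updated = matching.include?(old_requirement) ? old_requirement : matching.first
```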
Ought to fix it, but I'm hesitant to commit without a failing test
@maxvt ping on a failing Molinillo test case for this?
@segiddins Sorry, it took me a while to find some free time and learn how Molinillo testing works. I modified an existing test case, adding a new package with the same name; that scenario manages to hit this problem. Here's a log with a bunch of debugging output thrown in:
```
Creating possibility state for build-essential (~> 2.0.0) (1 remaining)
Attempting to activate build-essential (2.4.0)
Found existing spec (build-essential (3.2.0))
Dependencies for current vertex build-essential.3.2.0:
seven_zip (>= 0.0.0)
some_package (>= 1.0.0)
Dependencies for possibility build-essential.2.4.0:
7-zip (>= 0.0.0)
some_package (~> 0.1.0)
Fixing up swapped children for (build-essential)
Removing orphaned spec seven_zip after swapping build-essential
Deleting requirement seven_zip (>= 0.0.0) from parent_of
Considering requirement yum-epel (~> 0.3.0)
Activated build-essential at build-essential (2.4.0)
Requiring nested dependencies (7-zip (>= 0.0.0), some_package (~> 0.1.0))
Creating possibility state for some_package (~> 0.1.0) (1 remaining)
Attempting to activate some_package (0.1.0)
Found existing spec (some_package (1.0.0))
Dependencies for current vertex some_package.1.0.0:
Dependencies for possibility some_package.0.1.0:
Rewinding
Unsatisfied by existing spec (some_package (1.0.0))
Unwinding for conflict: some_package (~> 0.1.0)
```
Thanks, I'm looking into it now and I think I understand what's going on. No promises on when I can figure it out, though.
```diff
diff --git a/spec/spec_helper/index.rb b/spec/spec_helper/index.rb
index 9625e9e..16f807e 100644
--- a/spec/spec_helper/index.rb
+++ b/spec/spec_helper/index.rb
@@ -32,6 +32,7 @@ module Molinillo
           dependency.satisfied_by?(spec.version)
         end
       end
+      @search_for[dependency].dup
     end

     def name_for(dependency)
```
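For context on why a `.dup` there could matter (my reading, not confirmed by the diff itself): if `search_for` memoizes its result and hands back the cached array itself, any caller that mutates the returned array corrupts the cache for every later lookup. A minimal demonstration of that class of bug, with invented names rather than Molinillo's real TestIndex:

```ruby
# Minimal illustration of a memoization-aliasing bug; the class and
# method names are invented for this sketch.
class Index
  def initialize
    @cache = {}
  end

  # Buggy: returns the cached array itself, so callers share (and can
  # mutate) the memoized state.
  def search_unsafe(key)
    @cache[key] ||= [3, 2, 1]
  end

  # Fixed: returns a defensive copy, like the `.dup` in the diff above.
  def search_safe(key)
    (@cache[key] ||= [3, 2, 1]).dup
  end
end

index = Index.new
index.search_unsafe(:a).pop          # caller mutates the shared array...
corrupted = index.search_unsafe(:a)  # ...and the cache now yields [3, 2]

index.search_safe(:b).pop            # mutating the returned copy...
intact = index.search_safe(:b)       # ...leaves the cache untouched
```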
Not sure what that change means - was the fixture (TestIndex) bad?
Doesn't look like it picks the optimal version of nginx, definitely not what the test expects to happen. What forces it to 0.2.0?
The test passed with that diff
I pulled the latest master and the test is still failing for me; I do not know why it passes for you. I tried switching the test index to the ordering specified by BundlerIndex -- is the result any different?
Please merge the PR that I sent to your fork
Hi, I'm a Berkshelf user facing an issue where a resolution fails due to a constraint that does not actually exist in the solver input. For the background, please see https://github.com/berkshelf/solve/issues/62. The rest is Molinillo-specific investigation.
This is the solver run; you can see that the first version tried is homebrew-2.1.2, and just before the swap is performed the payload has a single dependency, build-essential >= 2.1.2:
OK, stepping into fixup_swapped_children, we see that the successor (build-essential) is not removed, since other cookbooks depend on it:
After the swap, you can see that the newly selected possibility has no dependencies:
but the requirement introduced by the possibility that was swapped out is not removed (fixup_swapped_children does not do it, since the successor is not an orphan, and I don't see related logic anywhere else...):
so that (phantom) requirement for build-essential 2.1.2 sticks around; eventually a (real) conflicting requirement is introduced, and resolution fails:
I don't understand Molinillo's logic well enough to know what the correct solution is. Presumably, requirements that are no longer needed (present in the possibility that was swapped out, but absent from the new one) need to be removed; but how to find the right requirements to remove, and whether a new state needs to be pushed or rewound as a result of the changed requirements, is unclear to me.
This causes all kinds of weird behavior up the stack: issues that get "resolved" by adding version pins or removing pins, placebo fixes that do not actually address what seems to be the real problem at this level. Any help would be greatly appreciated.
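The removal step I have in mind can be sketched as follows (a standalone illustration assuming requirements compare by name and constraint; `Req` is a hypothetical stand-in, not Molinillo's requirement class, and nothing here addresses the push-vs-rewind question):

```ruby
# Hypothetical requirement objects; Struct equality compares all members,
# so array difference works on them directly.
Req = Struct.new(:name, :constraint)

swapped_out = [Req.new('build-essential', '>= 2.1.2')] # old possibility's deps
swapped_in  = []                                       # new possibility declares none

# Requirements present in the swapped-out possibility but absent from the
# swapped-in one are stale and should be dropped from the resolver state.
stale = swapped_out - swapped_in
```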