ufukguenes closed this issue 1 year ago.
Hi @ufukguenes, I am currently on vacation and will have a look after my return next week.
Maybe @Osburg can help in the meantime...
Have you tried to find out which constraint is responsible for this behaviour by commenting some of them out?
Hi @ufukguenes,
from the DoE perspective there is no specific reason why we need unique samples. However, `PolytopeSampler` seems to think this is important, which also makes sense insofar as the user does not expect to get the same samples over and over again when calling `ask()`.
Your example does not throw an error on my machine if I remove `max_3_constraint`. Otherwise the sample set consists of 19 copies of the point [1., 0., 0., 0., 0., 0., 0., 1., 0., 1.]. I haven't dived deep into the implementation of `PolytopeSampler`, so I do not know why this exact combination of constraints causes problems for the polytope sampler. But it should be safe to remove `max_3_constraint`, since it is uninformative anyway, right? (`max_3_constraint` is automatically fulfilled if the remaining three mixture constraints are fulfilled, if I see this correctly.)
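To spell out the redundancy with a quick plain-Python sanity check (the point `x` below is an arbitrary choice, picked only so that it satisfies the three mixture constraints):

```python
# Arbitrary example point satisfying the three mixture (equality) constraints:
x = [0.2, 0.2, 0.2, 0.2, 0.1, 0.1,  # features 1-6 sum to 1
     0.5, 0.5,                      # features 7-8 sum to 1
     0.3, 0.7]                      # features 9-10 sum to 1

# Summing the three equalities gives sum(x) == 1 + 1 + 1 == 3, so the
# inequality sum(x) <= 3 (max_3_constraint) always holds with equality
# and carries no extra information.
assert abs(sum(x) - 3.0) < 1e-12
```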
cheers
Aaron :)
Alternatively, relaxing `max_3_constraint` to $\sum_i \mathrm{feature}_i \leq 3.01$ also does the job.
P.S.: Should these element names in your example be accessible to the public? :D
Hey @Osburg,
> P.S.: Should these element names in your example be accessible to the public? :D
Thanks for bringing this to my attention. I edited the comment and deleted the old version.
> But it should be safe to remove `max_3_constraint` since it is uninformative anyways, right?
Yes, it is uninformative. This is a sub-problem of another problem I was trying to solve, and there `max_3_constraint` is necessary; however, in the other problem I don't get the error. So maybe it has something to do with redundant information.
Regarding the issue, do you think it would be a good idea to give a warning instead of an error?
Hmm, good question. The comments where the error is raised read to me as if this error was built in specifically for the case we are facing here... So maybe we wait for @jduerholt or @evo-bschiller to give an opinion; probably they had good reasons for raising an error when they implemented `PolytopeSampler`?
Alternatively, if we keep raising an error in polytope sampler we could apply the doe-internal fallback to uniform sampling also to cases where polytope sampling throws an error. What do you think about this?
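Roughly something like this (just a sketch to illustrate the idea; `sample_with_fallback` and the two callables are made-up names, not the actual bofire API):

```python
def sample_with_fallback(n, polytope_sample, uniform_sample):
    """Try polytope sampling first; fall back to uniform sampling on failure.

    Illustrative sketch only: `polytope_sample` and `uniform_sample` are
    hypothetical callables standing in for the two sampling strategies.
    """
    try:
        return polytope_sample(n)
    except ValueError:
        # e.g. "ValueError: Generated candidates are not unique!"
        return uniform_sample(n)


# Minimal demonstration with stub samplers:
def failing_polytope(n):
    raise ValueError("Generated candidates are not unique!")

def uniform(n):
    return [[0.5] * 10 for _ in range(n)]

assert len(sample_with_fallback(3, failing_polytope, uniform)) == 3
```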
Yes, let's wait. Using the fallback method also seems like a valid idea. This however depends on what the original intention of the error was.
Hi all,
I am back from vacation and had another look at the problem. @Osburg, thanks for already investigating it and identifying the issue with the redundant information due to `max_3_constraint`.
Under the hood we are using the polytope sampler from botorch. I rebuilt the example directly with it and, as expected, we get the same issue of always obtaining the same sample:
```python
import torch
from botorch.utils.sampling import get_polytope_samples

get_polytope_samples(
    n=5,
    bounds=torch.tensor([[0.0] * 10, [1.0] * 10]),
    equality_constraints=[
        # features 1-6 sum to 1
        (torch.arange(6, dtype=torch.long), torch.ones(6), 1.0),
        # features 7-8 sum to 1
        (torch.tensor([6, 7], dtype=torch.long), torch.ones(2), 1.0),
        # features 9-10 sum to 1
        (torch.tensor([8, 9], dtype=torch.long), torch.ones(2), 1.0),
    ],
    inequality_constraints=[
        # redundant max_3_constraint, encoded as -sum(x) >= -3, i.e. sum(x) <= 3
        (torch.arange(10, dtype=torch.long), -torch.ones(10), -3.0),
    ],
)
```
So it seems to be a problem with the sampler in botorch, arising when redundant information is provided. I will create an issue there.
As @Osburg already mentioned, the fix would be to just remove the last constraint.
Looking at the error, it appears that I had seen this behavior before and wanted to catch it this way, because doing uniform random sampling from a polytope that yields only a single distinct sample makes no sense to me. But for me it is fine to relax the error to a warning.
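If we go that route, the change could look roughly like this (a sketch, not the actual `PolytopeSampler` source; `check_candidates` is a made-up helper name):

```python
import warnings

def check_candidates(candidates):
    """Warn instead of raising when sampled candidates are not unique.

    Illustrative only: `candidates` is a list of sample points (lists or
    tuples); this is not the real bofire implementation.
    """
    if len({tuple(c) for c in candidates}) < len(candidates):
        warnings.warn("Generated candidates are not unique!")
    return candidates
```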
Best,
Johannes
I am using the `PolytopeSampler` to find an initial guess to start the optimization with `find_local_max_ipopt`. However, I receive an error: "ValueError: Generated candidates are not unique!". I have two questions regarding this problem. Why do the samples have to be unique in the first place? The `PolytopeSampler` has a fallback sampling method; shouldn't the fallback method be called if the `PolytopeSampler` fails, and why isn't it?
@evo-bschiller and @jduerholt can one of you explain anything regarding this issue?
This code should replicate the error.