nlw0 opened 7 years ago
This is a known limitation of ScalaCheck: shrinking will not always respect the generator bounds.
You can add a precondition to your property with ScalaCheck's implication operator (==>) to avoid this if you like. That would also make it clear that your implementation has a precondition that needs to be satisfied before using it.
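To see why the implication operator helps, here is a dependency-free sketch of what ==> does (a toy emulation, not ScalaCheck's actual implementation): when the precondition is false, the test case is discarded rather than counted as a failure, so out-of-bounds inputs, including shrunk ones, are skipped instead of reported.

```scala
// Toy emulation of ScalaCheck's implication (==>): a property result is
// Passed, Failed, or Discarded (precondition not met).
sealed trait Result
case object Passed extends Result
case object Failed extends Result
case object Discarded extends Result

implicit class Implication(pre: Boolean) {
  // body is by-name, so it is never evaluated for discarded cases
  def ==>(body: => Boolean): Result =
    if (!pre) Discarded
    else if (body) Passed
    else Failed
}

// A shrunk value of 0 is discarded instead of crashing the property body:
val onZero = (0 != 0) ==> (100 / 0 > 0)
val onFive = (5 != 0) ==> (100 / 5 == 20)
```

The key design point is that a discarded case does not make the property fail, which is exactly the behavior you want when shrinking wanders outside the generator's bounds.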
Thanks. I ended up writing a Shrink and "specializing" my Ints using a case class (because I don't know how to use stuff like refined types), but I am definitely interested in learning how to use the library better and trying something like that. I don't mean to be a whiny user, and shrinking actually helped me find a bug in my code later.

I understand this is difficult to implement, and I've been thinking about the whole problem of generating the data while supporting the constraints. I noticed the whole process, both generating and filtering, could be implemented as a kind of Markov-chain Monte Carlo, where samples are generated by transforming other valid samples instead of always filtering something much more general. You would have a "Mutate" like a "Shrink", and keep applying transforms that are likely to stay within the constraints, until you stop and produce a new sample. That might make sampling more efficient in some cases. Have you ever considered something like that?
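The mutation idea above can be sketched in a few lines (hypothetical code, not an existing ScalaCheck API): instead of generate-then-filter, each new sample is a small mutation of a known-valid one, and mutations that leave the constraint set are rejected, so the random walk never exits the valid region.

```scala
import scala.util.Random

// Sketch of a "Mutate"-style sampler: perturb a valid sample slightly,
// rejecting any candidate that violates the constraint.
def mutate(valid: Int, constraint: Int => Boolean, rng: Random): Int = {
  val candidate = valid + (rng.nextInt(7) - 3) // small step in [-3, 3]
  if (constraint(candidate)) candidate else valid // reject invalid steps
}

val rng = new Random(42)
val nonZero: Int => Boolean = _ != 0

// Walk from a valid seed; every visited sample satisfies the constraint
// by construction, with no global filtering pass needed.
val samples = Iterator.iterate(1)(v => mutate(v, nonZero, rng)).take(100).toList
```

Whether this beats filtering depends on how thin the valid region is: for sparse constraints, a walk that stays inside the region avoids the high discard rates that suchThat-style filtering suffers from.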
@rickynils Could you please tell me whether this will be fixed to always respect the generator bounds, or whether it is intended behavior and will remain as is?
This is a known limitation of ScalaCheck: shrinking will not always respect the generator bounds.
@Kevin-Lee I consider it something that should eventually be fixed, but it might require larger changes in ScalaCheck, and no one is currently working on this.
A suggested fix: https://github.com/rickynils/scalacheck/pull/440 (but it might be very controversial)
I ran into similar issues, and after struggling a little bit I found this thread. Using forAllNoShrink helped, and after the tests were running successfully I tried forAll again and the issue didn't show up anymore!!! This nondeterministic behavior is really annoying. I hope someone takes care of this, please.
I tried again with forAll and the issue didn't show up anymore!!!
Yes, it only disables the shrinking on failures. If the test passes, then there's no shrinking.
I am working on software for logic inference, where I am processing lists of non-zero values. My test looks like:
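The original test snippet did not survive here. As a rough, dependency-free stand-in (hypothetical code, not the author's), the shape of such a test is: generate lists whose elements are guaranteed non-zero, then check the invariant the code under test relies on. With ScalaCheck this would be a forAll over something like Gen.nonEmptyListOf(Gen.choose(-100, 100).suchThat(_ != 0)).

```scala
import scala.util.Random

// Hypothetical stand-in for the missing test: a micro property check
// over lists of non-zero Ints (the real test used ScalaCheck's forAll).
val rng = new Random(7)

def genNonZero(): Int = {
  val n = rng.nextInt(201) - 100 // in [-100, 100]
  if (n == 0) 1 else n           // never emit the invalid value 0
}

def genList(): List[Int] = List.fill(rng.nextInt(10) + 1)(genNonZero())

// The invariant the generator is supposed to guarantee:
val runs = (1 to 100).map(_ => genList())
```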
This always breaks for me, with ScalaCheck generating shrunk lists that contain the value 0, which is invalid. If I switch to forAllNoShrink, all goes well. I was very lost on that for a while; it was quite surprising to see the value 0 in my tests when I had been very careful to generate the data without it, and had even tried implementing that in different ways. I think shrinking is a great feature, but I got bitten hard this time... Shouldn't there be some more care to prevent something like that?