nick8325 / quickcheck

Automatic testing of Haskell programs.

`maxShrinks` in stdArgs is `maxBound` #224

Closed: NorfairKing closed this issue 3 years ago

NorfairKing commented 5 years ago

This means that, for a value with many possible shrinks, a test can take arbitrarily long once a counterexample is found, without ever displaying that counterexample.

Could we set this to 1000 or so?
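For concreteness, here is a minimal sketch of how a caller can cap shrinking per invocation today; `prop_example` is a property invented purely for illustration:

```haskell
import Test.QuickCheck

-- Invented property, used only for illustration.
prop_example :: [Int] -> Bool
prop_example xs = reverse (reverse xs) == xs

-- Cap shrinking at 1000 attempts instead of the default maxBound.
main :: IO ()
main = quickCheckWith stdArgs { maxShrinks = 1000 } prop_example
```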

nick8325 commented 5 years ago

Normally I would rather that QuickCheck spend the extra time shrinking, as human time (in decoding an unsimplified counterexample) is more valuable than computer time. Also, some features (InfiniteList, Test.QuickCheck.Function) depend on shrinking running to completion, so I don't think we should cut off shrinking without the user's say-so.

Note that there is now verboseShrinking to see counterexamples during shrinking. This can be used to debug shrinking loops.
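For example (the failing property below is invented just so that shrinking has something to do):

```haskell
import Test.QuickCheck

-- Invented failing property, only there to make shrinking visible.
prop_palindrome :: [Int] -> Bool
prop_palindrome xs = xs == reverse xs

-- verboseShrinking prints each counterexample tried while shrinking,
-- which makes shrink loops and blowups easier to spot.
main :: IO ()
main = quickCheck (verboseShrinking prop_palindrome)
```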

NorfairKing commented 5 years ago

@nick8325

Normally I would rather that QuickCheck spend the extra time shrinking

Even if that involves hours of shrinking? As far as I can tell, the number of possible shrinks grows exponentially in the recursive size of your data type, and then exponentially again in the size of the data value. I'm arguing that there is some point beyond which we'd rather have some output than just wait for the example to be shrunk.

^ I could be implementing my shrinking functions incorrectly, but shrinking a Forest Double gets very expensive very quickly.
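To make that concrete, here is a hedged sketch of the kind of setup where the slowdown shows up. The property is invented for illustration and relies on the orphan `Arbitrary` instances from the quickcheck-instances package:

```haskell
import Data.Tree (Forest)
import Test.QuickCheck
import Test.QuickCheck.Instances ()  -- orphan Arbitrary instance for Tree

-- Deliberately failing property over Forest Double, so that shrinking runs.
-- Foldable's length counts the nodes of each tree.
prop_smallForest :: Forest Double -> Bool
prop_smallForest f = sum (map length f) < 3

main :: IO ()
main = quickCheck prop_smallForest
```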

Side-note. After https://github.com/hspec/hspec/pull/381, I'm happy with setting this option manually whenever necessary. Feel free to close this as "WontFix" or leave it open for discussion, whichever you prefer.

nick8325 commented 5 years ago

Well, there shouldn't be any exponential blowup if shrinking is implemented correctly, although the shrinking for Double perhaps tries too hard. Do you have an example I could look at?

NorfairKing commented 5 years ago

Well, there shouldn't be any exponential blowup if shrinking is implemented correctly,

I respectfully disagree. This instance, for example, works as intended:

https://hackage.haskell.org/package/quickcheck-instances-0.3.19/docs/src/Test.QuickCheck.Instances.Containers.html#line-44
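For readers who don't follow the link: the shrinking there follows the usual recipe for recursive types. A simplified sketch (not the exact linked code) looks like this:

```haskell
import Data.Tree (Tree (..))
import Test.QuickCheck

-- Simplified sketch of the standard recipe: offer each immediate subtree,
-- then shrink the label and the list of children componentwise.
shrinkTree :: Arbitrary a => Tree a -> [Tree a]
shrinkTree (Node x ts) =
     ts                                              -- replace the node by a subtree
  ++ [Node x' ts | x' <- shrink x]                   -- shrink the label
  ++ [Node x ts' | ts' <- shrinkList shrinkTree ts]  -- shrink the children
```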

nick8325 commented 5 years ago

Yes, that instance looks good to me. Indeed, when I try Forest Double, the number of shrink candidates is roughly 40 × the number of nodes, regardless of size, so there is no exponential blowup. The problem here is the shrinking for Double, which, as I said, perhaps tries too hard.
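One way to reproduce this kind of measurement (a sketch, not necessarily how it was measured here; it relies on quickcheck-instances for the Tree instance):

```haskell
import Data.Tree (Forest)
import Test.QuickCheck
import Test.QuickCheck.Instances ()  -- Arbitrary instance for Tree

-- Generate one random Forest Double and compare its node count with the
-- number of immediate shrink candidates it produces.
main :: IO ()
main = do
  f <- generate (arbitrary :: Gen (Forest Double))
  let nodes      = sum (map length f)  -- Foldable length counts nodes
      candidates = length (shrink f)
  putStrLn ("nodes: " ++ show nodes ++ ", shrink candidates: " ++ show candidates)
```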

NorfairKing commented 5 years ago

@nick8325 If QuickCheck assumes any particular semantics for shrink, could we make sure they are documented? For example, does QuickCheck assume that the number of shrink candidates is constant in the size of the value being shrunk? (That is what it sounds like, going by your examples for Double and Int.)

nick8325 commented 5 years ago

I don't think that QuickCheck assumes anything in particular about shrink, except that shrinking should eventually terminate. You are free to write a shrink function where the number of alternatives grows e.g. quadratically in the size of the test case, but then of course you may have to wait a while for shrinking to finish... The predefined implementations of shrink try to have only a linear number of shrink candidates, though.
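To illustrate the contrast in candidate counts, a hedged sketch; `explosiveShrink` is an invented name, not anything in QuickCheck:

```haskell
import Data.List (subsequences)
import Test.QuickCheck

-- Roughly what the predefined list instance does: candidates come from
-- removing chunks of elements and shrinking individual elements, so the
-- count stays close to linear in the list length.
linearShrink :: Arbitrary a => [a] -> [[a]]
linearShrink = shrinkList shrink

-- An invented shrink with exponentially many candidates: every proper
-- subsequence. It still terminates, but QuickCheck may churn for a long
-- time before reporting anything.
explosiveShrink :: [a] -> [[a]]
explosiveShrink = init . subsequences
```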

NorfairKing commented 3 years ago

Closing this as wontfix