Closed: jcbhmr closed this issue 11 months ago
Yes, because environments like Vitest want to handle the error themselves instead of us throwing. We could add an option called `throws`; that'd also be good.
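A rough sketch of what such an opt-in `throws` flag could look like. This is illustrative only: the `Bench` class, the option name, and the `result` shape here are assumptions modeled on this discussion, not the library's shipped API.

```javascript
// Minimal sketch of a benchmark runner with an opt-in `throws` flag.
// By default, errors are recorded on the task result so callers
// (e.g. Vitest) can report them; with `throws: true` the first
// failing task rethrows and fails loudly.
class Bench {
  constructor({ throws = false } = {}) {
    this.throws = throws;
    this.tasks = [];
  }

  add(name, fn) {
    this.tasks.push({ name, fn, result: null });
    return this;
  }

  async run() {
    for (const task of this.tasks) {
      try {
        await task.fn();
        task.result = { error: undefined, samples: 1 };
      } catch (error) {
        // Record the error instead of swallowing it silently.
        task.result = { error, samples: 0 };
        if (this.throws) throw error;
      }
    }
    return this.tasks;
  }
}
```

With `throws: false` a failing task ends up with `samples: 0` and the error attached to its result; with `throws: true` the failure propagates out of `run()`.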
check #51
I've been dealing with this the last couple of days. If you look at bench.tasks (for instance, when bench.table() is being called) you can see that some of your tests ran 0 times, and the result contains the error it encountered.
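The check described above can be sketched like this. The task/result shape (`result.samples`, `result.error`) mirrors what this comment describes; treat the exact field names as assumptions rather than the library's documented API.

```javascript
// Scan benchmark tasks for ones that ran 0 times and surface the
// error they encountered, instead of silently printing a zero row
// in the results table. The task shape here is an assumption
// based on this thread.
function reportFailedTasks(tasks) {
  const failed = tasks.filter(
    (t) => t.result && t.result.samples === 0 && t.result.error
  );
  for (const t of failed) {
    console.warn(`Task "${t.name}" ran 0 times:`, t.result.error);
  }
  return failed;
}
```

Running a check like this before calling `bench.table()` would make the hidden failures visible instead of leaving suspiciously empty rows.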
It would be better for everyone other than us maintainers if the benchmarks told you when they're blowing up.
I fixed this in the PR and I'll try to merge soon, sorry for the delay!
The extra checks in table() might fix a problem I'm having. cool cool
Happy to hear that @jdmarshall
I was scratching my head for a good three hours until I did this:
Scenario:
Is there a reason that, right now, throwing inside benchmarks doesn't cause a catastrophic failure? There must be... right? 😅
I think that it would be a good idea to make it so that either:
Some kind of user feedback that the superfast speeds they're seeing are actually an error is a good idea. 😊