nestorsalceda / mamba

The definitive testing tool for Python. Born under the banner of Behavior Driven Development (BDD).
http://nestorsalceda.github.io/mamba
MIT License

Metaprogramming Mamba? #111

Open mstrauss opened 6 years ago

mstrauss commented 6 years ago

Is it somehow possible to DRY up one's specs and generate dynamic examples, using an approach like the one shown in https://chris-lamb.co.uk/posts/generating-dynamic-python-tests-using-metaclasses for unittest? Or is there an even simpler way to do it? If there is not, there really should be one :-)
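
For reference, here is a minimal sketch of the metaclass technique that post describes, written against plain unittest rather than mamba; the class names and case data below are invented for illustration:

import unittest

class DynamicTestsMeta(type):
    # Generate one test method per (value, expected) pair at class-creation time,
    # so each case is reported individually by the test runner.
    def __new__(mcs, name, bases, namespace):
        for value, expected in namespace.get("cases", []):
            def make_test(value=value, expected=expected):
                def test(self):
                    self.assertEqual(value * 2, expected)
                return test
            namespace["test_doubles_{0}".format(value)] = make_test()
        return super().__new__(mcs, name, bases, namespace)

class DoublingTests(unittest.TestCase, metaclass=DynamicTestsMeta):
    cases = [(1, 2), (2, 4), (3, 6)]

if __name__ == "__main__":
    unittest.main()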

nestorsalceda commented 6 years ago

Hey!

I'm not sure about this; I perform several AST transformations before executing any code. Anyway, I should think a bit more about parametrized tests, because more people are asking for them.

Thanks!

nielsbuus commented 5 years ago

Discovered this interesting project today and doing a spike. Right now I loop inside my `it` and do multiple expects. However, the failure message is "one of these failed" rather than "that one failed". This can be sort of remedied by implementing your own assertion function, like this:

def crummy_equal(left, right, error):
    if left != right:
        raise RuntimeError(error)

And in the spec

 with it("returns true for all validators"):
     for validator in some_validators:
        crummy_equal(validator.is_valid(), True, "Validation did not pass for {0} validator.".format(validator.name)

Now when you run the specs in documentation mode, you get something like

✗ returns true for all validators
  Validation did not pass for uniqueness validator.

In addition to being my own naive implementation, this code also suffers from the fact that mamba stops on the first error: if all my imaginary validators are broken, I'll have to fix them one by one instead of getting a pretty list with crosses and checkmarks for each validator.
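
One rough workaround sketch, reusing the hypothetical crummy_equal and some_validators from above: collect all failing validators first, then fail once with a message that lists them. It still shows up as a single failing example rather than per-validator checkmarks, but at least every broken validator is reported in one run.

with it("returns true for all validators"):
    failing = [v.name for v in some_validators if not v.is_valid()]
    crummy_equal(failing, [], "Validation did not pass for: {0}".format(", ".join(failing)))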

I hope @nestorsalceda comes up with something nice. Mamba feels so satisfyingly familiar when coming from RSpec.