svanoort / pyresttest

Python Rest Testing
Apache License 2.0

Enhancements to better compete with requests/unittest combination #153

Open svanoort opened 8 years ago

svanoort commented 8 years ago

Currently a lot of users are torn between PyRestTest (YAML) and the requests/unittest combination (pure Python) for their REST API testing needs.

Story: as a PyRestTest user, I would like features that make it easier to use PyRestTest from within a Python environment. To that end, I'd like to offer a Python builder API similar to frisby.js for constructing and running tests: something fluent in style that generates the same test/testset/testconfig objects as the existing YAML.
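A rough sketch of the shape this could take, as a minimal builder that just accumulates dicts shaped like the YAML test definitions (none of these class or method names exist today; this is purely illustrative):

```python
# Minimal sketch of a fluent builder, purely illustrative -- not an
# existing PyRestTest API. It accumulates plain dicts shaped roughly
# like the YAML test definitions.

class TestBuilder(object):
    def __init__(self, name):
        self.config = {'name': name, 'url': None, 'method': 'GET',
                       'validators': []}

    def get(self, url):
        self.config['url'] = url
        self.config['method'] = 'GET'
        return self

    def expect_status(self, code):
        self.config['expected_status'] = [code]
        return self

    def validate_json(self, path, expected):
        # Mirrors the YAML 'compare' validator with a jsonpath_mini extractor
        self.config['validators'].append(
            {'compare': {'jsonpath_mini': path, 'expected': expected}})
        return self

    def build(self):
        return {'test': self.config}


# Usage: builds a structure equivalent to a YAML test block
test = (TestBuilder('fetch person')
        .get('/api/v1/person/1/')
        .expect_status(200)
        .validate_json('first_name', 'Jane')
        .build())
```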

There are a couple of outstanding issues that build in this direction:

https://github.com/svanoort/pyresttest/issues/29 - better logging when invoking directly from Python

https://github.com/svanoort/pyresttest/issues/43 - provide xUnit-compatible output, preferably by wrapping unittest compatibility (a rough sketch of that wrapping idea follows this list).

https://github.com/svanoort/pyresttest/issues/146 - setup/teardown (as part of v2.0)
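For #43 in particular, the rough shape I'm picturing (just a sketch, not a committed design) is generating one unittest.TestCase per PyRestTest test, so that any xUnit-capable runner can report on it:

```python
# Sketch only: wrap individual checks in unittest so any xUnit-capable
# runner (e.g. pytest or unittest-xml-reporting) can emit standard reports.
# run_single_test is a hypothetical callable standing in for "execute one
# PyRestTest test and return (passed, message)".
import unittest


def make_case(test_name, run_single_test):
    def test_method(self):
        passed, message = run_single_test()
        self.assertTrue(passed, message)

    # Build a TestCase subclass with a single generated test method
    return type('PyRestTest_' + test_name, (unittest.TestCase,),
                {'test_' + test_name: test_method})


if __name__ == '__main__':
    # Trivial stand-in runner so the sketch is self-contained
    Case = make_case('example', lambda: (True, 'ok'))
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(Case)
    unittest.TextTestRunner(verbosity=2).run(suite)
```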

Current PyRestTest strengths:

Current PyRestTest weaknesses:

Thoughts on what is particularly helpful or unhelpful here for your uses? @nitrocode @lerrua @jewzaam (I'm aware you guys were using a legacy fork based off a pre-0.x version) @MorrisJobke @jeanCarloMachado

Edit: Scope limits

Unix philosophy: one tool for one task, not One Tool To Rule Them All™ - the goal is to keep PyRestTest focused on what it does well, while letting it grow.

nitrocode commented 8 years ago

Here are some suggestions. Some of these may overlap.

  1. Faster YAML parsing, or whatever is taking a while before the initial curl request starts.
  2. Stronger documentation and many more use cases.
  3. I'm having on-again, off-again issues with pycurl and SSL for some reason.
  4. It may be just me, but I'm having trouble wrapping my head around building extractors / validators / generators.
  5. Maybe a use case of requests / unittest2 verifying fields in Python, then converted to YAML, with benchmarks comparing the two, would be a good addition (see the sketch after this list).
  6. If pycurl issues continue, maybe add an option to switch from pycurl to requests.
  7. I agree about the logging issues. Perhaps output to TAP or xUnit, or integrate with py.test or something else that already has the capability to output standard formats.
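For 5, the requests / unittest side of the comparison I have in mind would look roughly like this (the URL and field names are just placeholders):

```python
# Plain requests + unittest field check, for comparison against the
# equivalent PyRestTest YAML validator. URL and fields are placeholders.
import unittest
import requests


class PersonApiTest(unittest.TestCase):
    BASE = 'http://localhost:8000'

    def test_first_name_field(self):
        resp = requests.get(self.BASE + '/api/v1/person/1/')
        self.assertEqual(resp.status_code, 200)
        self.assertEqual(resp.json()['first_name'], 'Jane')


if __name__ == '__main__':
    unittest.main()
```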

Thank you

svanoort commented 8 years ago

@nitrocode Thanks for your feedback!

  1. Yes, I think it may be worth doing a little benchmarking here (easy to add); a quick timing sketch is below.
  2. I'd love PRs to help build out documentation. Do you feel like the docs changes in https://github.com/svanoort/pyresttest/issues/100 and https://github.com/svanoort/pyresttest/issues/151 help?
  3. Ugh. This one is tricky, because I don't have a lot of familiarity with the different SSL implementations and their issues. I know there are some subtle problems around the SSL libs pycurl is built against (I don't have a ref handy, but it's in the issues if you look). Unfortunately, I don't have an easy way to replicate or debug these at this time (it may be worth building out the functional test harnesses, but that requires a lot of different certs & config for the Django test app).
  4. I was hoping the test flow diagram would help a little bit, but what would you like to see in advanced_guide? I know more examples will be helpful (part of the PRs in 2), but I'm not sure how to explain this better.
  5. Hmm. Will have to think about that one.
  6. Perhaps. I don't want to go too far down the path of comparison (it's kind of an apples-to-oranges comparison here); my main concern is to remove any significant regressions and see if there are optimizations to be made.
  7. Yeah, it's coming; I've kicked some of the dynamic features back again to slip this into the next release.
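For 1, even something crude like this would show whether the YAML parse or something else dominates startup (the file path is a placeholder):

```python
# Quick-and-dirty timing of the YAML parse step in isolation.
# 'my_tests.yaml' is a placeholder path.
import time
import yaml

start = time.time()
with open('my_tests.yaml') as handle:
    parsed = yaml.safe_load(handle)
print('YAML parse took %.3f s for %d top-level entries'
      % (time.time() - start, len(parsed)))
```
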
n9986 commented 8 years ago

I'd like it if this could support the following as a testing tool (not for benchmarks):

Suppose I have 4 filters (as GET params) on an API endpoint. I want all of them, and their combinations, tested with both their allowed values and values that are not allowed.

So if I had filter params:

max_age (integers), name (regex, string), country (regex, string), employed (boolean, true, false)

I want a way to generate all or some combinations of these from a provided list (return value checks are not necessary).

Maybe something like:

- test:
  - combinator: {name: 'filter', type: 'form', params: {max_age: [10, 20, 30, 40, 50], name: ['Doe', 'Jane']}}
  - url: '/api/v1/person.json?$filter'

# `type` can be json/xml/form, so it could also be used to generate test data for the request body
# This would in turn create test values for `$filter` like:
# max_age=10&name=Doe
# max_age=20&name=Doe
# ... and so on ...

This may not always be necessary, because the example above looks like something a test suite's DB layer should handle itself. But there are other areas where this can be quite helpful (params that have less to do with the database, are more about business logic, and may be affected by the values of the others). I looked hard in the documentation but couldn't find anything that can do this.
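For reference, what I am effectively generating by hand today is something like this (the parameter values are only example data):

```python
# Generate every query-string combination of the filter params.
# The parameter values here are example data only.
from itertools import product
from urllib.parse import urlencode

params = {
    'max_age': [10, 20, 30, 40, 50],
    'name': ['Doe', 'Jane'],
}

keys = sorted(params)
for values in product(*(params[key] for key in keys)):
    query = urlencode(dict(zip(keys, values)))
    print('/api/v1/person.json?' + query)
    # -> /api/v1/person.json?max_age=10&name=Doe ... and so on
```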

Also thank you for the work on this great tool!

svanoort commented 8 years ago

@n9986 Thank you for your feedback! I'll look into how something like this could be incorporated.

svanoort commented 8 years ago

Thought: perhaps the best way to approach this is as a piece-at-a-time utility from the Python side.
This is easy enough to build out: rather than trying to replace unittest + requests for pure-Python use, offer up the extensible components as ways to decorate and enhance existing test cases.

For example: validating response bodies/headers, extracting data from HTTP responses, doing templating/generation on request bodies, etc.
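As a very rough sketch, with entirely hypothetical helper names, of what "use the pieces from plain unittest" could feel like:

```python
# Hypothetical standalone helper -- extract_jsonpath does not exist in
# PyRestTest today; it stands in for a reusable extractor component.
import unittest
import requests


def extract_jsonpath(response, path):
    """Walk a dotted path into a JSON response body."""
    value = response.json()
    for key in path.split('.'):
        value = value[key]
    return value


class ExampleTest(unittest.TestCase):
    def test_person_name(self):
        resp = requests.get('http://localhost:8000/api/v1/person/1/')
        # Reuse the extractor piece without adopting the whole YAML workflow
        self.assertEqual(extract_jsonpath(resp, 'first_name'), 'Jane')
```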

This is complementary to the effort to provide unittest outputs and better python APIs for working with tests/testsets/etc.

jeanCarloMachado commented 8 years ago

Sorry for taking so long to answer; here it goes. I'm coming from a different context: only pyresttest runs on Python in my project. Anyway, there are some features that I would like to see in pyresttest.

svanoort commented 8 years ago

@jeanCarloMachado Thank you for your feedback, and feedback at any time is welcome. To give some idea on timelines, the current full roadmap runs about a year out, and I aim for a major release with a couple big changes + several smaller feature additions & bugfixes every 3-6 months.

Between this and the other feedback, I've bumped the priority of xUnit compatible output up several notches, and it is planned for the next release. The dynamic/flexible variable binding got bumped down in priority to allow for this.

Logging is getting some extra love as well with https://github.com/svanoort/pyresttest/pull/171 and I'll factor in ways to improve the signal-to-noise ratio here.

Config files: I've added https://github.com/svanoort/pyresttest/issues/177 to include this. It depends on several planned features in future releases, so it will be a while out, but it's factored into design planning now.

Parallel: yes, I agree this would improve speed a lot. It is a rather complex, multi-step process to get there, but here are the rough design plans. I'm open to ideas or examples if people have good samples that may help:
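As a crude illustration of the general direction only (emphatically not the actual design), independent groups of tests could be farmed out to worker threads:

```python
# Crude illustration of the general idea only -- not the planned design.
# Independent groups of tests run concurrently in worker threads.
from concurrent.futures import ThreadPoolExecutor


def run_group(name, urls):
    # Placeholder for "execute one independent group of tests"
    return name, len(urls)


groups = {
    'people': ['/api/v1/person/1/', '/api/v1/person/2/'],
    'orders': ['/api/v1/order/1/'],
}

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(run_group, name, urls)
               for name, urls in groups.items()]
    for future in futures:
        name, count = future.result()
        print('%s: ran %d tests' % (name, count))
```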

Happy wall-of-text! :-)

rgarg1 commented 7 years ago

It's been over a year since I saw any activity. @svanoort Are you still actively maintaining this?

Anjimeduri commented 6 years ago

@jeanCarloMachado

nitrocode commented 6 years ago

@rgarg1 The last commit to master was May 2016 and the last commit to refactor-execution was April 2017. I don't think this is maintained anymore.