Open vgosai86 opened 6 years ago
Totally makes sense. I came to this page while having the exact same example in mind.
I would like to see this as well... for example, when testing response times it would be good to be able to say:

- 1000 ms or more = failure
- 500–1000 ms = warning (tests pass, but are flagged at the end of a Newman / manual run for attention)
- 500 ms or less = pass
It would be a great way to track any dips in performance across APIs and enable developers to be proactive in improving performance before it hits the failure stage.
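The tiered thresholds above could be sketched as a small helper. Note this is purely illustrative: `classifyResponseTime` is a hypothetical function, and the 500 ms / 1000 ms cut-offs are just the example values from this thread, not anything Postman ships:

```javascript
// Hypothetical helper illustrating the fail / warn / pass tiers
// proposed in this thread; nothing like this exists in Postman today.
function classifyResponseTime(ms, warnAt = 500, failAt = 1000) {
  if (ms >= failAt) return "fail"; // 1000 ms or more = failure
  if (ms > warnAt) return "warn";  // 500–1000 ms = warning
  return "pass";                   // 500 ms or less = pass
}

console.log(classifyResponseTime(250));  // "pass"
console.log(classifyResponseTime(750));  // "warn"
console.log(classifyResponseTime(1200)); // "fail"
```

A reporter could then surface the "warn" bucket separately at the end of a run, without touching the pass/fail counts that downstream tooling depends on.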
This issue has been open for a while. A similar one was filed as duplicate here https://github.com/postmanlabs/postman-app-support/issues/5915 and was closed as wont-fix.
Like I mentioned in the other issue, this is a genuine requirement, but using test status to achieve it can be ... dicey. There are too many things involved in the test-status toolchain, and changing it could break a number of reporting systems.
One idea that we are working on, and we would need your feedback on it, is the ability to show console.log, console.warn, and console.error calls that were made while running tests. Say you do console.warn("response is slow, but not that slow!") inside a pm.test() block ... then beside the assertion, we show a tiny icon indicating that there were warnings associated with that assertion. Nevertheless, this solution is a bit technically challenging to implement given our present system and other related factors... :fingers-crossed:
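A rough sketch of what that might look like in a test script. The `pm` object below is a minimal stub with a fake response time so the snippet runs outside Postman; inside Postman, `pm` is provided for you and only the `pm.test` block itself would be written:

```javascript
// Minimal stub of Postman's pm object so this sketch runs in plain Node.
// Inside Postman the real pm (with chai-style expect) is provided for you.
const pm = {
  response: { responseTime: 750 }, // pretend the request took 750 ms
  test: (name, fn) => { fn(); console.log("PASS:", name); },
  expect: (actual) => ({
    to: { be: { below: (limit) => {
      if (!(actual < limit)) throw new Error(`${actual} is not below ${limit}`);
    } } },
  }),
};

pm.test("response time is acceptable", () => {
  // Hard failure only above 1000 ms ...
  pm.expect(pm.response.responseTime).to.be.below(1000);
  // ... but emit the proposed warning for the 500–1000 ms band.
  if (pm.response.responseTime > 500) {
    console.warn("response is slow, but not that slow!");
  }
});
```

The idea under discussion is that the console.warn would then be surfaced next to this assertion in the UI, rather than being buried in the console.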
I think your suggestion of console.warn would work for my requirements, with a little additional effort in our build solution. It would just mean adding a bit more code than would have been ideal in Postman, as we try to keep tests as 'clean' as possible. That said, the majority of the time we run tests they're automated with Newman and the .xml results are parsed via JUnit, so if the console.warn output is displayed in the .xml results file, I'm sure I could find a way to 'alert' via our build plan. My only concern is that logs like this tend to be largely ignored except for debugging, so there is a chance that for the majority of users these console.warn logs would simply not be noticed.
@dantench ... +1 on the fact that logs get ignored. Let us think more on this. Copying @vkaegis .
Essentially, the problem statement is: annotate a test's results with warnings without changing its pass/fail status.
One solution we discussed is associating console logs from inside tests with the test output itself. The problem is the "perception" of console.logs. We need to think of more solutions that play well with the whole toolchain and achieve the purpose of "annotating" test specifications.
Is this working now? I tried to add a console.warn inside a pm.test() and ran it using Newman, but it didn't display the warning. Below is the command I used, in case anyone can help:
H:\>newman run Postman/XYZ-Integration.postman_collection.json -e Postman/XYZ-Integration-Test.postman_environment.json --reporters html --reporter-html-export H:/postman/report.html
@shamasis
Is there any movement on this? Resolving this would give more accurate results for the volume of requests we're using Postman for. My use case: I would rather not show a failure for trailing whitespace on an address field, or for the use of the abbreviation st. compared to the word street.
Thanks
I also came here with the exact same example of response time in mind. Any chance this may be re-evaluated? It would be a great feature. @vkaegis
We would also like to have the ability to set a threshold response time that triggers a warning, instead of an outright fail.
It'd be good to have a 'Warning' test status along with Passed/Failed for tests which may be less important. For example, I want to throw a warning, not a failure, if the API response time is more than 5s.