postmanlabs / newman

Newman is a command-line collection runner for Postman
https://www.postman.com
Apache License 2.0
6.88k stars 1.17k forks

Running 100000 iterations of a single POST call seems to break the reporter #935

Open jbnok opened 7 years ago

jbnok commented 7 years ago

Version and environment information:

  1. Newman Version (can be found via newman -v): 3.4.3
  2. OS details (type, version, and architecture): RHEL Linux
  3. Are you using Newman as a library, or via the CLI? CLI
  4. Did you encounter this recently, or has this bug always been there: Recently
  5. Expected behaviour: I should get a test report like I did when I did this with 10000
  6. Command / script used to run Newman: _(This is from a Jenkins Execution) /usr/local/lib/node_modules/newman/bin/newman.js run ${COLLECTION_NAME} -e ${ENVIRONMENTNAME} -d serverInfo.csv -n ${COUNT} -r html,cli,json,junit --insecure
  7. Sample collection, and auxiliary files (minus the sensitive details): All I can give you is the exception I see at the end of the console log:
    
    /usr/local/lib/node_modules/newman/lib/reporters/json/index.js:17
            content: JSON.stringify(_.omit(o.summary, 'exports'), 0, 2)
                          ^

    RangeError: Invalid string length
        at JSON.stringify (<anonymous>)
        at EventEmitter.<anonymous> (/usr/local/lib/node_modules/newman/lib/reporters/json/index.js:17:27)
        at EventEmitter.emit (/usr/local/lib/node_modules/newman/node_modules/eventemitter3/index.js:151:33)
        at done (/usr/local/lib/node_modules/newman/lib/run/index.js:199:29)
        at /usr/local/lib/node_modules/newman/node_modules/postman-runtime/lib/backpack/index.js:56:34
        at PostmanCollectionRun._process (/usr/local/lib/node_modules/newman/node_modules/postman-runtime/lib/runner/run.js:112:20)
        at PostmanCollectionRun.<anonymous> (/usr/local/lib/node_modules/newman/node_modules/postman-runtime/lib/runner/run.js:123:18)
        at ontimeout (timers.js:365:14)
        at tryOnTimeout (timers.js:237:5)
        at Timer.listOnTimeout (timers.js:207:5)


**_Steps to reproduce the problem:_**

1. Run a newman job with a CSV file and 100K instances of a single call

czardoz commented 7 years ago

Hmm, this happens because the output report contains too much data. This can only be solved by creating a streaming JSON reporter, using something like https://www.npmjs.com/package/stream-json

jbnok commented 7 years ago

This is something anyone who is going to performance test an API is going to try... It should be supported out of the box (especially when we're not using Newman programmatically, which is a requirement for the configuration you're suggesting).

This should and must be supported out of the box.

czardoz commented 7 years ago

@jbnok, the current reporters are not designed for load testing scenarios, but I can see why this would be useful :+1:

Paramagnetic commented 7 years ago

I wrote a reporter that basically just cuts out a bunch of stuff. It cut what was a 92.5 MB report (in the standard JSON reporter) down to 315 KB, while keeping what I consider the important data. It still uses JSON.stringify rather than a streaming approach, but I haven't run into the Invalid String Length issue since switching to it. Hopefully this is useful to other people as well: https://www.npmjs.com/package/newman-reporter-json-light
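The trimming approach can be sketched roughly like this (illustrative only, not the actual newman-reporter-json-light source; the field names only loosely follow Newman's summary shape): keep the handful of fields needed to analyse a run and drop the heavyweight request/response bodies before stringifying.

```javascript
// Reduce one execution record to a small, stringify-friendly object:
// item name, status line, timing, and any failed assertions.
function trimExecution(execution) {
  const response = execution.response || {};
  return {
    item: execution.item && execution.item.name,
    code: response.code,
    status: response.status,
    responseTime: response.responseTime,
    // Keep only assertions that actually failed.
    failures: (execution.assertions || [])
      .filter((a) => a.error)
      .map((a) => ({ assertion: a.assertion, message: a.error.message })),
  };
}
```

Since the response body is usually the bulk of each record, dropping it is where most of the ~300x size reduction would come from.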

awaisabir commented 7 years ago

Was running into the same issue. I'll try your reporter out @Paramagnetic !

sandello-alkr commented 7 years ago

Same issue here... It would be useful to be able to filter out successful runs.

xiaweixing commented 6 years ago

@Paramagnetic Recently we hit this same issue, and considering more people are experiencing it, have you considered creating a pull request to submit your solution to the postman repo? Thank you.

shamasis commented 6 years ago

This should be resolved as we add reporter flags to trim request and response payloads. But I really like the light JSON reporter. Will recommend your reporter meanwhile. @Paramagnetic

Michal-Kolomanski commented 5 years ago

I have installed @Paramagnetic's json-light reporter, but how am I supposed to use it? I placed it in newman/lib/reporters and tried `newman run collection.json --globals globals.json -r newman-reporter-json-light` in the bash console, but a "please install reporter using npm" message appears after hitting Enter.

Edit: Okay, after installing json-light globally it's possible to run a collection. But after the run finishes, an error appears:

    collectionFolder.items.members.forEach(function(folderItem) {
    TypeError: Cannot read property 'members' of undefined

Moreover, this doesn't solve the problem of the node process accumulating data. Unfortunately, the node process's memory usage still keeps increasing (until "JavaScript heap out of memory").

I would be delighted if there were an option to NOT store any data and just keep sending an infinite number of queries.

Michal-Kolomanski commented 5 years ago

> This should be resolved as we add reporter flags to trim request and response payloads. But I really like the light JSON reporter. Will recommend your reporter meanwhile. @Paramagnetic

Hello @shamasis, is there any status update? Is it possible to run plenty of requests in a single run (a limitless run)?

cainzhong commented 3 years ago

I use big-json to generate the summary.json instead of calling JSON.stringify(). It works. The code looks like this:

    const fs = require('fs')
    const bigJson = require('big-json')

    function saveBigObjectAsFile(bigObject, filePath) {
        const stringifyStream = bigJson.createStringifyStream({
            body: bigObject
        })
        const writeStream = fs.createWriteStream(filePath)
        return new Promise((resolve, reject) => {
            stringifyStream.on('error', reject)
            writeStream.on('error', reject)
            // Resolve only once the file has been fully flushed to disk
            writeStream.on('finish', () => {
                console.log('All data has been saved to %s', filePath)
                resolve()
            })
            stringifyStream.pipe(writeStream)
        })
    }
vbakshi123-star commented 2 years ago

> This should be resolved as we add reporter flags to trim request and response payloads. But I really like the light JSON reporter. Will recommend your reporter meanwhile. @Paramagnetic

This is still an issue when running in Azure DevOps: newman version 5.3.2, with only 2500 iterations.