ianonymousdev opened this issue 4 years ago
I managed to get around this issue myself by writing custom JavaScript to load the JSON payload instead of reading it from a CSV file. I used `config.processor` along with the `beforeScenario` hook to define my custom logic.
For anyone who may be facing a similar problem, here is my solution:
```yaml
config:
  target: "https://api-ap-southeast-2.aws.my-domain.com"
  processor: "./post-body.js"
  # The following phases ramp from 10 tps to 50 tps over 1 minute,
  # then step up in 50 tps increments every 30 seconds until reaching 1000 tps
  phases:
    - duration: 60
      arrivalRate: 10
      rampTo: 50
    - duration: 30
      arrivalRate: 50
    - duration: 30
      arrivalRate: 100
    - duration: 30
      arrivalRate: 150
    - duration: 30
      arrivalRate: 200
    - duration: 30
      arrivalRate: 250
    - duration: 30
      arrivalRate: 300
    - duration: 30
      arrivalRate: 350
    - duration: 30
      arrivalRate: 400
    - duration: 30
      arrivalRate: 450
    - duration: 30
      arrivalRate: 500
    - duration: 30
      arrivalRate: 550
    - duration: 30
      arrivalRate: 600
    - duration: 30
      arrivalRate: 650
    - duration: 30
      arrivalRate: 700
    - duration: 30
      arrivalRate: 750
    - duration: 30
      arrivalRate: 800
    - duration: 30
      arrivalRate: 850
    - duration: 30
      arrivalRate: 900
    - duration: 30
      arrivalRate: 950
    - duration: 270
      arrivalRate: 1000
  defaults:
    headers:
      x-api-key: "fake-x-api-key"
      Content-Type: "application/json"
  plugins:
    cloudwatch:
      namespace: "my-service-name"
    influxdb:
      testName: "my-service Load Test Results"
      influx:
        host: "fake-ip-address"
        username: "fake-username"
        password: "fake-password"
        database: "influx"
scenarios:
  - name: my-service-name load test with varying load
    beforeScenario: generatePostBody
    flow:
      - post:
          url: "/my-fake-endpoint"
          json: "{{ data }}"
```
The following post-body.js contains my custom JS logic. I have introduced a new text file, post-data.txt, which essentially replaces the CSV file I mentioned in the question; it hosts thousands of rows where each row is a request payload as JSON. Every time a scenario is executed, a random JSON payload string is picked, converted to a JSON object, and sent as the body of the POST request. I am also using CloudWatch and InfluxDB to output the results.
```js
const fs = require("fs");
// Assuming the csv-parse package (v4 API), which exposes a parser factory and
// accepts a `delimiter` option; the original snippet was missing this require
const csv = require("csv-parse");

const filePath = "./post-data.txt";
let postData;

/**
 * Generates the post body for a virtual user
 */
const generatePostBody = async (userContext, events, done) => {
  try {
    // Lazily (re)load the payload rows if they are not in memory yet
    if (postData === undefined || postData === null || postData.length === 0) {
      postData = await loadDataFromTxt(filePath);
    }
    // Pick a random row and add it to the virtual user's context
    const postBodyStr = postData[Math.floor(Math.random() * postData.length)];
    userContext.vars.data = JSON.parse(postBodyStr);
    // Continue with executing the scenario
    return done();
  } catch (err) {
    console.log(`Error occurred in function generatePostBody. Detail: ${err}`);
    throw err;
  }
};

/**
 * Loads post bodies from a csv file (unused; kept from my original CSV-based attempt)
 * @param {string} filePath - The path of the csv file
 */
const loadDataFromCsv = async filePath => {
  const rows = [];
  return new Promise((resolve, reject) => {
    fs.createReadStream(filePath)
      .pipe(csv({ delimiter: "||" }))
      .on("data", row => rows.push(row))
      .on("end", () => resolve(rows))
      .on("error", error => reject(error));
  });
};

/**
 * Loads post bodies from a text file, one JSON document per line
 * @param {string} path - The path of the text file
 */
const loadDataFromTxt = async (path) => {
  return new Promise((resolve, reject) => {
    fs.readFile(path, "utf8", (err, data) => {
      if (err) {
        return reject(err);
      }
      resolve(data.toString().split("\n"));
    });
  });
};

// Load data from the txt file once at the start of the execution
// and save the results in a module-level variable
(async () => {
  try {
    postData = await loadDataFromTxt(filePath);
  } catch (error) {
    console.log(`Error occurred in main. Detail: ${error}`);
  }
})();

module.exports.generatePostBody = generatePostBody;
```
{ "profile":{"name":"irfan","email":"irfan@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Tomas","email":"tomas@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Joel","email":"joel@email.com"},"address":["address1","address2"]}
HTH
hi @ianonymousdev, something like this should work:
```yaml
payload:
  - path: post-data.csv
    delimiter: "~"
    fields:
      - column1
    options:
      quote: ""
```
I'd suggest running your scripts with `artillery` locally first - that way you will see an actual error message rather than a generic error message from AWS Lambda.
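For example (assuming your script file is named my-script.yml):
```
artillery run my-script.yml
```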
Thanks @hassy. I have managed to solve this with a slightly different approach.
Hello @ianonymousdev, is it possible for you to share the approach you mentioned above?
Hi @neerajg5, I used @irfanatopt's solution to solve my problem.
Thanks @ianonymousdev, I too was able to follow @irfanatopt's solution. Thanks to @irfanatopt!
@irfanatopt's answer worked for me, I just had to change
```js
resolve(data.toString().split("\n"));
```
into
```js
resolve(data.toString().split("\n").filter(s => s !== ''));
```
I guess YMMV based on the line endings you use in your files (`\n` vs `\r\n`).
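A variant that tolerates either line ending without caring which one the file uses (a small untested sketch):
```js
// Split on \n or \r\n and drop blank lines in one pass
resolve(data.toString().split(/\r?\n/).filter(line => line.trim() !== ""));
```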
I've written a small plugin to do it on payload: artillery-plugin-json-include
Thanks man! I am using your plugin. I was having issues because even though I implemented the JS methods and everything, my 60,000-line JSON would not load into the body... until I used your plugin.
My lambda function, which I want to load test, expects a complex JSON body, and as per my understanding the payload needs to go into a CSV file. My problem is that I have tried various ways to load the JSON from the CSV but keep getting errors. I am not sure if this is even possible. My sample CSV document looks like:
```
column1
{ "profile":{"name":"irfan","email":"irfan@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Tomas","email":"tomas@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Joel","email":"joel@email.com"},"address":["address1","address2"]}
```
I only have one column because all I want is for this JSON document to get passed as the request body to my hello world lambda so I can load test it.
My Artillery script file looks like:
When I put double quotes around the keys and values in the JSON, I get an error saying: "Error executing task: ERROR exception encountered while executing load from 1579692773908 in 1579692773908: Artillery exited with non-zero code:"
Is there any way to load the JSON from the CSV such that my hello world lambda function receives the request body as JSON in the following format?
{ "data": { "profile":{"name":"irfan","email":"irfan@email.com"},"address":["address1","address2"]}}
Any help would be appreciated.