I'm looking to implement back-pressure in a project I'm working on, but I noticed that the default behaviour in this repository assumes the queue's concurrency will not change in the middle of a batch.
Here is an example to illustrate:
const queue = require('queue');

const q = queue();
let count = 0;

const job = () => {
  return new Promise((resolve) => {
    console.log(`start job ${++count}`);
    setTimeout(() => {
      console.log('finish, q length = ' + q.length);
      if (count === 3) {
        // bump concurrency up to 4 once the counter reaches 3
        q.concurrency = 4;
      } else if (count === 16) {
        // add 16 more jobs to the queue once the counter reaches 16
        q.push(...new Array(16).fill(job));
      } else if (count === 17) {
        // then reduce concurrency back to 1 (back-pressure)
        q.concurrency = 1;
      }
      resolve();
    }, Math.round(Math.random() * 1000) + 500);
  });
};

for (let i = 0; i < 19; ++i) {
  q.push(job);
}

// start with a concurrency of 1 (roughly one job per second, given each
// job takes 0.5-1.5s)
q.concurrency = 1;

q.start((err) => {
  if (err) {
    console.error(err);
  } else {
    console.log('yay');
    console.log(q);
  }
});
Note that after adding more items to the queue and reducing concurrency back down to 1 (at count = 17), we get some weird behaviour: all of the remaining jobs are fired in one go. Instead, I would expect the queue to keep processing one job at a time (roughly one per second) until it's finished.
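For reference, here's a minimal sketch of the dispatch behaviour I was expecting. This is not this library's implementation, just a hypothetical ThrottledQueue that re-reads `concurrency` before starting each job, so a mid-flight reduction takes effect immediately:

// Hypothetical sketch (not this library's code): a dispatcher that
// re-checks `concurrency` before every job start, so lowering it
// mid-batch immediately throttles the remaining jobs.
class ThrottledQueue {
  constructor () {
    this.concurrency = 1;
    this.running = 0;
    this.jobs = [];
  }

  push (...jobs) {
    this.jobs.push(...jobs);
    this._next();
  }

  _next () {
    // the limit is read fresh on every dispatch, never cached per batch
    while (this.running < this.concurrency && this.jobs.length > 0) {
      const job = this.jobs.shift();
      this.running++;
      Promise.resolve(job()).finally(() => {
        this.running--;
        this._next();
      });
    }
  }
}

With that shape, setting concurrency to 1 at count = 17 would only allow one new job to start per completion, which is the back-pressure behaviour I'm after.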