Closed by ibc, 8 months ago.
CI failing here, interesting: https://github.com/versatica/mediasoup/actions/runs/7412001908/job/20167817650?pr=1298
It doesn't fail on macOS, only on Linux. This is the failing test, with "InvalidStateError: Channel closed":
```ts
test('Consumer emits "score"', async () =>
{
	const audioConsumer = await ctx.transport2!.consume(
		{
			producerId      : ctx.audioProducer!.id,
			rtpCapabilities : ctx.consumerDeviceCapabilities
		});

	// Private API.
	const channel = audioConsumer.channelForTesting;
	const onScore = jest.fn();

	audioConsumer.on('score', onScore);

	// Simulate a 'score' notification coming through the channel.
	const builder = new flatbuffers.Builder();
	const consumerScore = new FbsConsumer.ConsumerScoreT(9, 10, [ 8 ]);
	const consumerScoreNotification =
		new FbsConsumer.ScoreNotificationT(consumerScore);
	const notificationOffset = Notification.createNotification(
		builder,
		builder.createString(audioConsumer.id),
		Event.CONSUMER_SCORE,
		NotificationBody.Consumer_ScoreNotification,
		consumerScoreNotification.pack(builder)
	);

	builder.finish(notificationOffset);

	const notification = Notification.getRootAsNotification(
		new flatbuffers.ByteBuffer(builder.asUint8Array())
	);

	channel.emit(audioConsumer.id, Event.CONSUMER_SCORE, notification);
	channel.emit(audioConsumer.id, Event.CONSUMER_SCORE, notification);
	channel.emit(audioConsumer.id, Event.CONSUMER_SCORE, notification);

	expect(onScore).toHaveBeenCalledTimes(3);
	expect(audioConsumer.score).toEqual(
		{ score: 9, producerScore: 10, producerScores: [ 8 ] });
}, 2000);
```
NOTE: It doesn't fail in Docker Ubuntu 22.04 in my host.
UPDATE: Ignore this comment.
I'll ask an important question, maybe too late: is it guaranteed that all Node tests in the same file run one after another, from top to bottom? All the `TestContext` stuff is done assuming that. Or does Jest run `test()` calls in the same file in parallel? If so, we would need a separate `const ctx: TestContext` for each test rather than a global one.

AUTO-ANSWER: it's guaranteed that all tests within a file run sequentially. So false alarm. Ignore this entire comment.
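For reference, the sequential guarantee can be illustrated with a tiny runner sketch (this is NOT Jest's actual implementation; `test` and `runFileSequentially` below are illustrative names only):

```typescript
// Minimal sketch of why tests in one file can safely share a mutable
// context: the runner awaits each registered callback before starting the
// next one, so two tests never overlap.
type TestFn = () => Promise<void> | void;

const registered: { name: string; fn: TestFn }[] = [];

function test(name: string, fn: TestFn): void
{
	registered.push({ name, fn });
}

async function runFileSequentially(): Promise<string[]>
{
	const order: string[] = [];

	for (const t of registered)
	{
		order.push(`start:${t.name}`);
		await t.fn(); // the next test only starts once this one settles
		order.push(`end:${t.name}`);
	}

	return order;
}

// A single shared context, as TestContext is used in the real test files.
let ctx = 0;

test('first', async () => { ctx += 1; });
test('second', async () => { ctx += 1; });
```

Running `runFileSequentially()` produces `start:first, end:first, start:second, end:second`, never interleaved, which is why a single shared `ctx` works.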
Interestingly, a similar test in `test-Producer.ts` works reliably:
```ts
test('Producer emits "score"', async () =>
{
	const videoProducer =
		await ctx.transport2!.produce(ctx.videoProducerParameters);

	// Private API.
	const channel = videoProducer.channelForTesting;
	const onScore = jest.fn();

	videoProducer.on('score', onScore);

	// Simulate a 'score' notification coming through the channel.
	const builder = new flatbuffers.Builder();
	const producerScoreNotification = new FbsProducer.ScoreNotificationT(
		[
			new FbsProducer.ScoreT(
				/* encodingIdx */ 0,
				/* ssrc */ 11,
				/* rid */ undefined,
				/* score */ 10
			),
			new FbsProducer.ScoreT(
				/* encodingIdx */ 1,
				/* ssrc */ 22,
				/* rid */ undefined,
				/* score */ 9
			)
		]);
	const notificationOffset = Notification.createNotification(
		builder,
		builder.createString(videoProducer.id),
		Event.PRODUCER_SCORE,
		NotificationBody.Producer_ScoreNotification,
		producerScoreNotification.pack(builder)
	);

	builder.finish(notificationOffset);

	const notification = Notification.getRootAsNotification(
		new flatbuffers.ByteBuffer(builder.asUint8Array())
	);

	channel.emit(videoProducer.id, Event.PRODUCER_SCORE, notification);
	channel.emit(videoProducer.id, Event.PRODUCER_SCORE, notification);
	channel.emit(videoProducer.id, Event.PRODUCER_SCORE, notification);

	expect(onScore).toHaveBeenCalledTimes(3);
	expect(videoProducer.score).toEqual(
		[
			{ ssrc: 11, rid: undefined, score: 10, encodingIdx: 0 },
			{ ssrc: 22, rid: undefined, score: 9, encodingIdx: 1 }
		]);
}, 2000);
```
It's random: sometimes it works on Ubuntu 22.04 and then fails on Ubuntu 20.04.
So it fails even like this:
```ts
test('Consumer emits "score"', async () =>
{
	const audioConsumer = await ctx.transport2!.consume(
		{
			producerId      : ctx.audioProducer!.id,
			rtpCapabilities : ctx.consumerDeviceCapabilities
		});

	// Private API.
	// const channel = audioConsumer.channelForTesting;
	// const onScore = jest.fn();

	// audioConsumer.on('score', onScore);

	// Simulate a 'score' notification coming through the channel.
	const builder = new flatbuffers.Builder();
	const consumerScore = new FbsConsumer.ConsumerScoreT(9, 10, [ 8 ]);
	const consumerScoreNotification =
		new FbsConsumer.ScoreNotificationT(consumerScore);
	const notificationOffset = Notification.createNotification(
		builder,
		builder.createString(audioConsumer.id),
		Event.CONSUMER_SCORE,
		NotificationBody.Consumer_ScoreNotification,
		consumerScoreNotification.pack(builder)
	);

	builder.finish(notificationOffset);

	// const notification = Notification.getRootAsNotification(
	// 	new flatbuffers.ByteBuffer(builder.asUint8Array())
	// );

	// channel.emit(audioConsumer.id, Event.CONSUMER_SCORE, notification);
	// channel.emit(audioConsumer.id, Event.CONSUMER_SCORE, notification);
	// channel.emit(audioConsumer.id, Event.CONSUMER_SCORE, notification);

	// expect(onScore).toHaveBeenCalledTimes(3);
	// expect(audioConsumer.score).toEqual(
	// 	{ score: 9, producerScore: 10, producerScores: [ 8 ] });
}, 2000);
```
This doesn't make any sense.
BTW the stderr of the test says:
```
● Consumer emits "score"

  InvalidStateError: Channel closed

    399 | 	close : () =>
    400 | 	{
  > 401 | 		pReject(new InvalidStateError('Channel closed'));
        | 		^
    402 | 	}
    403 | };
    404 |

    at Object.close (node/src/Channel.ts:401:14)
    at Channel.close (node/src/Channel.ts:232:9)
    at Worker.close (node/src/Worker.ts:570:17)
    at Object.<anonymous> (node/src/test/test-Consumer.ts:277:14)
```
Line 277 in `test-Consumer.ts` is this one:

```ts
afterEach(() =>
{
	ctx.worker?.close(); // <--- line 277
});
```
It looks like `worker.close()` is **crashing**??? Because the worker was already closed (due to a crash) but Node didn't know it yet???
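If that is what's happening, one mitigation is to make `close()` idempotent, so closing an already-dead worker in `afterEach` becomes a no-op instead of rejecting pending channel requests again. This is only a sketch; `FakeWorker` is an illustrative stand-in, not mediasoup's actual `Worker` class:

```typescript
// Illustrative close() guard. The real mediasoup Worker tracks similar
// closed state; this class exists only for the sketch.
class FakeWorker
{
	private _closed = false;

	get closed(): boolean
	{
		return this._closed;
	}

	close(): void
	{
		// Idempotent: if the worker is already closed (e.g. its subprocess
		// died), there is nothing left to tear down.
		if (this._closed)
		{
			return;
		}

		this._closed = true;

		// ...here the real implementation would reject pending channel
		// requests and terminate the worker subprocess.
	}
}
```

With a guard like this, `ctx.worker?.close()` in `afterEach` cannot throw merely because the worker already died mid-test.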
The main difference between this branch and v3 in `test-Consumer.ts` is that in v3 we create a single worker for the whole test file, whereas in this branch we create a separate worker per `test()` and close it in the `afterEach` hook.
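The per-test lifecycle used in this branch can be sketched as follows (the worker here is a trivial stand-in; mediasoup's real `createWorker()` is async and spawns a subprocess, and the hook names mirror Jest's `beforeEach`/`afterEach`):

```typescript
// Stand-in worker for the sketch; not mediasoup's Worker.
class StubWorker
{
	closed = false;

	close(): void
	{
		this.closed = true;
	}
}

// This branch: a fresh worker per test(), torn down in afterEach even if
// the test body failed.
let worker: StubWorker | undefined;

function beforeEachHook(): void
{
	worker = new StubWorker();
}

function afterEachHook(): void
{
	worker?.close();
	worker = undefined;
}

// v3 style, by contrast, would create one worker in beforeAll and close
// it once in afterAll, so all tests in the file share it.
```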
I would run the tests locally in Docker with logs enabled, filtering tests to dig further. I can't say off the top of my head what may be happening, but it does seem there may be a crash, and that's why the worker is closed?
As I said above, I cannot reproduce it in Docker. BTW: https://github.com/versatica/mediasoup/actions
I'll do one more thing.
So the problem was that an unhandled promise rejection in one test makes Jest fail later in some other, seemingly unrelated test.
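The failure mode is easy to reproduce in isolation: a rejection with no handler attached is reported asynchronously, so the runner attributes it to whatever test happens to be executing when it surfaces. The fix is to always attach a handler (or `await`) on every promise a test creates. `requestAndForget` and `requestAndObserve` below are illustrative names, not mediasoup API:

```typescript
// Problematic pattern: fire-and-forget. If the channel later rejects the
// pending promise (e.g. with 'Channel closed'), nothing handles it and
// Node raises an unhandledRejection on some later tick, in a random test.
function requestAndForget(pending: Promise<void>): void
{
	void pending; // no .catch()/await: a late rejection goes unhandled
}

// Safe pattern: always observe the promise, so a late rejection is
// handled inside the test that actually created it.
function requestAndObserve(pending: Promise<void>): Promise<void>
{
	return pending.catch(() =>
	{
		// Channel closed during teardown; expected in these tests.
	});
}
```

This matches the adaptation below: the tests were changed so that no promise is left dangling when the worker is closed in `afterEach`.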
Tests adapted: