@Julusian @ronag do you have any idea what can be causing this?
Solved by setting the ffmpeg.producer.auto-interlace flag to all in casparcg.config.
We need to find the root cause of this. Even if it works by setting the auto-interlace flag (see previous comment), I'm still curious why this is the case when the file indicates it's interlaced according to its metadata.
Here is another hint. Tested with CasparCG server version [2.3.2 f01b1efa2 Dev], 1080i5994, HD-SDI output, Decklink driver 11.5.1, Windows 10.
Create an interlaced MXF clip with ffmpeg:

(NTSC country)
ffmpeg -f lavfi -i testsrc2=r=59.94:size=480x270 -filter_complex "scale=1920x1080[v1];[v1]tinterlace=4[v2];[v2]setfield=tff" -top 1 -c:v mpeg2video -g 15 -b:v 30000k -bufsize 30000k -maxrate 30000k -flags +ilme+ildct -an interlace.mxf

(PAL country)
ffmpeg -f lavfi -i testsrc2=r=50:size=480x270 -filter_complex "scale=1920x1080[v1];[v1]tinterlace=4[v2];[v2]setfield=tff" -top 1 -c:v mpeg2video -g 12 -b:v 30000k -bufsize 30000k -maxrate 30000k -flags +ilme+ildct -an interlace.mxf
Repeatedly load and play interlace.mxf in CasparCG server.
Record the playback signal with Blackmagic Design Media Express or another recording program.
Play back the recorded file with VLC without deinterlacing.
Suppose the field numbering starts from 0. Expected behaviour:
Wrong result: (you can see the abnormal frame)
Sometimes the wrong result comes out when you repeat the load and play, and sometimes the good result comes out. The attachments show good and bad results (frame 10 should be composed of field 10 + field 11; the 10th field number digit should not be interlaced).
I ran some PAL and 1080i5000 tests using server version 2.3.3_LTS (downloaded from the GitHub distribution). My results match those reported by sendust. The test content was played out of a CasparCG channel via a Blackmagic Design UltraStudio 4K as HD-SDI, recording the signal into a Blackmagic Hyperdeck Studio Mini using the ProRes 422 codec. The recorded videos were imported into FCPX, and single frames or sequences of frames were exported as PNG stills.
I created test signals with interlaced, progressive 25Hz and progressive 50Hz sources. With the CasparCG channel set to PAL or 1080i5000, playing the test footage showed about a 50% probability of the SDI output having bad interlace, in which an output frame is composed of field 2 from source frame 1 and field 1 from source frame 2.
My test videos are available at https://github.com/amwtech/CasparCG_Test_Signals. They are designed to make it easy to identify the temporal source, and to see if spatial offset has been used on a field.
An example with correct interlacing is shown below.
The same test signal played out with wrong interlacing is shown below.
Careful examination of the patterns of lines in the blocks shows the fields are spatially offset from their source positions to give correct temporal sequence. The offset is visible because the top and bottom lines in the block have gone grey.
The progressive 25Hz source test signal also has around a 50% chance of producing poor output. The example below shows a frame with good interlacing.
And the picture below shows the same source file output with bad interlacing.
The progressive test pattern comprises two sets of white lines, each one scan line high, that overlap horizontally. The block on the left is labelled X and has its lines on what would be field 1 in interlaced operation. The block labelled Y is set for display on field 2 of an interlaced output. The bad interlaced output from the progressive source matches the temporal content makeup seen in the bad interlaced output from the interlaced source.
I also tested a source video with 50Hz progressive content. All tests showed the same result, with a correctly interlaced 1080-line output being created from 540 lines from frame 0 and 540 lines from frame 1. The source frame pairings always started with an even-numbered source frame (first output frame made from source frames 0 and 1, next output frame made from source frames 2 and 3, etc.). This conversion shows good processing and is optimal for real-time replay in the i25 channel.
I also did some tests of the SDI output from a channel using JavaScript and the requestAnimationFrame() callback to move a pattern of white lines. I observed that in a channel mode set at 1080i5000 the requests occurred every 20ms (i.e. field rate). Requests happened every 40ms for a channel mode set at 1080p2500, and 20ms for a channel mode set at 1080p5000.
I tried making a graphic that used frame-based animation, using alternate calls to requestAnimationFrame() to advance the animation. This process also showed a 50% probability of not aligning to the SDI output frame, producing the output illustrated below. This non-alignment is expected because there is no JavaScript requestAnimationFrame callback that identifies when an output frame is starting; the callback just identifies that a field is starting.
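To show what I mean by alternate calls, here is a minimal sketch of the idea (my own illustration in TypeScript, not the actual template; the element id "bar" and the paint routine are made up). The animation is advanced only on every second requestAnimationFrame() callback, because in a 1080i5000 channel the callbacks arrive once per field, and nothing tells the script whether a given callback is field 1 or field 2.

// Minimal sketch of frame-stepped animation in a CEF template for a
// 1080i5000 channel: callbacks arrive roughly every 20ms (per field),
// so the animation advances only on every second callback (per frame).
let callbackCount = 0;

function paintFrame(frameIndex: number): void {
  // Hypothetical paint routine: move a bar of white lines down the raster.
  const bar = document.getElementById('bar');
  if (bar) {
    bar.style.transform = `translateY(${(frameIndex * 2) % 1080}px)`;
  }
}

function onField(): void {
  if (callbackCount % 2 === 0) {
    paintFrame(callbackCount / 2); // advance once per pair of field callbacks
  }
  callbackCount += 1;
  requestAnimationFrame(onField);
}

requestAnimationFrame(onField);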
I mused on what could be happening, so the following is speculation (I'm not yet sufficiently familiar with the timing processes in the server C++ code and the buffer passing from producers to consumers).
My understanding from other postings here and in the CasparCG forum is that Server version 2.3 uses a progressive process through the channel mixer, thus converting an interlaced source frame into two progressive frames at double the source frame rate. If the mixer is set to "pass all" (no processing required) the SDI consumer strips out the unwanted extra lines inserted by the producer, turning the two progressive frames back into a single output frame.
How does the process pipeline know the source field id from the progressive frame formed by that field? If this data is not available, how can the consumer know which progressive frame is originally a field type 1 enabling correct reconstruction of a source frame? A slip of one progressive frame in the process chain would create the stable merge of content from two source frames we see in practical testing.
I would also suggest that there is a requirement in the CEF code to add a special requestAnimationFrame() callback that is called only when there is an output interlaced frame starting, thus synchronising the CEF animation to the SDI field-ordered output.
Andy Woodhouse.
I am looking into this for a client, as part of investigating moving to 2.3 from 2.1. We have observed the same field order mismatch when playing interlaced clips, and have also found that when looping clips it will often switch between the two states. While it can be 'resolved' by forcing deinterlacing for the clips, this does mean that sometimes the output will be purely taken from the interpolated rather than source data, and so will be rather lossy.
My understanding from other postings here and in the CasparCG forum is that Server version 2.3 uses a progressive process through the channel mixer, thus converting an interlaced source frame into two progressive frames at double the source frame rate. If the mixer is set to "pass all" (no processing required) the SDI consumer strips out the unwanted extra lines inserted by the producer, turning the two progressive frames back into a single output frame.
Yes, when doing 50i, the mixer is running 50p and the decklink consumer interleaves two of the mixed frames into one for output.
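To make that concrete, here is a rough sketch of the interleave (my own illustration in TypeScript, not the actual C++ decklink consumer code; the packed-frame layout and top-field-first ordering are assumptions):

// Weave two mixed progressive frames into one interlaced output frame by
// taking alternate lines from each (top field first assumed). Each frame
// is assumed to be height * lineBytes of packed pixel data.
function weaveFields(fieldOne: Uint8Array, fieldTwo: Uint8Array,
                     height: number, lineBytes: number): Uint8Array {
  const out = new Uint8Array(height * lineBytes);
  for (let line = 0; line < height; line++) {
    // Even lines come from the first mixed frame, odd lines from the second.
    const src = line % 2 === 0 ? fieldOne : fieldTwo;
    out.set(src.subarray(line * lineBytes, (line + 1) * lineBytes), line * lineBytes);
  }
  return out;
}

If the pairing slips by one mixed frame, field 2 of source frame N gets woven with field 1 of source frame N+1, which is exactly the stable wrong-field output shown in the pictures above.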
How does the process pipeline know the source field id from the progressive frame formed by that field? If this data is not available, how can the consumer know which progressive frame is originally a field type 1 enabling correct reconstruction of a source frame? A slip of one progressive frame in the process chain would create the stable merge of content from two source frames we see in practical testing.
Quite simply, it doesn't know. And from a bit of looking at how the ffmpeg producer is set up, it can't know. It is set up, within the filter, to convert the framerate to the 50p that the channel is running at, and optionally deinterlace the clip. I don't see how it could give us the needed metadata to know which field a frame belongs to. The best option I can see is to count the frames it gives us and pair them up as field1 and field2.
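As a sketch of that counting idea (hypothetical, not code from any PR), each progressive frame coming out of the producer could be given a field identity derived from a running counter, which the consumer could then use to pair frames back into interlaced output:

// Hypothetical frame tagging by counting: the ffmpeg producer cannot report
// which source field a mixed progressive frame came from, so derive it from
// a running counter and carry it as metadata alongside the frame data.
type FieldTag = 'field1' | 'field2' | 'progressive';

interface TaggedFrame {
  data: Uint8Array;
  field: FieldTag;
}

function makeFrameTagger(interlacedSource: boolean): (data: Uint8Array) => TaggedFrame {
  let count = 0;
  return (data: Uint8Array): TaggedFrame => {
    const field: FieldTag = !interlacedSource
      ? 'progressive'
      : count % 2 === 0
        ? 'field1' // even frames assumed to carry field 1 content
        : 'field2';
    count += 1;
    return { data, field };
  };
}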
I have a couple of ideas on how to propagate this throughout the channel that I shall experiment with. Both require changing method signatures on producers and consumers, so it is going to be quite a sprawling change.
I would also suggest that there is a requirement in the CEF code to add a special requestAnimationFrame() callback that is called only when there is an output interlaced frame starting, thus synchronising the CEF animation to the SDI field-ordered output.
I'm not sure that we can do that. CEF is very bad at giving frames on a timer. If nothing has changed it doesn't provide us with a frame, and, combined with the async nature of it (we call requestAnimationFrame() and some time later we get a frame), we wouldn't be able to reliably figure out whether what we were given by CEF was field1 or field2.
Julusian, thanks for the response, very useful data in there.
I saw the note from dotarmin earlier in this thread about setting the ffmpeg.producer.auto-interlace flag to all in casparcg.config. I assume this is the ffmpeg producer tag as in:
<ffmpeg>
  <producer>
    <auto-deinterlace>all</auto-deinterlace>
    <threads>4</threads>
  </producer>
</ffmpeg>
The above is the setting extracted from my test channel casparcg.config. Have I missed a potential setting?
I created another three signals to help look at the looping operations. These are 4-second loops (available at the same location as the other test signals above), so the looping is quite fast, but the embedded audio tone is set to have precise phase jumps if a field or frame is dropped or repeated. The phase jump makes the presence of an issue very clear.
I checked the looping of a channel in 1080i5000, 1080p2500 and 1080p5000 and found all of them fail to output the final time block in the video (Issue #1365 applies, I think). In interlaced operation this drops the final field, thus forcing an interlace mode jump at the loop point and causing a quality switch and a potential temporal discontinuity. Seeing a temporal discontinuity on an in-vision monitor feed was the trigger for my investigations.
If only we could all drop this 1920s bandwidth control called interlace and stick to progressive production and distribution, life would be much easier!
Andy
I have a PR open for this: #1440. I would appreciate some further testing from those who have experienced this issue before I merge it.
I ran a series of tests on both interlaced and progressive channel timelines. No issues related to this test build were seen.
Testing method: CasparCG channel 1 outputs via Thunderbolt to a Blackmagic UltraStudio 4K. The SDI output of the UltraStudio connects directly to the SDI input of a Blackmagic Hyperdeck. The Hyperdeck is configured to record using the ProRes 422 LT codec. Recorded files are transferred by FTP to an iMac, where they are examined in QuickTime Player and/or DJV2 (a multiplatform player tool). The CasparCG playout and Hyperdeck recording are controlled by an Elgato Stream Deck running Bitfocus Companion software, programmed to provide consistent test timing.
The video source files use test patterns that identify the source field and frame. The audio is tone that causes a break/click if any field or frame is repeated or dropped.
Most tests started with the source file loaded in the channel prior to the play command. A few tests examined behaviour when the channel was initially empty and the play command caused the file to load in the channel. (See Note 1 below)
Overall result: PASS. Source files that match the timeline properties (e.g. a 1080i50 source on a 1080i5000 channel) show content from the correct source frame. Tests were made for PAL, 1080i5000 and 1080p5000 channel operations.
Tests included playing the clip once and in loop mode. Looping in server 2.3.3_LTS causes a toggle between correct source frames and interpolated source frames. This test build correctly maintains the source frame structure. (See Note 2 below.)
Non-native source files, for example a 1080p25 source on a 1080i5000 channel, have an optimal mapping from source media to channel output.
Note 1: Some issues were seen and heard when the channel started empty and the play command caused the file to be loaded and immediately played. Typically this is seen as a small stutter near the start of the file. A 1080i50 source file played in a 1080i5000 CasparCG channel output the sequence below:
Frame 0 – field 1 interpolated
Frame 0 – full frame
Frame 1 – full frame
Frame 2 – field 1 interpolated
Frame 2 – full frame
Frame 3 – full frame
Frame 4 – full frame
etc.
I think I have seen the same issues on other versions of CasparCG server.
Note 2: Issue #1365 reported that the final frame was not played. With the test build this shows in loop playback. For an interlaced channel and interlaced source file the final output of a loop is an interpolated field 1 of source frame content instead of the full frame. For a progressive channel the final frame of the source is not shown. Both examples have audio breakup because of the missing content.
I have merged the PR that I believe will resolve this, and in my testing has ensured playback uses the correct field alignment. I'm going to close this issue unless someone still has issues.
So far this field alignment is only supported by the decklink producer & consumer, and the ffmpeg producer. Any other producers and consumers are not looking at the field alignment information, so will likely behave as before. I am expecting them to be fixed up as needed in further PRs by someone who knows how to test them.
Expected behaviour
Interlaced material should play fields in correct order.
Current behaviour
Incorrect field order when playing interlaced material.
Notes
The attached file has been rendered with the latest After Effects. The issue can be reproduced easily by rendering a new file.
Steps to reproduce
Environment
Attachments
LAS-Kortbumper1-NoLogo-Caspar.mov.zip