swaywm / wlroots

A modular Wayland compositor library
https://gitlab.freedesktop.org/wlroots/wlroots/
MIT License

Interlaced output modes #3038

Open ElkMonster opened 3 years ago

ElkMonster commented 3 years ago

Hi,

I just wanted to report that there's a need for interlaced output mode support. Interlaced modes are currently skipped (per https://github.com/swaywm/wlroots/pull/1382, as a solution to https://github.com/swaywm/sway/issues/3167). In the commit, the author says: "a better solution could be made if there's a need to".

My use case is rather special, so I'd understand if my request should be declined. Here it is:

The company I work for creates content for various TV stations around the world. Some TV stations require very specific ways of content delivery. In this particular case, we have to provide a PC that provides the content via SDI (Serial Digital Interface) at 1080i50, using an HDMI to SDI adapter. We're using Sway on that machine because it allows us to easily control the output window exactly as we want to (activate fullscreen, move to a specific display, hide mouse cursor).

For now, I've created a patched version of wlroots that simply skips progressive modes instead of interlaced modes, and installed it on the box in question. We'll be able to live with that hack because the box will probably rarely be touched; however, I'd be happy to be able to use upstream wlroots in the future.
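
In case it's useful, the hack amounts to inverting one test in the DRM backend's mode enumeration. Roughly (a sketch, not the actual patch; the libdrm names are real, the surrounding function is invented for illustration):

```c
/* Sketch of the local hack, not the actual patch: when enumerating a
 * connector's modes in the DRM backend, skip progressive modes instead
 * of interlaced ones. drmModeConnector, drmModeModeInfo, and
 * DRM_MODE_FLAG_INTERLACE are real libdrm names (xf86drmMode.h); the
 * surrounding function is made up for illustration. */
#include <xf86drmMode.h>

static void advertise_interlaced_modes_only(const drmModeConnector *conn) {
    for (int i = 0; i < conn->count_modes; i++) {
        const drmModeModeInfo *mode = &conn->modes[i];
        if (!(mode->flags & DRM_MODE_FLAG_INTERLACE)) {
            continue; /* upstream skips interlaced modes; this inverts the test */
        }
        /* ... wrap *mode in a wlr_output_mode and advertise it ... */
    }
}
```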

Thanks for your consideration,
Sascha


wlroots has migrated to gitlab.freedesktop.org. This issue has been moved to:

https://gitlab.freedesktop.org/wlroots/wlroots/-/issues/3038

emersion commented 3 years ago

I'd rather not add support for a dying standard. You're the first one to request this since the creation of wlroots.

sjnewbury commented 3 years ago

@emersion I'm not sure calling interlaced display modes a dying standard is really accurate. It's very much the predominant broadcast standard because it allows the maximum definition and temporal resolution in the least bandwidth. As long as bandwidth is limited and there is a push for higher definition screens, broadcasters and videographers will continue to use interlacing.

It never really was a standard with respect to PC displays, despite a short period in the 90s as a cheap way to achieve 1024x768 on limited hardware. If wlroots is purely a PC technology, I think you're probably right not to add support, since it's unlikely anybody would use hardware incapable of generating display output with sufficient bandwidth to drive their screen at full resolution. (Up until recently I was actually running a patched MythTV frontend on an Atom Pineview system with an old LCD TV that only supported 1920x1080i, and prior to that I used a CRT with a Radeon with interlaced field synchronisation!)

On the other hand, making wlroots useful to broadcasters and video production might not be a bad idea. Even if there hasn't been visible demand for it, I suspect it would be taken up quite eagerly by that particular niche if they came to know what could be achieved.

Just some thoughts...

emersion commented 3 years ago

cc @cyanreg for the video/broadcasting side of things

cyanreg commented 3 years ago

> As long as bandwidth is limited and there is a push for higher definition screens, broadcasters and videographers will continue to use interlacing.

That's plain wrong. 4k video is never interlaced. HEVC and AV1 don't even support interlacing. 720p video was never interlaced. A historical quirk made 1080i video the standard in professional broadcasting, and that too is on its way out. All modern TVs made in the past 16 years support progressive content. In fact, true interlaced monitors have not existed since CRTs were phased out. Nowadays deinterlacing algorithms butcher the image in such a way that it's impossible to justify the alleged savings. Even OTT set-top boxes output progressive. Enabling support for interlacing in wlroots would do more harm than good to anyone wanting to hook up a TV to their computer, since DRM makes it easy for API users to pick such modes. That's the reason it was disabled: for a lot of TVs, wlroots would pick 1080i59.94 instead of 1080p60, despite both being advertised as supported.
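
To illustrate: as far as I know, the kernel's refresh calculation reports an interlaced mode's *field* rate as vrefresh, so 1080i60 and 1080p60 both look like 1920x1080 at ~60 Hz. A sketch of the check a picker would need (mode_is_better is a hypothetical helper, not wlroots code):

```c
/* Hypothetical mode comparison illustrating the trap: an interlaced
 * mode's vrefresh is its *field* rate, so 1080i60 and 1080p60 both
 * report 1920x1080 at ~60 Hz. A naive "biggest and fastest wins"
 * picker must explicitly deprioritize interlaced modes. */
#include <stdbool.h>
#include <xf86drmMode.h>

static bool mode_is_better(const drmModeModeInfo *a, const drmModeModeInfo *b) {
    bool a_il = a->flags & DRM_MODE_FLAG_INTERLACE;
    bool b_il = b->flags & DRM_MODE_FLAG_INTERLACE;
    if (a_il != b_il)
        return !a_il; /* always prefer the progressive mode */
    if (a->hdisplay * a->vdisplay != b->hdisplay * b->vdisplay)
        return a->hdisplay * a->vdisplay > b->hdisplay * b->vdisplay;
    return a->vrefresh > b->vrefresh;
}
```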

> On the other hand, making wlroots useful to broadcasters and video production might not be a bad idea. Even if there hasn't been visible demand for it, I suspect it would be taken up quite eagerly by that particular niche if they came to know what could be achieved.

What you're wanting to do with wlroots is a hack, both in the broadcasting domain and in the desktop compositor domain. Please, use proper tools instead. Yes, you have to pay for a lot of them, but anyone who's working in broadcasting has enough to pay for them. Hacking open source projects and offloading the maintenance burden isn't exactly endearing. We carry a lot of such burden at FFmpeg. And no burden is heavier than old broadcasting gear burden.

> The company I work for creates content for various TV stations around the world. Some TV stations require very specific ways of content delivery. In this particular case, we have to provide a PC that provides the content via SDI (Serial Digital Interface) at 1080i50, using an HDMI to SDI adapter. We're using Sway on that machine because it allows us to easily control the output window exactly as we want to (activate fullscreen, move to a specific display, hide mouse cursor).

That's such a duct-taped setup that my friends who work in broadcasting would unceremoniously dismiss it.

sjnewbury commented 3 years ago

@cyanreg Fair enough.

I'm a little surprised, though, to read that 1080i is "on the way out". Technologically superseded, for sure, but 4K/60Hz progressive isn't going to replace the vast swathe of satellite, cable, and digital terrestrial channels out there anytime soon; there just isn't enough bandwidth, and the installed base is too significant. I've never had an STB which supports 1080p output, although I'm sure the latest 4K models do. 720p was never a broadcast standard in most of the world; I've never seen it myself outside of YouTube!

Here in the UK: "All HD channels in the UK broadcast at 1080i, apart from BT Sport Ultimate and Virgin TV Ultra HD which are broadcast at 4K.[1] HD channels can dynamically switch between 1080i/25 and 1080p/25 when broadcast via Freeview HD." (Wikipedia). 1080i at 50/60 fields/s is superior to 1080p at 25/30 frames/s where temporal resolution is needed, even with imperfect de-interlacing techniques.

Anyway, not arguing for the feature, I was just a little surprised with what you wrote and your categorical rejection of my perspective on broadcast tech.

Edit: I also don't see why F/OSS shouldn't be used in professional environments generally. It is used in a great many; I'm not sure why broadcasting/TV production should be an exception.

kennylevinsen commented 3 years ago

We are not opposing the use of FOSS in the broadcast industry in any way.

First of all, I'll quote Wikipedia regarding 720i:

> 720i (720 lines interlaced) is an erroneous term found in numerous sources and publications. Typically, it is a typographical error in which the author is referring to the 720p HDTV format. However, in some cases it is incorrectly presented as an actual alternative format to 720p.[3] No proposed or existing broadcast standard permits 720 interlaced lines in a video frame at any frame rate.

Back on topic: we just do not find there to be any technical merit in supporting interlaced output modes, as no modern outputs natively display interlaced content, and de-interlacing is terrible. The incidental analogue bandwidth optimization it grants as a side-effect of older CRT operation is only relevant to analogue broadcasting. Both CRTs and analogue broadcasting are indeed on their way out.

Catering to this would be extremely niche for wlroots (maybe you alone need it), so we'd prefer not to take the maintenance hit and the general user problems it would cause upstream. Instead, if you want to use this, I'd recommend patching locally to undo the PR. Note that you will probably get additional artifacts if you display deinterlaced content like this.

For digital broadcasting, bandwidth optimization of digital content should be done by tuning the encoder used for compressing your content, not by interlacing. I am fairly confident that modern codecs will present far superior "temporal resolution" to any interlace hacks with less bandwidth (they can do tricks like shifting content), and that interlacing probably only causes trouble for the encoder. I doubt its use here could be for anything other than historic reasons.

sjnewbury commented 3 years ago

> We are not opposing the use of FOSS in the broadcast industry in any way.

I was responding to:

> What you're wanting to do with wlroots is a hack, both in the broadcasting domain and in the desktop compositor domain. Please, use proper tools instead. Yes, you have to pay for a lot of them, but anyone who's working in broadcasting has enough to pay for them.

> First of all, I'll quote Wikipedia regarding 720i:

> > 720i (720 lines interlaced) is an erroneous term found in numerous sources and publications. Typically, it is a typographical error in which the author is referring to the 720p HDTV format. However, in some cases it is incorrectly presented as an actual alternative format to 720p.[3] No proposed or existing broadcast standard permits 720 interlaced lines in a video frame at any frame rate.

I never mentioned 720i. I just mentioned that 720p wasn't used here in the UK. Broadcasters were given the choice with the rollout of HDTV, and the two use similar bandwidth, but 1080i has more motion information and higher resolution. 720p probably didn't seem like such a step up from 720x576i either, which was the SDTV format here (and already worse than analogue PAL).

- 720p (1280×720), progressive scan: 921,600 pixels (~0.92 MP)
- 1080i (1920×1080), interlaced scan: 1,036,800 pixels per field (~1.04 MP)
- 1080p (1920×1080), progressive scan: 2,073,600 pixels (~2.07 MP)

This is all historical (analogue/MPEG-2/H.264). Modern codecs don't support interlacing because people generally don't use interlace-capable display hardware anymore, i.e. CRTs. If you have to de-interlace the incoming video, you may as well do full-frame digital interpolation too, so that's what modern TVs do. The installed base of existing technology just isn't going to be replaced anytime soon.

On the other hand, in my area of interest, retro systems, arcade emulators in particular preferentially drive CRTs at the original resolution, and use interlaced modes where higher vertical resolutions are needed, as the original hardware did. It is also possible to field-synchronize interlaced video content so that no de-interlacing is required, for example from LaserDisc.

ascent12 commented 3 years ago

A long time ago, I tried to find out how it even works at the kernel API level. Do we present 2 full frames and the kernel takes the alternating lines from each? Does the kernel take 1 frame and it presents it as 2 fields? Does it expect already interlaced content? It's not obvious from the docs or the API at all, and I really didn't care enough to investigate further and said "fuck it, it's too niche to even bother".

But even if I did find out, there is currently no mechanism on the Wayland side to communicate to clients that we're using an output with interlacing and they wouldn't be able to take advantage of it. The only thing is the "interlaced" flag in linux-dmabuf, but that's the client telling the compositor they have interlaced content, not the compositor requesting it.
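
For reference, those flags as they appear in the generated linux-dmabuf-unstable-v1 client header:

```c
/* The flags in question, from the generated linux-dmabuf-unstable-v1
 * client header. They let a client *declare* that a buffer's content
 * is interlaced; there is no counterpart for the compositor to
 * *request* interlaced content from clients. */
enum zwp_linux_buffer_params_v1_flags {
    ZWP_LINUX_BUFFER_PARAMS_V1_FLAGS_Y_INVERT = 1,     /* contents are y-inverted */
    ZWP_LINUX_BUFFER_PARAMS_V1_FLAGS_INTERLACED = 2,   /* content is interlaced */
    ZWP_LINUX_BUFFER_PARAMS_V1_FLAGS_BOTTOM_FIRST = 4, /* bottom field displayed first */
};
```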

It's certainly not impossible for it to be added upstream, but I imagine you'll just run into a lot of apathy.

sjnewbury commented 3 years ago

> A long time ago, I tried to find out how it even works at the kernel API level. Do we present 2 full frames and the kernel takes the alternating lines from each? Does the kernel take 1 frame and it presents it as 2 fields? Does it expect already interlaced content? It's not obvious from the docs or the API at all, and I really didn't care enough to investigate further and said "fuck it, it's too niche to even bother".

The framebuffer is full resolution, updated at the full frame rate as normal; the CRTC is programmed via the interlaced flag to scan out alternate scan lines from each subsequent frame. Presentation of interlaced content has always been somewhat neglected**: it requires driver support to synchronize the fields to the presented content, via a hardware overlay with XVideo. This was never upstreamed, but see the patches here:

http://lowbyte.de/vga-sync-fields/vga-sync-fields/patches/
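
To put numbers on the scanout behaviour, here's the published CEA-861 timing for 1080i50 written out as a libdrm mode; the timing values are standard, the comments are my reading of how the CRTC consumes them:

```c
/* CEA-861 1080i50 timing as a libdrm mode. vdisplay is the full
 * 1080-line frame; with the INTERLACE flag set, the CRTC emits one
 * 540-line field per vblank, taking alternate lines from the
 * currently presented full-resolution framebuffer. */
#include <xf86drmMode.h>

static const drmModeModeInfo mode_1080i50 = {
    .clock = 74250, /* kHz */
    .hdisplay = 1920, .hsync_start = 2448, .hsync_end = 2492, .htotal = 2640,
    .vdisplay = 1080, .vsync_start = 1084, .vsync_end = 1094, .vtotal = 1125,
    .vrefresh = 50, /* field rate; full frames complete at 25 Hz */
    .flags = DRM_MODE_FLAG_PHSYNC | DRM_MODE_FLAG_PVSYNC | DRM_MODE_FLAG_INTERLACE,
};
/* Sanity check: 74,250,000 Hz / 2640 pixels per line = 28,125 lines/s;
 * 28,125 / (1125 / 2) lines per field = 50 fields per second. */
```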

This is something which has at least been taken into account in the design of linux-dmabuf, as you mention below.

> But even if I did find out, there is currently no mechanism on the Wayland side to communicate to clients that we're using an output with interlacing and they wouldn't be able to take advantage of it. The only thing is the "interlaced" flag in linux-dmabuf, but that's the client telling the compositor they have interlaced content, not the compositor requesting it.

This is definitely a thing for planes/overlays; it only makes sense when you can ensure your horizontal timings align. I still think it might be worth it if you're using display hardware with a quality de-interlacer (like a modern higher-end TV) or capable of interlaced output (like an arcade monitor) for showing old interlaced media, or the output of, say, an Amiga emulator in hi-res modes, or a Sega Naomi emulator.

> It's certainly not impossible for it to be added upstream, but I imagine you'll just run into a lot of apathy.

** I believe some of the historically poor support for interlacing stems partially from misunderstandings people have when it comes to interlaced content, and generally from the aversion to video with high temporal resolution (aka the soap opera effect). There has to be some reason why most content today is 23.976fps, and it's certainly not because most people perceive it as smooth motion. Wikipedia never gets this right. There's also an association between interlacing and the visible comb artifacts present when displaying interlaced content on a progressive display, like this. In practice, the only major visible artifact is from single-line horizontal elements, which flicker at the frame rate (half the field rate); that isn't normally an issue with non-computer-generated UI elements.

Being able to display 50/60Hz interlaced content on a broadcast-standard CRT from a PC was definitely worthwhile pre-HDTV for HTPC use. Today, IMHO, content should be produced at 60Hz progressive, but then I guess I'm just unfashionable! :-)

ovlx commented 3 years ago

Not sure if this is helpful, but interlaced modes can be passed via custom modes; WayfireWM supports this and it does work: https://github.com/WayfireWM/wayfire/pull/899/commits. The bigger problem lies with the lack of kernel driver support: only radeon and amdgpu are capable AFAIK (maybe Intel i915 too?), and with amdgpu only when DC is disabled, on Polaris or older cards. It's also kinda buggy (the issue is on the kernel side).

However, getting interlaced content to display properly is a whole other ordeal that I have no idea about. This is one of those things that should probably die with Xorg.

sjnewbury commented 3 years ago

i915 definitely works fine, or at least it did the last time I checked, when I was using it on a MythTV frontend connected to a TV which only supported FHD as 1080i.

If I feel motivated, I might start a project to get all this working, including field sync for interlaced content. Even if patches aren't accepted upstream, I think it might be useful to some, in particular for retro (emulation) presentation.