MiSTer-devel / Main_MiSTer

Main MiSTer binary and Wiki
GNU General Public License v3.0

Sending video meta data through direct video #808

Open thehughhefner opened 1 year ago

thehughhefner commented 1 year ago

Hi developers, I'm reaching out to request the ability to send metadata information through direct video so that external hardware such as scalers can use that information. A use case is the RetroTINK 4K, which could use metadata sent with the direct video signal to know exactly how to crop and scale the game. Thanks

sorgelig commented 1 year ago

Direct video mode is a very non-standard use of HDMI. Not sure if such data is possible. Adding metadata will make the framework more complex, which is not a good idea, but I'm open to looking at a possible code addition if someone would like to offer one. Direct video was mainly added to get the original analog video through a simple and preferably dumb converter like the AG6200.

thehughhefner commented 1 year ago

Thank you for the explanation. I'll leave this ticket open so that other devs can chime in. As for the RetroTINK, I believe Mike Chi can elaborate further :)

birdybro commented 1 year ago

I already talked to Mike Chi about this a week ago as well. It shouldn't be a major problem, as the community is allowed to develop presets for the RetroTINK 4K and share them with each other. The solution is to use these presets when they become available. It should only impact a handful of cores, like the SNES core.

The overlap of people who can afford a RetroTINK 4K and want to use the MiSTer with it will be a very small number of people anyway.

orbiten commented 9 months ago

The problem is that users will then have to manually switch profiles whenever they switch cores on MiSTer. Adding metadata would make it work automatically.

I for one would be very happy if this was added!

Toryalai1 commented 9 months ago

Yeah, agreed.

cobhc2019 commented 9 months ago

Another vote here for adding metadata.

unabletoconnect2429 commented 9 months ago

This would be a great feature. I'll be connecting my MiSTer to my RetroTINK 4K (once it arrives).

atrac17 commented 9 months ago

Should totally do it. 100% agree. Wonder why it was never done when Mike Chi asked over 6 months ago and then decided to support the next platform of FPGA gaming (MARS).

Guspaz commented 9 months ago

We're seeing a pretty constant flow of RT4K owners complaining about bad results with MiSTer direct video due to the missing metadata; this would make things go a lot smoother for MiSTer users who want to do further processing of the digital signal. This problem will only get worse once the Morph 4K and OSSC Pro both hit general availability.

blzla commented 9 months ago

There is an overlap between RetroTINK 4K users and MiSTer users, as can be seen by the feedback on Discord and social media. Both are solutions for people who want a great experience from classic games and are willing to spend a bit extra for it. External scalers like the RT4K or OSSC Pro can improve MiSTer's video output by providing better deinterlacing and higher resolution output. Metadata would make the experience a lot smoother.

birdybro commented 9 months ago

Should totally do it. 100% agree. Wonder why it was never done when Mike Chi asked over 6 months ago (lolsnip).

For the record, Mike Chi didn't ask any MiSTer developers that I'm aware of; Toya asked on his behalf. I reached out to Mike Chi directly, and (after some back and forth to understand what he needed) he said, "I just want to be able to say I brought it up", so it didn't seem like a very high priority. He told me that community-made profiles could be shared and that this could address it. This was 4 months ago, not 6 months ago.

Despite the rude thumbs-downs from people randomly necro'ing this issue months later, I did bring this up in an internal development channel on Discord; I listened to Mike Chi and passed on the message in a proactive way after trying to gather information and help.

Also, I let Mike Chi know that he should contact sorgelig on Facebook directly and then submit a PR to get it added, but he wasn't interested.

mjj03301977 commented 9 months ago

Part of the issue, I have recently learned, is that the direct video mode for HDMI is not sending a “raw” untouched signal. Apparently pixel repetition and super resolution are being applied. Sounds like we need another option for external scalers to receive truly raw data via digital HDMI.

birdybro commented 9 months ago

Part of the issue, I have recently learned, is that the direct video mode for HDMI is not sending a “raw” untouched signal. Apparently pixel repetition and super resolution are being applied. Sounds like we need another option for external scalers to receive truly raw data via digital HDMI.

This was by design so it can work with cheap AG620x DACs, read above. Direct video mode wasn't designed to work with a $750 scaler that was released 4 years later.

mjj03301977 commented 9 months ago

Fair enough, so this is a feature request. And I would think it would be helpful for all scalers…not just RT4K.

birdybro commented 9 months ago

What other scalers would this help out with that you know of?

Guspaz commented 9 months ago

The OSSC Pro and PixelFX Morph 4K immediately come to mind, but any scaler with HDMI input, past/present/future, could potentially benefit.

The reason this is coming up now (the "necro") is that the three scalers mentioned are beginning to ship to actual customers, who are turning around and complaining to the scaler developers when they get unexpected results with MiSTer direct video mode.

saboten99 commented 9 months ago

Part of the issue, I have recently learned, is that the direct video mode for HDMI is not sending a “raw” untouched signal. Apparently pixel repetition and super resolution are being applied. Sounds like we need another option for external scalers to receive truly raw data via digital HDMI.

This was by design so it can work with cheap AG620x DACs, read above. Direct video mode wasn't designed to work with a $750 scaler that was released 4 years later.

Why are you guys still obsessed with AG6200 junk? These DACs are atrocious.

mikechi2 commented 9 months ago

So far the direct video works great. There are really only two issues:

  1. Super resolution/pixel-repeat to overcome minimum HDMI clock requirements
  2. Extra blanking/padding to massage the AG6200...?

The only metadata that's needed is the de-repetition factor and how to adjust the crops from what is presented by the HDMI DE signal. Probably less than 10 bytes of data. Maybe packing the data into an HDMI infoframe would work. The actual HDMI video itself does not need to be changed at all.

If one wanted to get extra fancy, the crops could even contain data on how to overcome the console's overscan/padding (for example cut out the unused space in Master system), but this might be more work than is desirable.

This would allow the scaler to have a 1:1 replica of the console's frame buffer and be a much simpler, automated user experience.
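
For illustration, a minimal sketch of what such a payload could look like, assuming a hypothetical field layout (names and widths are illustrative only, not an agreed spec):

```cpp
#include <cstdint>

// Hypothetical direct-video metadata payload -- well under 10 bytes.
// An actual layout would need to be agreed between MiSTer and the
// scaler developers; these fields just mirror the description above.
struct DirectVideoMeta
{
    uint8_t pixel_repetition; // de-repetition factor (1 = none; 2, 3, 5, ...)
    uint8_t crop_left;        // crop adjustments relative to the HDMI DE window
    uint8_t crop_right;
    uint8_t crop_top;
    uint8_t crop_bottom;
};

static_assert(sizeof(DirectVideoMeta) <= 10, "payload should stay under 10 bytes");
```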

wickerwaka commented 9 months ago

It would just be a few bytes of data, but calculating it and implementing that for every core is an unknown amount of work. Cores don't typically provide any metadata; they just generate a video signal, so it would need to be inferred in the sys framework, passed back to main, and then placed into an infoframe.

I implemented core name metadata and gave a version to various scaler developers to try out. It's not ideal but it would allow for some sane presets.
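
For illustration, one way such a core-name packet could be carried is the HDMI SPD (Source Product Description) InfoFrame, which per CEA-861 holds an 8-byte vendor name and a 16-byte product description. A sketch of packing one (a hypothetical helper, not the actual patch; the ADV7513 register plumbing is omitted):

```cpp
#include <array>
#include <cstdint>
#include <cstring>

// Sketch: pack a core name into an SPD InfoFrame payload. Per CEA-861
// the SPD InfoFrame (type 0x83, version 1, length 25) carries an 8-byte
// vendor name, a 16-byte product description, and one source-device byte.
std::array<uint8_t, 26> make_spd_payload(const char* core_name)
{
    std::array<uint8_t, 26> pkt{}; // [0]=checksum, [1..8]=vendor, [9..24]=description, [25]=source info
    std::memcpy(&pkt[1], "MiSTer", 6);                             // vendor name, zero-padded
    std::strncpy(reinterpret_cast<char*>(&pkt[9]), core_name, 16); // core name, truncated to 16 chars
    pkt[25] = 0x00;                                                // source device info: unknown
    // Header bytes plus checksum plus payload must sum to 0 mod 256.
    uint8_t sum = 0x83 + 0x01 + 25;
    for (std::size_t i = 1; i < pkt.size(); i++) sum += pkt[i];
    pkt[0] = static_cast<uint8_t>(-sum);
    return pkt;
}
```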

It would be possible to remove the extra blanks when the output mode is RGB, since they are there to better support YPbPr. However, when I did that, the first line was flickering in some resolutions. It's hard to know whether that is an issue with the MiSTer code, the DE-10's HDMI IC, or the HDMI receiver chip. Without some kind of HDMI analysis device it's just guesswork.

sorgelig commented 9 months ago

The only viable option is output with real blanking signals. Scalers should extract the active portion of the video data, as all other HDMI displays do, and do whatever they want with it. The HDMI output protocol is implemented by the ADV7513 chip, which is in charge of injecting all required metadata. Implementing custom HDMI metadata means implementing it in "software", which will greatly increase resource usage and impact some heavy cores like PSX, N64, Saturn. Also, implementing additional muxers and calculations in the video clock path will cause instability in some cores where timings are already too tight.

atrac17 commented 9 months ago

So far the direct video works great. There are really only two issues:

  1. Super resolution/pixel-repeat to overcome minimum HDMI clock requirements
  2. Extra blanking/padding to massage the AG6200...?

The only metadata that's needed is the de-repetition factor and how to adjust the crops from what is presented by the HDMI DE signal. Probably less than 10 bytes of data. Maybe packing the data into an HDMI infoframe would work. The actual HDMI video itself does not need to be changed at all.

If one wanted to get extra fancy, the crops could even contain data on how to overcome the console's overscan/padding (for example cut out the unused space in Master system), but this might be more work than is desirable.

This would allow the scaler to have a 1:1 replica of the console's frame buffer and be a much simpler, automated user experience.

I ASKED MIKE CHI TO LEAVE A COMMENT. IT WAS ME, I TALKED TO MIKE CHI.

mjj03301977 commented 9 months ago

Part of the issue, I have recently learned, is that the direct video mode for HDMI is not sending a “raw” untouched signal. Apparently pixel repetition and super resolution are being applied. Sounds like we need another option for external scalers to receive truly raw data via digital HDMI.

This was by design so it can work with cheap AG620x DACs, read above. Direct video mode wasn't designed to work with a $750 scaler that was released 4 years later.

Why are you guys still obsessed with AG6200 junk? These DACs are atrocious.

We are talking about raw digital video out of the HDMI port, so no DAC should be involved as far as I am aware.

sorgelig commented 9 months ago

Not sure about de-repetition. Need to investigate whether it would increase code complexity too much. Also I'm not sure if the ADV7513 allows outputting a lower clock. If it does, then probably without audio.

Anyway, I don't have the equipment to test such video output. The AG6200 won't work with a low pixel clock.

Why are you guys still obsessed with AG6200 junk? These DACs are atrocious.

To claim that, you have to say which competitors you are comparing it against. So far the AG6200 is the only chip supporting any resolution and providing great analog output with 256 levels per channel.

saboten99 commented 9 months ago

Not sure about de-repetition. Need to investigate whether it would increase code complexity too much. Also I'm not sure if the ADV7513 allows outputting a lower clock. If it does, then probably without audio.

Anyway, I don't have the equipment to test such video output. The AG6200 won't work with a low pixel clock.

Why are you guys still obsessed with AG6200 junk? These DACs are atrocious.

To claim that, you have to say which competitors you are comparing it against. So far the AG6200 is the only chip supporting any resolution and providing great analog output with 256 levels per channel.

https://docs.google.com/spreadsheets/d/1nbepvFFBVsLrs1myOiVWqMVLp9-oB9TataRmVlcyqlA/

sorgelig commented 9 months ago

https://docs.google.com/spreadsheets/d/1nbepvFFBVsLrs1myOiVWqMVLp9-oB9TataRmVlcyqlA/

And? Where is the test across all cores? The SNES produces more or less ideal video, so it's not a good test bench. Many cores have non-standard resolutions and refresh rates where many chips will fail. Also, prices should be shown. A device providing a subtle difference but costing 20 times more won't be a good competitor.

saboten99 commented 9 months ago

https://docs.google.com/spreadsheets/d/1nbepvFFBVsLrs1myOiVWqMVLp9-oB9TataRmVlcyqlA/

And? Where is the test across all cores? The SNES produces more or less ideal video, so it's not a good test bench. Many cores have non-standard resolutions and refresh rates where many chips will fail. Also, prices should be shown. A device providing a subtle difference but costing 20 times more won't be a good competitor.

The Icybox DAC is less than $20.

How can something be a competitor when it crushes blacks and has voltages all over the place?

orbiten commented 9 months ago

Who is responsible for adding pixel repetition and extra blanks and such... is that done by the cores themselves, or by the MiSTer framework that cores are built on top of?

Toryalai1 commented 9 months ago

Who is responsible for adding pixel repetition and extra blanks and such... is that done by the cores themselves, or by the MiSTer framework that cores are built on top of?

It’s part of the framework

orbiten commented 9 months ago

Who is responsible for adding pixel repetition and extra blanks and such... is that done by the cores themselves, or by the MiSTer framework that cores are built on top of?

It’s part of the framework

In that case the framework already knows about the original resolution... so no core changes would have to be made in order to add a new "really direct video (for real and unmodified)" output option that just outputs the raw pixels?

As long as the ADV7513 supports encoding these out-of-HDMI-spec resolutions?

birdybro commented 9 months ago

The OSSC Pro and PixelFX Morph 4K immediately come to mind, but any scaler with HDMI input, past/present/future, could potentially benefit.

The reason this is coming up now (the "necro") is that the three scalers mentioned are beginning to ship to actual customers, who are turning around and complaining to the scaler developers when they get unexpected results with MiSTer direct video mode.

Is it currently confirmed that the Morph4k and the OSSC Pro will require this metadata to make this work, and that the developers of those two scalers will use it if provided? Sorry for the probing questions, but it's important to establish how strong the case is for adding something; I assure you that I'm asking in good faith here.

Why are you guys still obsessed with AG6200 junk? These DACs are atrocious.

Dirt cheap, readily available, known to work, etc. There is no obsession; it's just what was easily available for all users to purchase online if they wanted VGA output without an analog IO board when this was designed.

https://docs.google.com/spreadsheets/d/1nbepvFFBVsLrs1myOiVWqMVLp9-oB9TataRmVlcyqlA/

Regarding the competing DACs from the Google spreadsheet that has been making the rounds: the only ones that are as readily available and in the same price range already work fine with the current method, without modification to the framework (i.e. the ones that use the LT8621SX, CS5210, IT6902, IT6892FN, and IT6604E), so I don't see the point in complaining about the AG620x in relation to this feature request. There are plenty more you can buy that have better levels already, if you want. This is wholly irrelevant to the point of this feature request, though.

I agree with sorgelig's skepticism about the testing methodology; testing merely the SNES core is a bit strange. The reference levels on the second tab cited are from a real SNES using component and not composite, I assume? Shouldn't it be a test of composite instead, as that is how a real SNES operated on real CRTs contemporaneously? Or shouldn't the reference have been a system that output RGB natively, if you are going to compare against RGB output? Or rather, is it a valid assumption that one SNES modded for YPbPr should match the RGB output of a MiSTer core? Please forgive my ignorance on this; I'm not trying to discredit it, but that part seemed odd to me personally.

Regarding the DAC testing, did they have hdmi_limited=0 or hdmi_limited=2 set in their MiSTer.ini? You can see in the component voltage tests that the reference range is 16-255, so I hope hdmi_limited=2 was used for these tests of the AG620x, since that option was added specifically to address this. If hdmi_limited=0 was set, that would explain the crushed blacks on the AG620x, and in that case it is simply user error. The testing should be updated with the proper intended settings if so.
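
For anyone re-running the test, the relevant MiSTer.ini excerpt would be (value meanings as documented for this option):

```ini
; MiSTer.ini (excerpt) -- HDMI output range
; hdmi_limited=0 : full range (0-255)
; hdmi_limited=1 : limited range (16-235)
; hdmi_limited=2 : limited range (16-255), added for AG620x-style HDMI DACs
hdmi_limited=2
```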

Toryalai1 commented 9 months ago

I've seen some pics where a user has connected their MiSTer (via direct video) to the RetroTINK 4K, and yeah, there are some issues. I hope this can be addressed somehow.

wickerwaka commented 9 months ago

In that case the framework already knows about the original resolution... so no core changes would have to be made in order to add a new "really direct video (for real and unmodified)" output option that just outputs the raw pixels?

As long as the ADV7513 supports encoding these out-of-HDMI-spec resolutions?

I don't think there is any point in trying to remove the pixel repetition. It is part of the HDMI spec, and it exists primarily for this exact reason (encoding legacy video resolutions). In direct video mode the HDMI clock is tied to the core's video clock, which is usually at least 4x the pixel clock; changing this to a lower clock speed would be a significant change even if it worked, which it almost certainly wouldn't. So pixel repetition is likely just a fact of life with HDMI. The ideal thing for MiSTer to do would be to communicate the pixel repetition factor, which is something the HDMI spec supports but is not something that is currently being done.
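
To make the clock constraint concrete, here is a small sketch; the 25 MHz floor is HDMI's commonly cited minimum TMDS clock (inherited from DVI), and the example dot clock is only illustrative:

```cpp
#include <cmath>
#include <cstdio>

// Smallest integer repetition factor that lifts a legacy dot clock
// above HDMI's minimum TMDS clock (~25 MHz, treated as an assumption).
unsigned min_repetition(double pixel_clock_mhz, double hdmi_min_mhz = 25.0)
{
    return static_cast<unsigned>(std::ceil(hdmi_min_mhz / pixel_clock_mhz));
}

int main()
{
    // A ~5.37 MHz console dot clock needs 5x repetition (~26.9 MHz),
    // which is why the factor need not be a power of two.
    std::printf("repetition: %ux\n", min_repetition(5.37));
    return 0;
}
```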

As far as the blanks/framing is concerned: like I said, it has been tried and it had problems. It's going to take some further collaboration to work out what the problem is and how to fix it.

lexrj commented 9 months ago

It would just be a few bytes of data, but calculating it and implementing that for every core is an unknown amount of work. Cores don't typically provide any meta data, they just generate a video signal and so it would need to be inferred in the sys framework, passed back to main and then placed into an infoframe.

But if I understand correctly, the cores themselves are already outputting the original information, and it's the framework adding the pixel repetition and such that the scaler wants to remove, correct? The cores themselves are already outputting the "raw" video, so would there need to be per-core work?

There's even a specification for how to encode the pixel repetition mode and factor in the AVI InfoFrame for the ADV7513: https://www.analog.com/media/en/technical-documentation/user-guides/ADV7513_Programming_Guide.pdf (section 4.3.4)
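
For reference, the pixel-repetition field in the AVI InfoFrame itself (CEA-861 data byte 5, bits [3:0]) encodes 0 for "pixel sent once" up to 9 for "pixel sent ten times". A tiny sketch of that mapping (the ADV7513 register writes themselves are omitted here):

```cpp
#include <cstdint>
#include <optional>

// Map a direct-video repetition factor onto the AVI InfoFrame PR field
// (CEA-861 data byte 5, bits [3:0]): 0 = sent once, ..., 9 = sent ten
// times. Factors above 10 cannot be signalled through this field.
std::optional<uint8_t> avi_pr_field(unsigned repetition_factor)
{
    if (repetition_factor < 1 || repetition_factor > 10)
        return std::nullopt; // out of range for the AVI InfoFrame
    return static_cast<uint8_t>(repetition_factor - 1);
}
```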

birdybro commented 9 months ago

There's even a specification for how to encode the pixel repetition mode and factor in the AVI InfoFrame for the ADV7513: https://www.analog.com/media/en/technical-documentation/user-guides/ADV7513_Programming_Guide.pdf (section 4.3.4)

You mean this?

https://github.com/MiSTer-devel/Main_MiSTer/blob/4483501011d2a94e20b4e00023757963b84a5c7f/video.cpp#L1406

[screenshot of the linked video.cpp code]

If you mean to update the VIC register bits at 0x3d:

[screenshots of the ADV7513 VIC register option tables]

These are the only options, which don't cover all of the resolutions that cores will output, if I'm understanding your proposal right.

sorgelig commented 9 months ago

Direct video pixel repetition doesn't follow the HDMI options from the table above. The core simply generates a video clock which fits HDMI specs. The repetition factor isn't necessarily a power of 2; it can be 3, 5, 7, whatever. Direct Video was made for ANALOG output, so it's simply that, nothing more.

My POV: the scaler gets a video clock and video data. By adding a simple counter with a data compare, it's easy to find the smallest number of cycles between data changes. That will be the pixel repetition number. Is it hard to implement in a $750 scaler?

Those who want to add an option to send metadata may try to do it. This is how open source works. BUT: it doesn't mean such code will be immediately accepted, since it needs to fit some requirements, such as: 1) as little code as possible; 2) it doesn't produce tight timings (some cores output a video clock up to ~60 MHz); 3) additional considerations may arise.
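
A behavioral sketch of that counter idea in plain C++, operating on one captured active scanline (a real implementation would sit in the scaler's FPGA fabric; the vector here just stands in for a line buffer):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Estimate the pixel repetition factor of one active scanline: the
// shortest run of identical consecutive samples is the factor.
// Assumes at least one pair of adjacent source pixels differs.
std::size_t estimate_repetition(const std::vector<uint32_t>& line)
{
    std::size_t best = line.size(), run = 1;
    for (std::size_t i = 1; i < line.size(); i++) {
        if (line[i] == line[i - 1]) {
            run++;
        } else {
            if (run < best) best = run;
            run = 1;
        }
    }
    if (run < best) best = run; // close out the final run
    return best;                // 1 = no repetition, N = each pixel sent N times
}
```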

mikechi2 commented 9 months ago

Direct video pixel repetition doesn't follow the HDMI options from the table above. The core simply generates a video clock which fits HDMI specs. The repetition factor isn't necessarily a power of 2; it can be 3, 5, 7, whatever. Direct Video was made for ANALOG output, so it's simply that, nothing more.

My POV: the scaler gets a video clock and video data. By adding a simple counter with a data compare, it's easy to find the smallest number of cycles between data changes. That will be the pixel repetition number. Is it hard to implement in a $750 scaler?

Those who want to add an option to send metadata may try to do it. This is how open source works. BUT: it doesn't mean such code will be immediately accepted, since it needs to fit some requirements, such as: 1) as little code as possible; 2) it doesn't produce tight timings (some cores output a video clock up to ~60 MHz); 3) additional considerations may arise.

We already do "pixel counting" for noisy analog sources, and in theory this works for digital sources too. However, getting the ground truth would be cleaner, as it would allow the scaler to react instantly versus needing to buffer some data first and then calculate.

I am curious to what extent the "framework" is aware of the pixel repetition versus the original console output. If the "framework" has both pieces of info, then the core does not need to change and this becomes a minor addition on the ARM side, which should not add timing complexity, consume extra resources, etc.

Note: the larger issue, imo, is still the extra-large blanking periods, which are not so easily reverse-engineered by a sink.

sorgelig commented 9 months ago

The framework (FPGA side) gets both the pixel enable and the video clock, from which the pixel repetition can be calculated. Blanking, as said earlier, can be original. It can probably be switched automatically to original blanking when RGB mode is used, but it needs to be checked and tested against side effects.

mikechi2 commented 9 months ago

If a dev who understands the complexities wants to give this a serious shot and look at the PR and blanking, I'm more than happy to send them an RT4K (and I think this feature would eventually go beyond just the RT4K as well).

marceloMedeiros commented 9 months ago

If a dev who understands the complexities wants to give this a serious shot and look at the PR and blanking, I'm more than happy to send them an RT4K (and I think this feature would eventually go beyond just the RT4K as well).

Aren't you an FPGA developer as well? I'm sure a PR from you implementing this would be welcomed.

memmam commented 9 months ago

Um, just to clarify because I think there might be some confusion, Mike is using 'PR' to mean 'pixel repetition', not 'pull request'.

saboten99 commented 9 months ago

Dirt cheap, readily available, known to work, etc. There is no obsession; it's just what was easily available for all users to purchase online if they wanted VGA output without an analog IO board when this was designed.

https://docs.google.com/spreadsheets/d/1nbepvFFBVsLrs1myOiVWqMVLp9-oB9TataRmVlcyqlA/

Regarding the competing DACs from the Google spreadsheet that has been making the rounds: the only ones that are as readily available and in the same price range already work fine with the current method, without modification to the framework (i.e. the ones that use the LT8621SX, CS5210, IT6902, IT6892FN, and IT6604E), so I don't see the point in complaining about the AG620x in relation to this feature request. There are plenty more you can buy that have better levels already, if you want. This is wholly irrelevant to the point of this feature request, though.

I agree with sorgelig's skepticism about the testing methodology; testing merely the SNES core is a bit strange. The reference levels on the second tab cited are from a real SNES using component and not composite, I assume? Shouldn't it be a test of composite instead, as that is how a real SNES operated on real CRTs contemporaneously? Or shouldn't the reference have been a system that output RGB natively, if you are going to compare against RGB output? Or rather, is it a valid assumption that one SNES modded for YPbPr should match the RGB output of a MiSTer core? Please forgive my ignorance on this; I'm not trying to discredit it, but that part seemed odd to me personally.

Regarding the DAC testing, did they have hdmi_limited=0 or hdmi_limited=2 set in their MiSTer.ini? You can see in the component voltage tests that the reference range is 16-255, so I hope hdmi_limited=2 was used for these tests of the AG620x, since that option was added specifically to address this. If hdmi_limited=0 was set, that would explain the crushed blacks on the AG620x, and in that case it is simply user error. The testing should be updated with the proper intended settings if so.

The AG6200 isn't appropriate for limited range either; he's already tested that. It performs to its price point; it's just not very good.

The SNES is just a reference point. The main issue is that component video is 7.14 mV per IRE and RGB is 7 mV per IRE. He's testing whether the DAC is capable of hitting these reference points in steps of 10 IRE. The AG6200 crushes blacks under 30 IRE so badly that it affects your ability to play games with darker scenes and removes a substantial amount of detail from the picture.
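
To put numbers on that, here are the expected reference voltages at each 10 IRE step, using the slopes quoted above (a quick sanity-check snippet, not part of his test procedure):

```cpp
#include <cstdio>

// Expected DAC output per 10 IRE step, using the slopes quoted above:
// component ~7.14 mV/IRE, RGB ~7.0 mV/IRE. At 100 IRE that is ~714 mV
// vs ~700 mV; crushing everything below 30 IRE wipes out the darkest
// steps entirely.
int main()
{
    for (int ire = 0; ire <= 100; ire += 10)
        std::printf("%3d IRE: component %6.1f mV, RGB %5.1f mV\n",
                    ire, ire * 7.14, ire * 7.0);
    return 0;
}
```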

Like you said, the conversation is somewhat off topic. I just found it interesting that AG6200 compatibility is being cited as one of the reasons we're not fully supporting this scaler, especially given how poorly it performs.

sorgelig commented 9 months ago

I just found it interesting that AG6200 compatibility is being cited as one of the reasons we’re not fully supporting this scaler

When Direct Video was implemented, these 4K scalers didn't exist. You need to learn to read without biased imaginations.

birdybro commented 9 months ago

I just found it interesting that AG6200 compatibility is being cited as one of the reasons we’re not fully supporting this scaler.

That was not what was said at all. You are misunderstanding the conversation entirely.

Guspaz commented 9 months ago

It's not really relevant either way. This issue is about trying to improve the compatibility of the MiSTer with HDMI scalers. The current direct HDMI mode isn't ideal for that, and there's various ways that it could be improved, such as additional metadata or a separate mode.

birdybro commented 9 months ago

It's not really relevant either way. This issue is about trying to improve the compatibility of the MiSTer with HDMI scalers. The current direct HDMI mode isn't ideal for that, and there's various ways that it could be improved, such as additional metadata or a separate mode.

You probably missed my question earlier.

Has it been confirmed that the OSSC Pro and the Morph4k would benefit from this as well?

Toryalai1 commented 9 months ago

In any case, Mike Chi has mentioned that he would offer a dev kit to any dev who is willing to tackle this issue. I would personally vote for Sorg or Wickerwaka. Also, since people with a MiSTer have started to get their hands on the RetroTINK 4K, they could also assist in testing. Let's make it a community effort! I'm positive we can come to a solution.

Guspaz commented 9 months ago

Has it been confirmed that the OSSC Pro and the Morph4k would benefit from this as well?

Feel free to ask their developers. Could they benefit from it? Definitely. Would they? That's up to them; I can't speak for them. At least one scaler developer has committed to supporting it, and I think it's highly likely that at least one of the others would follow suit. Even if it's just the RT4K, two thousand of them sold out in a few minutes on launch day, so there's going to be a decent install base, and there's a high overlap between RT4K and MiSTer owners. The same sort of people who are willing to spend a few hundred dollars on a scaler are the people willing to spend a few hundred dollars on a MiSTer.

There's also now a bounty for this feature in the form of a free RT4K to whoever is willing to implement this.

mjj03301977 commented 9 months ago

I am happy to help test once I receive my RT4K (it hasn't even shipped yet, though).

saboten99 commented 9 months ago

I just found it interesting that AG6200 compatibility is being cited as one of the reasons we’re not fully supporting this scaler

When Direct Video was implemented, these 4K scalers didn't exist. You need to learn to read without biased imaginations.

Did you design the analog IO 6.1's output based on the AG6200's output, or something else? Based on some replies in this thread, I'm starting to think that board is an R-2R implementation of the AG620x series of DACs.

Guspaz commented 9 months ago

Did you design the analog IO 6.1's output based on the AG6200's output, or something else? Based on some replies in this thread, I'm starting to think that board is an R-2R implementation of the AG620x series of DACs.

Let's try to keep on topic, that's quite irrelevant for this issue.