thehughhefner opened this issue 1 year ago
If a dev who understands the complexities wants to give this a serious shot and look at the PR and blanking, I'm more than happy to send them an RT4K (and I think this feature would eventually go beyond just the RT4K as well).
Group chat initiated. Thanks Mike!
I know this is off topic, but I thought it would be good to clarify some questions here, as I seem to have been called out on some things...
I agree with sorgelig regarding the skepticism of the testing methodology; testing merely the SNES core is a bit strange. The reference levels on the second tab cited are from a real SNES using component and not composite, I assume? Shouldn't it be a test of composite instead, as that is how a real SNES operated on real CRTs contemporaneously? Or shouldn't the reference have been a system that output RGB natively, if you are going to compare against RGB output? Or rather, is it valid to assume that one SNES modded for YPbPr should match the RGB output of a MiSTer core? Please forgive my ignorance on this; I'm not trying to discredit it, but that part seemed odd to me personally.
The tests were done with RGB for two reasons. 1. It's 99% of what I use; every console I own is RGB modified as well, including my SNES, which I can reference too. 2. It was a very high percentage of what I saw other people using; maybe because of the places I hung out on Discord, but either way, a large majority I saw had PVMs/BVMs or modified consumer TVs (or they were in Europe and had SCART already).
As for picking the SNES core: for one, it's not perfect, it has a gamma S-curve to it, but I call that out and acknowledge it, and I look at how the DACs follow it. Knowing the issue means you can expect a good DAC to follow that S-curve as well, essentially following the good and the bad. Second, of all the cores, it has the best 240p Test Suite application, with 0-100 IRE tests in 10 IRE increments; no other console's 240p test suite did that in my initial testing of MiSTer-supported cores. Third, it's readily available, easy for others to test, and you can easily test the real console to verify as well. I wanted others to be able to do these tests and confirm my findings (which others have done).

So that's why I didn't test composite, and no, it doesn't matter; the goal was to verify RGB output. I will admit I haven't done extensive YPbPr testing, so I can't speak to how it should perform right now. It has always been my plan to do a thorough analysis of it, but I don't have the time at the moment. For RGB you can read about everything I did here: https://tinyurl.com/dacanalysis. I think it might answer a lot of your questions as well. At the end of all of this, the SNES, while not perfect, is a good core to use as a test subject, like you have said. On the subject of compatibility, over the course of dozens of people testing, the only core that ever really had issues was the CAVE core, but that has since been fixed. There could well be some outliers, but for the vast majority of cores, user reports have stated no other major issues.
Regarding the DAC testing, did they have hdmi_limited=0 or hdmi_limited=2 set in their MiSTer.ini? You can see that in the component voltage tests the reference range is 16-255, so I hope hdmi_limited=2 was used for these tests of the AG620x since that option was added specifically to address this. If hdmi_limited=0 was set then this would explain the crushed blacks on the AG620x, and if that were the case then that is just a case of user error. The testing should be updated using the proper intended settings if that is the case.
hdmi_limited was always set to 0. A software hack to make a DAC act like a good DAC is just that, a hack. While it may work, that DAC is still flawed and may contribute to other issues or errors as you expand or collapse the range. Not to mention, if you try to use it with something else, like a PS2, it will still be flawed. People shouldn't waste their money on a flawed DAC, IMO, when better ones are available, and asking them to pay $10-15 more for a better DAC is really not an issue, I feel, when people spend $100 on an Analog I/O board that is bad as well if they are using the reference 6.1 board. But to your point, I have told people if they want to do that, more power to them; just know your DAC still isn't working according to reference specs. I am not here to tell people how to enjoy their games ;)
On the point of component video being 7.14mV per IRE: please refer to my document, where I call out how most PVM service manuals and technical calibration guides refer to YPbPr as being 7mV per IRE as well, just like RGB. I do admit this is a bit of a debatable topic; however, going by the reference manuals for my PVMs from Sony, they specifically call for a YPbPr voltage of 700mV at 100 IRE. And yes, the AG6200 does start crushing blacks below 50 IRE, to the point that they are essentially black by 10 IRE.
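For concreteness on the hdmi_limited=2 option discussed above: it compresses the full 0-255 output range into 16-255 before the signal leaves the HDMI transmitter, so a DAC that clips everything below code 16 still receives every level. A minimal sketch of that kind of linear compression (the function name and rounding are my own illustration of the idea, assuming a simple rescale; this is not the actual Main_MiSTer code):

```cpp
#include <cstdint>

// Compress full-range video (0-255) into 16-255 so a DAC that crushes
// everything below code 16 still sees all levels. Rounded to nearest.
static inline uint8_t compress_full_to_16_255(uint8_t v)
{
    return static_cast<uint8_t>(16 + (v * 239 + 127) / 255);  // 0 -> 16, 255 -> 255
}
```

The trade-off called out above still applies: 256 input levels get squeezed into 240 output codes, so some adjacent levels collapse into one.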
We'll be submitting a PR with work from @mikechi2, @va7deo, and myself regarding metadata to be passed over direct video. A RetroTink4K is en route to @va7deo. We'll work on this and probably have some progress over the holiday, in conjunction with our planned holiday MiSTer release.
Thanks a lot for helping move this positively!
> hdmi_limited was always set to 0.
Thanks for confirming this! I appreciate the in-depth analysis in your Google Doc. I hope you one day provide measurement levels for one of the AG620x DACs with hdmi_limited=2 enabled by comparison, as that is why the option was added and is the advice we frequently give people in the Discord. Would be interesting to see. :)
Thank you!
I want to describe approximately what I expect:
1) In RGB mode direct video will output the exact blanks from the original video, so there is no need to send such info.
2) Re-use some infoframe where MiSTer can send the pixel repetition parameter. We need to find some meaningless infoframe that can be re-used, where about 8 bits can be sent. 8 bits should be enough to encode the repetition.
What I DON'T want to see is an HDMI stack implemented in HDL. This is definitely the wrong way! As I've mentioned above, it should be as tiny as possible.
I'm going to submit changes to Template for (1) and partially for (2).
There will be an issue with the OSD. Currently the OSD for VGA/DirectVideo doesn't use the pixel clock but the video clock. So even if a core has a low horizontal resolution, the OSD may have a higher resolution, meaning the OSD extended beyond the core's original resolution. Since the scaler needs consistent pixel repetition, the OSD must follow the core's pixel clock, so some cores like SNES will have a very wide OSD. The OSD needs 256x128 resolution, so on the SNES core it will occupy the whole width. There is another potential issue for cores where the pixel clock can vary within a single frame. I'm going to make a special direct_video=2 mode where all these problems will be present but which won't affect the original direct_video=1 mode.
Thank you Sorgelig! I am curious: once this is implemented, would you please clarify which video-related settings in MiSTer.ini will still work? I am assuming the majority of them will no longer be relevant, since the internal scaler will be bypassed.
Thank you so much!!! Please let me know where the pixel repeat data is found and I'll test it right away. Suggestion: if a spare infoframe needs to be hijacked, perhaps the ACP (Audio Content Protection) infoframe, as no one cares about it...
I'm using this field (screenshot of the VIC field in the AVI InfoFrame):
Great, so you're hijacking the VIC code, which is meaningless in direct video anyway.
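For anyone following along: the VIC sits in data byte 4 (PB4) of the CEA-861 AVI InfoFrame, so carrying the repetition factor there only requires rewriting that byte and fixing the packet checksum. A rough packet-level sketch of the idea (the function name is mine, and the real code programs the ADV7513's infoframe registers rather than building raw packets):

```cpp
#include <array>
#include <cstdint>
#include <numeric>

// Build a CEA-861 AVI InfoFrame (type 0x82, version 2, 13 data bytes) with a
// custom value stuffed into the VIC byte (PB4). Sketch only.
std::array<uint8_t, 17> build_avi_infoframe(uint8_t custom_vic)
{
    std::array<uint8_t, 17> f{};   // 3 header bytes + checksum + PB1..PB13
    f[0] = 0x82;                   // InfoFrame type: AVI
    f[1] = 0x02;                   // version
    f[2] = 0x0D;                   // payload length: 13
    f[7] = custom_vic & 0x7F;      // PB4: VIC, here misused as the repetition factor
    uint32_t sum = std::accumulate(f.begin(), f.end(), 0u);
    f[3] = static_cast<uint8_t>(256 - (sum & 0xFF));  // all bytes must sum to 0 mod 256
    return f;
}
```

Leaving the standard pixel repetition bits (PB5[3:0]) at zero sidesteps the automatic de-repetition some receivers perform, which is exactly the behavior questioned further down.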
Currently I'm modifying the OSD code so it will be able to show at least a partial OSD, instead of nothing, if the horizontal resolution is 256 pixels or less.
Test this: SNES.zip
It includes both the SNES core and Main (you can also build Main yourself from the current sources).
In MiSTer.ini you need to use direct_video=2.
I wonder if the ADV7513 provides any way of embedding a custom 4-bit PR value into the AVI infoframe without using spare packets? On the other hand, some HDMI receivers automatically perform de-repetition based on the PR field and do not behave well unless all H timings (especially H total) are multiples of it, which I don't think is the case with some cores if they accurately model the source HW. In that case, misusing the VIC field makes more sense than following the standards.
I included a manual pixel de-repetition option in the OSSC Pro, but obviously it makes users' lives easier if the factor can be included in the metadata.
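As a reference for what that option does conceptually: with a repetition factor N, every core pixel arrives N times in a row, so keeping one sample out of every N recovers the native width. A sketch of the idea (the function is my own illustration, not the actual OSSC Pro implementation):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Manual pixel de-repetition: keep the first sample of each run of `factor`
// repeated pixels to recover the core's native horizontal resolution.
std::vector<uint32_t> derepeat_line(const std::vector<uint32_t>& line, std::size_t factor)
{
    if (factor == 0) factor = 1;            // guard against a bogus factor
    std::vector<uint32_t> out;
    out.reserve(line.size() / factor);
    for (std::size_t i = 0; i < line.size(); i += factor)
        out.push_back(line[i]);
    return out;
}
```

This is also why the factor matters so much as metadata: without it, the scaler has to guess N from the pixel clock and active width.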
Thanks! I set direct_video=2 but I'm unable to get stable video output in the main menu (so I can't yet launch the core). My installation might be very out of date, so I'm updating everything now.
> some HDMI receivers automatically perform de-repetition based on the PR field and do not behave well unless all H timings (especially H total) are multiples of it
Yes, using the HDMI RX's automatic de-repetition breaks the PSX core.
The output from the main menu seems to be at 12.09 MHz PCLK with unstable timings, reporting VIC 0.
The Astro HDMI Protocol Analyzer doesn't seem to work at all with this signal.
Put this in your MiSTer.ini at the end:
[Menu]
direct_video=1
That will make an exception for the main menu core, so you can test the SNES core with the new option instead.
If you run the update script it might overwrite the Main MiSTer binary, so you would need to replace it again with the one provided in the zip to test.
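Putting both pieces together, the relevant part of MiSTer.ini would look something like this (assuming everything else is left at its default):

[MiSTer]
direct_video=2

[Menu]
direct_video=1

The global value applies to the SNES core under test, while the [Menu] section overrides it for the main menu only.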
Thanks @wizzomafizzo for showing me how to launch the SNES core from the command line. Looks like the SNES core is transmitting VIC=4 in direct_video=2.
The DE nicely switches between 224 and 239 lines as appropriate, so the video is perfectly cropped.
The only problem is that the very first active line appears to flicker (one frame normal, one frame all black). This was what @wickerwaka and I saw a few months ago when we tried something similar as well.
> If you run the update script it might overwrite the Main MiSTer binary, so you would need to replace it again with the one provided in the zip to test.
Cool, good to know.
There is no flicker here. I'm testing on an HDMI->VGA converter. Probably something is wrong in the HDMI capture chip or its settings. I even switched the VGA cable from the VGA output to the HDMI->VGA converter and back; I get identical video, including the top and bottom lines. Nothing flickers. I suggest getting such an HDMI->VGA converter so you can compare results. It may help to debug.
It would be interesting to hear from OSSC or other scaler devs how the new mode works.
Btw, with the OSD -> Audio & Video -> Force 256px option you may switch between 512 and 256 pixels; the VIC should dynamically switch between 4 and 8.
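Note that 512 x 4 = 256 x 8, so the transmitted active width stays constant and the hijacked VIC behaves exactly like a repetition factor. Under that assumption, a receiver can recover the native width with a one-liner (my sketch, not anyone's shipping code):

```cpp
#include <cstdint>

// If the hijacked VIC carries the horizontal repetition factor, the native
// width is just the transmitted active width divided by it.
static uint32_t native_width(uint32_t transmitted_width, uint8_t vic_as_repetition)
{
    return vic_as_repetition ? transmitted_width / vic_as_repetition
                             : transmitted_width;  // VIC 0: no repetition info
}
```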
I scoped the actual output of the HDMI RX chip. You can see the first red line of the 240pTS grid flicker on and off. Yellow = VSYNC, Blue = Red MSB.
https://github.com/MiSTer-devel/Main_MiSTer/assets/23427169/78943f03-8674-41bb-a70d-933a9e7058ac
So the issue is somewhere in the HDMI TX -> HDMI RX path rather than the scaler. I tested the exact timings on a PC with a custom modeline and did not have this issue.
I'll investigate further on my end as well. Seems like some sort of weird edge case.
> Btw, with the OSD -> Audio & Video -> Force 256px option you may switch between 512 and 256 pixels; the VIC should dynamically switch between 4 and 8.
This works great
Here's some more info:
Yellow = VSYNC, Blue = HDMI DE
It looks like the first line's data enable is missing every other frame. Maybe the VGA adapter ignores this glitch?
EDIT:
Also interestingly 480i output from the SNES core with this new mode is perfectly fine. Not sure what to make of that.
Some more test data:
I copied the exact timings into a Raspberry Pi, and the output was stable. The top line did not glitch.
Same result when I copied the exact timings into an RT5X. This is notable since the RT5X also uses the ADV7513 HDMI transmitter. The only difference is that I use the ADV7513's sync/data enable regeneration features (?)
> Also interestingly 480i output from the SNES core with this new mode is perfectly fine. Not sure what to make of that.
Maybe it's caused by the short scan line that the SNES sends every other frame to reduce composite shimmering? As far as I know, in NTSC mode it is only generated in progressive, not interlaced, modes. On a real console it would be the last line though, not the first, and I haven't looked at the core sources to check how it is implemented there.
The SNES core outputs the short line at line 240. That should be the correct one according to the FullSnes docs by Nocash.
Ok. I will make another core for testing. It's probably an SNES-specific glitch.
try this: MegaDrive.zip
Thank you!
Unfortunately this one also glitches, but I do not think the root cause is the same as the SNES core's. The data enable randomly shifts between 224 and 223 lines. This affects both 240p and 480i modes.
The VIC/decimation code does switch from 16 in 320-px mode to 20 in 256-px mode, which is very cool!
I've pushed my changes to the MegaDrive repo in case someone wants to tweak it. I think retro systems never strictly followed TV standards and had some quirks. I don't know if those glitches are related to the ADV7513 or not. The MiSTer scaler gets the same video and processes it without such glitches.
It seems my HDMI->VGA converter has stability issues with a bogus VIC, so perhaps misusing this field is not such a good idea after all. With the vanilla MegaDrive core or the dvi_mode=1 option (no HDMI metadata) it works fine.
That's why I've made the direct_video=2 mode. Use the original direct_video=1 option for the HDMI->VGA converter.
@mikechi2 How about keeping direct video mode 1 but sending pixel offset, line offset, width, height, and pixel repetition in an info frame? Can you count clock cycles from hsync and vsync?
I think this solution would be best.
Yes, no problem counting offsets; we already do it for some other things.
How about setting the active Bar Information to none (AVI_DB1[3:2]=0) and using AVI_DB6 to AVI_DB13 for custom metadata? I'm not sure why anything other than the PR factor would need to be communicated, though.
Ok. New version of direct video. Video parameters are now in the SPD info frame. See here for the format: https://github.com/MiSTer-devel/Main_MiSTer/blob/18b2ee21e994d99377ea590cf4ab3340f76f9181/video.cpp#L1542-L1555
Notes:
- No more direct_video=2 mode, since it doesn't conflict anymore. So use direct_video=1.
- In this version I don't align the OSD to the core's pixels. It's ugly, and for some cores the OSD simply won't fit. So the scaler has to decide whether to temporarily switch to the video clock and show a good-looking OSD while it is present, or stick to the core's pixel clock with an ugly OSD. OSD presence is reflected in the SPD info as well.
- The zip includes both the core and Main. The Main code is already committed while the core's is not yet (so use the included rbf).
- de_h is the DE offset from the fall of HSync in pixel clock cycles (not video clock cycles).
- de_v is the DE offset from the fall of VSync in lines.
- At the end of the structure I've added the name of the core, running up to the end of the infoframe. A max of 9 chars of the core name will fit, which should be enough to distinguish the cores, I think.
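To make the description easier to follow, here is a hypothetical C++ layout for such an SPD payload, built only from the fields named in this thread; the authoritative format is in the video.cpp lines linked above, so treat every name, width, and ordering here as an assumption:

```cpp
#include <cstdint>

// Hypothetical SPD payload layout for illustration only; see the linked
// video.cpp for the real format. An SPD infoframe carries 25 data bytes,
// so these 19 bytes fit with room to spare.
#pragma pack(push, 1)
struct DirectVideoSPD
{
    uint16_t de_h;        // DE offset from the fall of HSync, in pixel clock cycles
    uint16_t de_v;        // DE offset from the fall of VSync, in lines
    uint16_t width;       // active width in core pixels
    uint16_t height;      // active height in lines
    uint8_t  repetition;  // horizontal pixel repetition factor
    uint8_t  osd;         // nonzero while the MiSTer OSD is shown
    char     core[9];     // core name, truncated to fit the infoframe
};
#pragma pack(pop)
```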
At least the ADV7610 has trouble locking to the signal, possibly due to the DE issue reported earlier. Below are the relevant register values when running the above MD core with direct_video=1 (V timings seem off):
advrx h_total: 6840
advrx h_synclen: 616
advrx h_backporch: 64
advrx h_active: 6160
advrx v_total: 253
advrx v_synclen: 3
advrx v_backporch: 8186
advrx v_active: 253
advrx sync polarities: H(-) V(-)
advrx interlace_flag: 0
advrx pclk: 107390625Hz
> Ok. New version of direct video. Video parameters are now in the SPD info frame.
Thanks, I will give this a try.
The ADV7610, which I assume is a close cousin of the ADV7611 (which is what I use), is a piece of junk. As far as I can tell, it won't even lock on to SNES (no de-jitter) -> OSSC.
Also, a lot of stupid stuff is going on with their "component processor", which IMO is wholly unnecessary for an HDMI -> digital output.
It's more or less the same silicon. I used it on early prototypes too but switched to the ADV7610 for multichannel LPCM audio support. It indeed did not play nice with the SNES core in 240p mode (I tried the earlier variant with direct_video=2), but I assume a de-jitter compatibility option should be added to the core anyway if its direct video is to be used with HDMI receivers.
I did a quick test and can verify that this MD core has stable video, and I'm seeing the pixel repetition toggle between 16 and 20. I will verify the rest of the SPD infoframe next.
> but I assume a de-jitter compatibility option should be added to the core anyway if its direct video is to be used with HDMI receivers.
In the new version DE is the same as originally in direct_video=1 mode, so any irregularity of the DE size should already be irrelevant. The scaler gets the SPD info and also the core name as a hint, so the scaler may add additional tweaks if required.
De-jitter isn't needed with the SPD metadata, since the ADV7611 (and presumably the ADV7610 too) only barfs on the first DE, so the glitch is on a blank line.
The MegaDrive in some video modes has a variable pixel size. If I remember right, at least from the end of HSync to the end of the active portion of the line, the pixel clock should be the same.
The ADV7610 started to report v_total correctly when I set NEW_VS_PARAM=0 (the opposite of the recommended ADI setting). Is there still some reason the DE signal can't be used normally, as opposed to specifying the intended DE offsets and active window size in the SPD infoframe?
> Is there still some reason the DE signal can't be used normally, as opposed to specifying the intended DE offsets and active window size in the SPD infoframe?
As the tests above have shown, each core has its weirdness. Generally speaking, all retro systems were made for analog video, where a DE signal isn't present. In all cores the DE signal is re-generated or pulled from internal signals, so some systems may generate uneven DE. Also, it seems either the ADV7513 or the HDMI receiver has quirks; the whole DirectVideo is kind of out of spec. As an additional benefit of the current DE implementation, it can coexist with the original direct_video implementation.
> The ADV7610 started to report v_total correctly when I set NEW_VS_PARAM=0 (the opposite of the recommended ADI setting).
So, you can now capture perfectly framed video when you apply the SPD parameters?
Hi developers, I'm reaching out to you to request the possibility of adding metadata information through direct video, so that external hardware such as scalers can use that information. A use case is the RetroTink 4K, which would take metadata over direct video describing how to exactly crop and scale the game. Thanks.