```cpp
void setCsVsStart(uint16_t start)
{
    GBS::SP_SDCS_VSST_REG_H::write(start >> 8);
    GBS::SP_SDCS_VSST_REG_L::write(start & 0xff);
}

void setCsVsStop(uint16_t stop)
{
    GBS::SP_SDCS_VSSP_REG_H::write(stop >> 8);
    GBS::SP_SDCS_VSSP_REG_L::write(stop & 0xff);
}
```
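For example, the values observed later in this issue (start = 16, stop = 13) fit entirely in the low byte, which matches the `_H` registers reading back zero:

```cpp
setCsVsStart(16); // SP_SDCS_VSST_REG_H = 0x00, SP_SDCS_VSST_REG_L = 0x10
setCsVsStop(13);  // SP_SDCS_VSSP_REG_H = 0x00, SP_SDCS_VSSP_REG_L = 0x0D
```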
Why aren't these registers properly saved and restored by the preset binary system? Will have to investigate.
```cpp
// doPostPresetLoadSteps():
prepareSyncProcessor(); // todo: handle modes 14 and 15 better, now that they support scaling

// prepareSyncProcessor():
GBS::SP_SDCS_VSSP_REG_L::write(1); // 5_40 // was 11

// doPostPresetLoadSteps(), further down:
if (!rto->isCustomPreset) {
    ...
    setCsVsStop(13);

// setCsVsStop():
GBS::SP_SDCS_VSSP_REG_L::write(stop & 0xff);
```
The problem is that every time we load a preset, we call `prepareSyncProcessor()`, which resets `SP_SDCS_VSSP_REG_L`; then every time we load a fixed resolution, we call `setCsVsStop` (technically twice; the second call overrides the first) based on the input resolution.

The fix is either for `prepareSyncProcessor()` to not reset `SP_SDCS_VSSP_REG_L` etc. when loading a custom preset, or to call `setCsVsStop` even when loading a custom preset.
I don't know if it's safe to skip `prepareSyncProcessor()` altogether when loading a custom preset, or if we need to add a `bool regsInitialized` to `prepareSyncProcessor()` as well as `setOutModeHdBypass()` (a rough sketch follows below).
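A minimal sketch of what that guard might look like, assuming a function-local flag is an acceptable lifetime (hypothetical; the real function writes many more registers than shown):

```cpp
void prepareSyncProcessor()
{
    // Hypothetical guard: write the one-time defaults only once, so a
    // custom preset loaded afterwards isn't clobbered by them.
    static bool regsInitialized = false;
    if (!regsInitialized) {
        GBS::SP_SDCS_VSSP_REG_L::write(1); // 5_40 // was 11
        regsInitialized = true;
    }
    // ... the rest of the (mode-dependent) setup continues here ...
}
```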
I'd imagine it's not safe to remove the `if (!rto->isCustomPreset) {` condition altogether, but I don't know which code is safe to move out of that block. Or maybe we shouldn't do this at all, and we should be overwriting `SP_SDCS_VSSP_REG_L` zero times when loading custom presets, not twice.
### Who currently calls `prepareSyncProcessor`?
- RGBHV `noSyncCounter > limitNoSync` -> `prepareSyncProcessor`. VGA input only?

### Who currently calls `setCsVsStop`?
- `setCsVsStart`/`setCsVsStop` when passing through signals with line count outside the NTSC/PAL range?
- `void doPostPresetLoadSteps()` -> `prepareSyncProcessor`, then ~1.2 seconds later `void setOutModeHdBypass()` -> `setCsVsStop`, rather than the `runSyncWatcher { rto->outModeHdBypass { setCsVsStart` path.
- When loading a fixed resolution, or when the input switches NTSC/PAL/EDTV resolutions, `void doPostPresetLoadSteps()` -> `prepareSyncProcessor` is followed by 0-2 `void doPostPresetLoadSteps()` -> `setCsVsStop` calls depending on input resolution. (Neither `prepareSyncProcessor` nor `setCsVsStop` is called when switching between 240p and 480i.)
- When the console powers off and the GBS-C gives up watching input resolution, `void goLowPowerWithInputDetection()` -> `prepareSyncProcessor` is called.
At this point, we could either make loading a saved preset more like a fixed resolution, by calling both `void doPostPresetLoadSteps()` -> `prepareSyncProcessor` and `void doPostPresetLoadSteps()` -> `setCsVsStop`. Or we could make it preserve the saved settings, by not calling `prepareSyncProcessor` or by making it skip initializing `SP_SDCS_VSSP_REG_L` etc. (Both options are sketched below.)
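Roughly, the two directions inside `doPostPresetLoadSteps()` (a sketch; `stopForInputResolution()` is invented for illustration and is not a gbs-control function):

```cpp
// Option A: make loading a custom preset behave like a fixed resolution.
prepareSyncProcessor();
setCsVsStop(stopForInputResolution()); // hypothetical helper

// Option B: preserve the saved registers by skipping the reset.
if (!rto->isCustomPreset) {
    prepareSyncProcessor();
}
```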
### Skip `prepareSyncProcessor` when loading save?

- [ ] Is it safe to skip `prepareSyncProcessor` if we've already run it and saved its results? This would require that all registers it writes are not overwritten (unless the new value is the intended one), are saved in the preset and loaded properly, and that the act of writing a register does not trigger any necessary processes.
I think (not confident) all registers written in `prepareSyncProcessor` live in page 5, and all of page 5's registers are saved (<= `0x6F`) except for the unused `VERYWIDEDUMMYREG`.
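If that ever needs checking mechanically, one could snapshot page 5 before and after a save/load round trip and diff the two; a sketch with an invented `readReg(page, addr)` helper (gbs-control's actual low-level read API may differ):

```cpp
// Hypothetical: dump page 5's saved range (0x00..0x6F) into a buffer.
void snapshotPage5(uint8_t out[0x70])
{
    for (uint8_t addr = 0; addr < 0x70; addr++) {
        out[addr] = readReg(5, addr); // readReg() is invented for this sketch
    }
}
// Take one snapshot after prepareSyncProcessor(), another after loading the
// saved preset, then report any addresses whose values differ.
```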
I've set up a build which only calls `prepareSyncProcessor` if `!isCustomPreset`. The result can successfully load 480p and 960p saved presets, at the right vertical position and with no ill effects observed (in light testing). I've tested loading different output resolution presets, starting from the same fixed resolution, the same custom preset, a different output resolution, a different input resolution, a cold boot, or resetting the ESP8266 after selecting (but not applying) a different custom preset. (Not all combinations, but at least one saved preset for each starting point.)
- [ ] Read `prepareSyncProcessor` again just to verify all registers written to by the function are saved in custom presets.
- [ ] Alternatively, call `prepareSyncProcessor` but skip `SP_SDCS_VSSP_REG_L` etc. only. This is the "safe way out" which minimizes the change in behavior and the risk of introducing bugs, but isn't the way you would've written the code had you implemented it from the start with support for saving and loading presets, as well as sync parameters varying based on input resolution.

(I think a good change to code should leave the code looking like it was designed that way from the beginning, but this isn't possible when requirements change, adjusting the code optimally would require months of rearchitecting, and any practical fix leaves behind tech debt.)
But at this point I'm concerned that #438 also moves `isCustomPreset` to `rto->isCustomPreset`, so I should ideally merge that PR before making a PR for this bug (and if I start programming a fix locally first, I'll have to rewrite it once I rebase on that PR).
### Call `setCsVsStop` when loading save?

This will "fix" existing saved presets which have already been loaded and saved, "committing" the incorrect `SP_SDCS_VSSP_REG_L` to flash. But it will also unconditionally change the meaning of existing presets (whereas not calling `prepareSyncProcessor` will only change the meaning of existing presets which haven't been loaded incorrectly and resaved). I'm not sure skipping `prepareSyncProcessor` (resulting in inconsistent behavior) is a better user experience.
Is this even easy to implement? `setCsVsStop` is stuck in the middle of complex branched logic which only runs on fixed resolutions. It would be difficult to run that one statement in the middle of the branching logic, but only `if (rto->presetID != 0x06 && rto->presetID != 0x16)`, while skipping the other statements leading up to it. (One possible workaround is sketched below.)
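One conceivable workaround is to hoist the value computation out of the branches and perform the single gated write afterwards (purely hypothetical; the default value and placement are placeholders):

```cpp
// Hypothetical restructuring of doPostPresetLoadSteps(): compute the stop
// value up front, for fixed and custom presets alike.
uint16_t csVsStop = 13; // placeholder; the real value depends on input resolution
// ... the existing branched fixed-resolution logic would assign csVsStop
//     here instead of calling setCsVsStop() inline ...

// The single write now also runs for custom presets, gated only by presetID.
if (rto->presetID != 0x06 && rto->presetID != 0x16) {
    setCsVsStop(csVsStop);
}
```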
But why do we even need to write these registers rather than storing their values in the fixed preset? Is it because each input resolution needs a different initial config, applied when resetting state but not applying saved state?
Rather than sequentially mutating registers, I'd kinda prefer "declarative code" that takes a register map separate from the hardware chip, edits it into shape given the current input resolution, then returns it by value to a single function which applies it in one go. Then, separately, you add code which juggles the chip registers required to "apply" the new configuration (e.g. `latchPLLAD()`). This produces a natural separation between code that generates a clean-slate preset for a fixed input and output resolution, and code that applies this configuration to an input signal.
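A minimal sketch of that shape (every name below is hypothetical; the 4/1 and 16/13 values are the ones observed later in this issue):

```cpp
#include <cstdint>

enum class InputResolution { Ntsc480p, Other }; // hypothetical

// Plain value type, built and edited without touching the chip.
struct SyncProcConfig {
    uint16_t csVsStart;
    uint16_t csVsStop;
    // ... other page-5 fields would go here ...
};

SyncProcConfig makeSyncProcConfig(InputResolution in)
{
    SyncProcConfig cfg{4, 1}; // clean-slate defaults (the 4/1 seen in this issue)
    if (in == InputResolution::Ntsc480p) {
        cfg.csVsStart = 16;   // per-resolution overrides (the 16/13 seen here)
        cfg.csVsStop = 13;
    }
    return cfg; // returned by value; no hardware access yet
}

// The single place that touches the chip.
void applySyncProcConfig(const SyncProcConfig& cfg)
{
    GBS::SP_SDCS_VSST_REG_H::write(cfg.csVsStart >> 8);
    GBS::SP_SDCS_VSST_REG_L::write(cfg.csVsStart & 0xff);
    GBS::SP_SDCS_VSSP_REG_H::write(cfg.csVsStop >> 8);
    GBS::SP_SDCS_VSSP_REG_L::write(cfg.csVsStop & 0xff);
    // chip-side "apply" juggling (e.g. latchPLLAD()) would live here too
}
```

Loading a custom preset would then skip `makeSyncProcConfig` entirely and feed the saved register map straight into the apply step.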
Where does `SP_SDCS_VSSP_REG_L` belong? Where does fixed/auto gain belong? I think these are the hard questions, which you may have attempted to address before, but (IMO) it's unclear from reading the code, and this bug shows that you had an inconsistent view of whether vertical position (relative to input sync) is a saved property of a preset or "derived state" calculated from apply-time parameters.

Semi-related: functional programming in C++, with an emphasis on "by value" semantics rather than generic abstraction. I don't know if the code can be improved by inlining more functions, or if you've reached the limits of this technique.
I do find that a lot of GBS-C's complexity and difficulty lies in properly configuring hardware registers filled with implementation details, with both hardware and software trying to satisfy a combinatorial explosion of possible use cases, resulting in unclear behavior. For example, I've tried to produce full-width 480p output matching the standardized DMT hactive period, but horizontal downscaling causes aliasing with 640px-wide input, and properly reconfiguring the input sampling htotal has eluded me so far. Improving the architecture is more likely to prevent bugs in state management, business logic, and corner cases (including software race conditions; I am under no illusions that I can fix hardware race conditions in ASICs I don't even understand).
I have a GBS-C hooked up to a 480p component video source (Wii), and outputting a VGA RGBHV image.
When I click "480p 576p" (fixed resolution), then a custom preset slot, then "save preset" and "load preset" (which should load the exact 480p fixed resolution as a custom preset), the image is shifted downwards by exactly 6 scanlines (in 960p output).
(The same thing happens if I skip "load preset" and instead power-cycle the GBS-C or (I think) switch the Wii to 240p/480i, both of which automatically load the preset.)
### Details
I'm running a GBS-Control with debug pin and clock generator, receiving video from a component 480p Wii, and sending video to a VGA CRT (Gateway VX720). My CRT is currently calibrated so the image is properly sized and centered when I click the "480p 576p" fixed resolution button.
When I click a fixed resolution, then "Get Video Timings", I see:
When I save and load this resolution (which should be a no-op), then `CsVS_ST/SP` changes:

I can use "Picture Control" to shift the image back up to being centered vertically. But this doesn't restore `CsVS_ST/SP`; instead it changes `IF VB ST/SP` as well:

I think it's a bug that GBS-C writes different video registers and offsets the image vertically, depending on whether you load a fixed resolution or load the same resolution saved as a custom preset.
### Debugging
I'm assuming that the differently written `CsVS_ST/SP` is the problem, which is plausible since these registers control vertical syncing. I looked in the code to see what accesses these registers. It appears the `CsVS_ST/SP` printout comes from `getCsVsStart()` and `getCsVsStop()`, which get the value from `SP_SDCS_VSST_REG_L` (`_H` is zero) and `SP_SDCS_VSSP_REG_*` (`_H` is zero).
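For reference, these getters presumably mirror the setters quoted at the top; a reconstructed sketch (not copied from the source):

```cpp
uint16_t getCsVsStart()
{
    return (GBS::SP_SDCS_VSST_REG_H::read() << 8) | GBS::SP_SDCS_VSST_REG_L::read();
}
uint16_t getCsVsStop()
{
    return (GBS::SP_SDCS_VSSP_REG_H::read() << 8) | GBS::SP_SDCS_VSSP_REG_L::read();
}
```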
Who sets `SP_SDCS_VS{ST,SP}_REG_L` to 16/13 or 4/1? Do they come from `writeProgramArrayNew`, `setCsVsStart`/`setCsVsStop`, or some other function?

My best guess is in `doPostPresetLoadSteps`:

So apparently GBS-C writes different register contents based on whether you're loading a fixed resolution or a custom preset? Is this intentional or not?
Honestly, the entire fixed/custom resolution system seems confusing and questionably designed to me. Unfortunately, GBS-C and the TV5725 chip have an immense amount of complexity and functionality (much of which I can't test, like PAL or RGB inputs), to the point that I don't know how the converter handles (and should handle) all cases by setting up registers, or how to correctly fix the program.
As a first step, I'd change `presetPreference` and `presetID` to `enum [class] Name : uint8_t`, but this is a good deal of work, it's easy to accidentally replace numbers with the wrong enum value name (there ought to be a compiler plugin to automate this), and it doesn't single-handedly explain the program's functioning. (A sketch of the shape follows below.)
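Roughly what that migration might look like (entry names here are invented placeholders; the values are the ones compared against `rto->presetID` earlier in this issue):

```cpp
#include <cstdint>

// Hypothetical: one named constant per magic number currently compared
// against rto->presetID.
enum class PresetID : uint8_t {
    PlaceholderA = 0x06, // real names must be transcribed from existing code
    PlaceholderB = 0x16,
    // ...
};

// Comparisons then state their intent instead of using raw numbers:
// if (rto->presetID != PresetID::PlaceholderA && rto->presetID != PresetID::PlaceholderB)
```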
### Misc

- Why is `CsVT` (`STATUS_SYNC_PROC_VTOTAL`) = 524? Is the input signal 524-line instead of 525? (`VT` is 525 when outputting 480p and 1000 when outputting 960p, matching DMT.)
- Apparently the GBS-C remembers whether you last clicked a fixed resolution or a custom preset, and uses that when switching inputs or rebooting. But clicking a fixed resolution takes effect immediately, whereas clicking a custom preset does nothing until you load, switch input formats, or reboot. And if you click a custom preset followed by a fixed resolution, the custom preset choice has no effect on the video before or after power-cycling (the fixed resolution overrides the preset altogether), except for controlling the browser's highlighted preset, what the "load/save preset" buttons do, and fixing the "Settings tab grayed out" bug, unless I missed some hidden functionality.
Running latest GBS-C master, 96e67732af167fe25966bace8557bd19cf562300. The bug was discovered on my local firmware fork (https://github.com/nyanpasu64/gbs-control), and persists after wiping the entire flash (erasing custom preset data and names, and the connected Wi-Fi network) and checking out, building, and flashing stock firmware.