Sometimes I want to change my GBS-C's output resolution before I turn on my console to produce a video signal. Currently, this sends the GBS-C into a fallback state where it enables debug mode and disables the sync watcher:
https://github.com/ramapcsx2/gbs-control/blob/499fcfc0e9d52864a4575c93407fcfc9906cfcdc/gbs-control.ino#L4172-L4180
How did I find this out? One time I was using my GBS-C normally, booted up a game, and wondered why I had to turn my CRT's brightness way down for acceptable black levels. I had to factory-reset my GBS-C before I realized that debug mode was causing the problem. When I encountered high black levels a second time, I knew I had never turned on debug mode myself; the GBS-C had enabled it on its own, so I read the source code to find out what was triggering it.
After turning off debug mode and playing some games, I switched from a 480p to a 240p video source, which failed to sync for over 10 seconds. I thought this was a new failure mode of mode changes, until I realized that the sync watcher had been turned off even though I had never clicked the button myself. Reading the source code again, I found that switching output resolutions with no input also disables the sync watcher and switches to RGB input.
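Putting the flag names from this post together, the fallback behavior can be paraphrased roughly as follows. This is a simplified mock, not the literal code from the linked source: `presetLoadFallback`, the mock `RuntimeOptions` struct, and the `adcInputSel` stand-in are all assumptions for illustration; only the three assignments are taken from the post.

```cpp
#include <cassert>
#include <cstdint>

// Minimal stand-ins for the firmware state the fallback branch touches.
// Field names follow the post; everything else is a simplified mock.
struct RuntimeOptions {
    uint8_t videoStandardInput = 0; // 0 = no input signal detected
    uint8_t syncWatcherEnabled = 1;
};

char typeOneCommand = '@';  // '@' = no pending command (assumption)
uint8_t adcInputSel = 0;    // stand-in for the GBS::ADC_INPUT_SEL register

// Paraphrase of the fallback branch: when an output resolution is
// selected with no input signal, the firmware forces RGB input, stops
// the sync watcher, and queues the debug-view toggle.
void presetLoadFallback(RuntimeOptions& rto) {
    if (rto.videoStandardInput == 0) {
        adcInputSel = 1;             // GBS::ADC_INPUT_SEL::write(1); // RGB
        rto.syncWatcherEnabled = 0;  // stop reacting to sync changes
        typeOneCommand = 'D';        // debug view toggles on the next loop()
    }
}
```

The sync-watcher and debug-view flags then stay flipped until the user notices and undoes them by hand, which is the behavior the rest of this post is about.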
In my use case, "stop switching output resolutions with your console powered off" is not an adequate workaround:
Bypass mode works for 480p input signals, has marginally lower latency than 480p/960p scaling, and offers a wider horizontal size than 480p output. Switching to 480p scaling and reconfiguring the GBS-C to output full-width video tends to cause artifacts and corruption.
Sometimes I switch from bypass mode to 960p output because I'm playing a 3D game which looks bad with 480p scanlines on a VGA monitor (high-frequency aliasing artifacts are more visible with scanlines than in 960p with the line filter enabled).
Or I may switch to 480p mode (and reconfigure my monitor with wider geometry) when playing a SNES game with a 15 kHz signal, which my monitor cannot display in bypass mode.
Switching my GBS-C to bypass mode after turning on the SNES forces my monitor into an "incompatible display mode" screen. I don't know if this damages my monitor. To avoid this screen, I have to power down my monitor before I turn on the console, then switch the GBS-C resolution before turning on my monitor again.
If I turn on the Wii, it takes multiple seconds before it starts outputting a video signal. If I click an output resolution too soon, I can even trigger debug mode by mistake.
Does this problem affect other people?
This issue has bitten me many times, and working around it makes the GBS-C more awkward to use. Perhaps this problem is less relevant to people who hook the GBS-C up to an LCD monitor (which doesn't change appearance as dramatically between 480p and higher resolutions) and run the scaler at 960p/1080p output at all times.
Does anyone else encounter this behavior when changing resolutions and want it fixed?
Does changing this behavior break workflows?
Why was this behavior added? Is there a use case for altering this many flags when switching output resolutions? So far I've never found it helpful at all.
The behavior of enabling the debug view (`typeOneCommand = 'D'`) seems to have been added in commit c9c25f6d5036df5fb57f5574c2a9fd6bbbb962c5 "better handling of non-valid inputs when user loads a preset;".
Disabling the sync watcher (`rto->syncWatcherEnabled = 0`) predates that commit. It seems to originate from commit a1d2e6e946a5cbc6c6c0438b979f6d596cf23663 "deal with unknown (but stable) sources: new mode 9 (copy of mode 3) at least displays something", and was added at line 3459.
I don't know if anyone currently depends on this behavior, or finds it useful for debugging.
How to fix?
Obviously I wanted to remove `rto->syncWatcherEnabled = 0` and `serialCommand = 'D'`. I also removed `GBS::ADC_INPUT_SEL::write(1); // RGB`, since I don't want to change which input the signal is received from; I want to rescan both inputs as usual.
After removing `rto->syncWatcherEnabled = 0`, we now get a green screen after switching output resolutions with no input. This is a new bug. Instead I want the GBS-C to shut off the output signal, much like how `setup()` behaves at boot time, or how `runSyncWatcher()` eventually shuts off the output signal when the source signal disappears. I found that both scenarios call `setResetParameters()` to shut off the output.
So I decided to call `setResetParameters()` here. (I don't know whether `goLowPowerWithInputDetection()` should be called instead, but it's called from `loop()` just afterwards anyway.)
You cannot call `doPostPresetLoadSteps()` after `setResetParameters()`, since `doPostPresetLoadSteps()` does not expect `rto->videoStandardInput == 0` and prints "!should not go here!" to the console. You can call it before `setResetParameters()`, but that causes the output to turn on for a few seconds even if it was off, which I don't want (I want changing resolutions with no input signal to turn off the display and start searching for an input signal). So I removed the `doPostPresetLoadSteps()` call altogether.
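The ordering constraint can be demonstrated with a small mock. The real functions do far more than this; only the guard behavior ("!should not go here!" when `rto->videoStandardInput == 0`) and the fact that `setResetParameters()` leaves `videoStandardInput` at 0 are taken from this post, and `PresetState` is a hypothetical stand-in:

```cpp
#include <cassert>
#include <cstdio>

// Hypothetical stand-in for the runtime options the two calls share.
struct PresetState { int videoStandardInput = 1; };

// Mock: setResetParameters() shuts the output off and (among other
// things) leaves videoStandardInput at 0.
void setResetParameters(PresetState& rto) { rto.videoStandardInput = 0; }

// Mock of the guard: doPostPresetLoadSteps() refuses to run without a
// detected input, so it cannot follow setResetParameters().
bool doPostPresetLoadSteps(const PresetState& rto) {
    if (rto.videoStandardInput == 0) {
        std::puts("!should not go here!"); // message the real function prints
        return false;                      // nothing useful happens
    }
    return true; // preset-dependent setup would run here (omitted)
}
```

Since neither ordering works (after: the guard fires; before: the output briefly turns on), dropping the call entirely was the only option left.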
Without applying the preset, I decided there was no point in `writeProgramArrayNew(ntsc_240p, false);` or `doPostPresetLoadSteps()`, since we aren't inputting or outputting any signal at all. (Note that I don't fully understand what either function does, or how it influences the chip.) And because `setResetParameters()` sets `rto->videoStandardInput = 0`, I removed the writes to that variable as well.
In the end I was left with `setResetParameters(); return;`. I don't know if this behavior is correct or not, but it behaves well in my usage so far. If other people are affected by this issue and want this change as well, I can make a pull request. But if I'm the only one who regularly loads presets or changes output resolutions, I guess I'll keep this as a local change to avoid altering existing code and risking introducing new bugs I haven't found yet.