Clipping and BPF appear to default to true in the configuration file, which should currently match the defaults on the Codec2 side IIRC.
Speaking of Codec2, clipping is valid for all OFDM modes as you state but BPF is only available for 700D and 700E:
void freedv_set_clip(struct freedv *f, int val) {
    f->clip_en = val;
    if (is_ofdm_mode(f)) {
        f->ofdm->clip_en = val;
        /* really should have BPF if we clip */
        if (val)
            ofdm_set_tx_bpf(f->ofdm, true);
    }
}

/* Band Pass Filter to cleanup OFDM tx waveform, only supported by some modes */
void freedv_set_tx_bpf(struct freedv *f, int val) {
    if (FDV_MODE_ACTIVE( FREEDV_MODE_700D, f->mode) || FDV_MODE_ACTIVE( FREEDV_MODE_700E, f->mode)
        || FDV_MODE_ACTIVE( FREEDV_MODE_DATAC0, f->mode) || FDV_MODE_ACTIVE( FREEDV_MODE_DATAC3, f->mode)) {
        ofdm_set_tx_bpf(f->ofdm, val);
    }
}
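(For illustration, a minimal sketch of what that coupling means for a caller, assuming the standard codec2 freedv_api calls; the mode and values are only examples. Enabling clipping on an OFDM mode also switches the Tx BPF on, so "clip on, BPF off" can't be reached through freedv_set_clip() alone.)

#include "freedv_api.h"

int main(void) {
    /* Illustrative only: open a 700E modem. */
    struct freedv *f = freedv_open(FREEDV_MODE_700E);
    if (f == NULL) return 1;

    /* For OFDM modes this also forces the Tx BPF on, per freedv_set_clip() above. */
    freedv_set_clip(f, 1);

    /* Explicit BPF control; only honoured for the modes listed in freedv_set_tx_bpf() above. */
    freedv_set_tx_bpf(f, 1);

    /* ... tx/rx processing would go here ... */
    freedv_close(f);
    return 0;
}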
Thus my suggestions:
- "700D TX Band Pass Filter" becomes "700D/E TX Band Pass Filter" in the GUI.
- "FreeDV 700 Options" becomes "Modem Options".
Does that seem reasonable?
Sounds very much like a great plan to me! (Apparently I didn't understand the BPF Code well enough...)
The Options -> Modem dialog suggests that "Clipping" and "700D TX Band Pass Filter" are "FreeDV 700 Options". During our last FreeDV technical meeting we all seemed to understand that clipping is active for at least all OFDM modes, is that correct?
Yes - clipping (compression) and the associated BPF are now supported by all OFDM modes at the ofdm.c level.
Should this be reflected in the freedv-gui dialog? It might be a source of confusion. I believe having the choice to use or disable clipping is valuable.
With clipping enabled you never reach the maximum possible SNR. The only way to ever see this is with clipping disabled! For example, when recording locally, or from a remote receiver, e.g. a KiwiSDR, you would start to wonder why you never reach max SNR even with the strongest signal. You disable clipping, and voilà, there you are.
On the other hand, the 4 dB or so advantage is real, too.
Created #319 for the labeling adjustment. Further changes will happen once Codec2 is updated.
Merged the PR and closed for now. Once the codec2 PR gets merged, we can revisit any additional labeling changes.
I'm working on https://github.com/drowe67/codec2/pull/368. For consistency, I'm inclined to switch the BPF and clipping (compression) on by default for all OFDM modes at the codec2 level in ofdm_mode.c
Would that be OK with you @tmiw & @cybork42 ?
Fine by me.
I believe I commented earlier to @tmiw that switching on "compression" by default will never allow a complete codec and modulation system to achieve optimum SNR. Compression, to me, is a crutch to work around the poor PAPR, and it works well in real life, but if you try to understand what's happening and why you never get maximum "decoding performance", you will dig for a long time before you find that the signals were compromised, albeit with good intentions.
I would recommend leaving both switched off, with strong hints on the GUI side recommending the use of both settings, rather than having the default code (which might be reused by someone else) enable both functions by default.
But this is just my (engineering) attitude. The bottom-line performance, no question, improves with both settings active.
Hi @cybork42 - actually I feel the (communications system) engineering position is the opposite. For a given peak power, the SNR at the Rx is improved, as the average power (S) is increased, while N remains the same.
Unfortunately the SNR estimator gets confused by the controlled distortion compression introduces, so it reads incorrectly. It is messy in that sense - it would be nice to have a PAPR reduction scheme that doesn't mess with the SNR estimator (or a better SNR estimator).
The use of FEC will result in zero errors despite the compression, so decoding performance is also fine with compression (indeed 4dB better).
It's important to understand that the SNR estimate returned by the modem is an estimate - don't confuse it with the actual channel SNR. The modem SNR estimator is not a test instrument ... it makes a few assumptions that will break at times.
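To put numbers on that (a rough sketch only; the ~5 dB and ~1 dB figures are the approximate values discussed further down this thread, not measurements):

#include <stdio.h>

int main(void) {
    /* Clipping raises the average Tx power for the same peak power ... */
    double avg_power_gain_dB = 5.0;   /* approximate increase in S from clipping */
    /* ... at the cost of some controlled distortion on the constellation. */
    double distortion_loss_dB = 1.0;  /* approximate BER-versus-SNR penalty */

    /* Net benefit at the receiver for the same SSB Tx peak power. */
    printf("net SNR benefit ~ %.1f dB\n", avg_power_gain_dB - distortion_loss_dB);
    return 0;
}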
BTW I've generated some curves (https://github.com/drowe67/codec2/blob/master/README_data.md#snr-estimation-and-clipping) recently, to support Simon's FreeDATA work.
Dear @drowe67 - I totally agree with your line of thought. I haven't looked into the way the "modem SNR" display gets calculated (shame on me), but I had the impression it might be based on the extent of errors detected (and corrected). I might be wrong here, but if that is the case, how can one "approve" of a method that increases the raw transmission BER just to increase the physical SNR a bit?
I mean, OFDM's major drawback is the poor PAPR. It's not a codec fault, it's a modem fault.
Anyway, from a communications engineer's standpoint (same here... but I switched subjects a long time ago :-) ) there is a codec, a channel coding and a modulation scheme. The modulation scheme creates issues (poor PAPR), and of course you can squeeze the modem PAPR while relying on the capabilities of the channel coding to still get things right. But channel coding is not intended to catch intentionally poor modulation; it is there to compensate for channel effects, right? And we have nasty channels, don't we?
If the modem SNR display gets derived from the amount of bits corrected by channel coding, allowing the modem to introduce modulation flaws makes things messier and more intertwined than they should be. Or not?
The SNR estimator works by measuring the mean and variance of the scatter diagram dots. How big the dot/blob is (N) compared to its distance from the origin (S):
ctest -V -R test_OFDM_modem_esno_est
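A toy sketch of that idea (this is not the actual codec2 ofdm.c estimator, just the principle): treat the squared mean symbol magnitude as S and the spread of the magnitudes around that mean as N.

#include <complex.h>
#include <math.h>
#include <stdio.h>

/* Toy scatter-diagram SNR estimate: S = (mean symbol magnitude)^2,
   N = variance of the symbol magnitudes (how "big" the blob is). */
static double snr_est_dB(const double complex *sym, int n) {
    double mean_mag = 0.0, mean_mag2 = 0.0;
    for (int i = 0; i < n; i++) {
        double m = cabs(sym[i]);
        mean_mag  += m;
        mean_mag2 += m * m;
    }
    mean_mag  /= n;
    mean_mag2 /= n;
    double S = mean_mag * mean_mag;
    double N = mean_mag2 - S;             /* variance of |symbol| */
    return 10.0 * log10(S / (N + 1e-12));
}

int main(void) {
    /* Four clean QPSK points plus four amplitude-perturbed copies. */
    double complex sym[8] = { 1+1*I, -1+1*I, -1-1*I, 1-1*I,
                              1.2+1.2*I, -0.8+0.8*I, -1.2-1.2*I, 0.8-0.8*I };
    printf("SNR estimate: %.1f dB\n", snr_est_dB(sym, 8));
    return 0;
}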
Here are the key points around compression for me, which I interpret by looking at BER versus SNR curves:
- The compression does degrade the scatter diagram, so you lose about 1dB on BER versus SNR, so the BER indeed increases for a given SNR.
- However the actual S increases by about 5dB, so the receiver sees 5-1=4dB increased SNR for the same SSB Tx peak power. BER is dramatically reduced, and speech quality over a given channel improves - which is our key requirement.
Some blog posts:
https://www.rowetel.com/?p=7382 https://www.rowetel.com/?p=7596
@tmiw - just confirming DPSK has been removed from freedv-gui options? I can't see it on the Options menu. IIRC we spoke about this a few weeks ago. If so I might start to remove codec2 support for it.
Also we probably don't need the "700D manual unsync" - IIRC that was due to sync issues in the very early days of 700D.
Yep, DPSK's been removed. I'll go ahead and remove manual unsync too.
The first blog post by @drowe67 (https://www.rowetel.com/?p=7382) is extremely well done and makes the point while answering all my complaints and comments. Thank you also for the explanation of how the freedv-gui SNR display gets computed.
I'll try to explain my concerns about clipping-as-default once more; they have a lot to do with how we have accustomed ourselves to using FreeDV.
Before I got my fellow hams on air, we practised getting the audio levels right and the equalizer set properly, so that one's own speech, encoded, recorded, played back, decoded and listened to, sounds optimal. We realized this is tricky and very speaker-dependent. My fellow Andreas (DL1TT) can work without touching the equalizer, while I, Andreas (DM4AB), need serious tuning (damping frequencies below 150Hz, amplifying the highs) to have Codec2 encode well. Maybe it's also that German phonemes don't encode as well as English ones. We have tried speaking German and English ourselves, and it seems to make some difference, too. But let's put this aside for now.

The above optimization of the headset mic input, finding a good headset or microphone, and getting the equalizer settings tuned well takes some time (for some hams like me, maybe not for all). Anyhow, while you do so you probably expect to see "SNR=max" all the time, as there are no interferences or noise whatsoever.

It took us a day to find the problem with one ham who never got his SNR to max out, the reason being that he had clipping turned on, which was not obvious to us a few months ago, and you might argue is not obvious to any average ham user without explanation.
I understand the need for better performance. It's true and valid.
The SNR display on the GUI would then probably need a re-design. The use of the term "SNR" isn't optimal anyway.
Thanks a lot for the exchange - more things learnt again!
Hello @drowe67.
Thanks a lot for explaining. I see things more clearly now. Especially the first post is very well done!
Maybe the GUI SNR display could be re-thought, then. The BER / error correction rate is more meaningful, isn't it? And maybe looking at the spectrum center (where the FreeDV signal should be) and the upper and lower boundaries (where only background noise should be) would make more sense for generating a physical SNR estimate display?
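A rough sketch of what such a spectrum-based estimate could look like (the function and the bin ranges are only assumptions for illustration, not anything taken from freedv-gui):

#include <math.h>
#include <stdio.h>

/* Rough spectrum-based SNR estimate: average power in bins where the FreeDV
   signal should sit, versus average power in edge bins that should contain
   only background noise. The bin ranges are placeholders. */
static double spectral_snr_dB(const double *psd,
                              int sig_lo, int sig_hi,
                              int noise_lo, int noise_hi) {
    double sig = 0.0, noise = 0.0;
    for (int i = sig_lo; i <= sig_hi; i++)     sig   += psd[i];
    for (int i = noise_lo; i <= noise_hi; i++) noise += psd[i];
    sig   /= (sig_hi - sig_lo + 1);
    noise /= (noise_hi - noise_lo + 1);
    /* Remove the noise floor from the in-band power before forming the ratio. */
    double s = sig - noise;
    if (s <= 0.0) s = 1e-12;
    return 10.0 * log10(s / noise);
}

int main(void) {
    double psd[64];
    for (int i = 0; i < 64; i++) psd[i] = 1.0;    /* flat noise floor     */
    for (int i = 16; i <= 47; i++) psd[i] += 9.0; /* signal in the middle */
    printf("SNR estimate: %.1f dB\n", spectral_snr_dB(psd, 16, 47, 0, 7));
    return 0;
}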
Cheers, Andreas.
Perhaps the key requirement here is how Hams can determine that they have configured their FreeDV station correctly and have a good link to another station?
One idea: aim for 8dB SNR or 0% coded BER. More SNR won't make much difference once the coded BER is 0 - it's a threshold effect like FM. As we have seen - even quite harsh distortion like clipping doesn't affect the BER v SNR performance much (about 1dB). So maybe "max SNR" isn't the correct goal - rather hitting a threshold is enough.
Might be good to have clipping on by default, and hide the option to turn it off (making it available to advanced users), to avoid confusion between stations and maximise performance.
Re the core codec speech quality I'm working on a Codec 2 mode that is less sensitive to microphone frequency response, so it gives more or less uniform quality across different configurations. Also improved speech quality. When the prototype matures we can try it on other languages. I'd also like to collect samples from say 10 different Hams, to understand how microphone frequency responses vary between stations.
I'm also interested in another lap around the modem waveforms. There are ways to minimise PAPR without clipping, like adjusting the FEC codewords so that the modulated symbols have a low PAPR. Or sorting the speech vector quantiser codewords so high PAPR vectors have a very low probability (e.g. happen once every 5 seconds) and can be clipped with no one noticing.
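To make the PAPR angle concrete, here is a small sketch (not code from codec2) of how the PAPR of a block of modulated samples could be measured when comparing candidate codewords or vector quantiser orderings:

#include <complex.h>
#include <math.h>
#include <stdio.h>

/* PAPR of a block of modulated samples: peak power over average power, in dB. */
static double papr_dB(const double complex *x, int n) {
    double peak = 0.0, avg = 0.0;
    for (int i = 0; i < n; i++) {
        double p = creal(x[i]) * creal(x[i]) + cimag(x[i]) * cimag(x[i]);
        if (p > peak) peak = p;
        avg += p;
    }
    avg /= n;
    return 10.0 * log10(peak / avg);
}

int main(void) {
    /* Toy block: one sample with twice the amplitude pushes the PAPR up. */
    double complex x[4] = { 1.0, 1.0*I, 2.0, -1.0 };
    printf("PAPR = %.1f dB\n", papr_dB(x, 4));
    return 0;
}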
The Options -> Modem dialog suggests that "Clipping" and "700D TX Band Pass Filter" are "FreeDV 700 Options". During our last FreeDV technical meeting we all seemed to understand that clipping is active for at least all OFDM modes, is that correct?
From studying the code it seems that, regardless of what the default settings in Codec2 are, they are overridden by run-time settings as configured in FreeDV, like here (code from main.cpp, lines 1430ff):

freedvInterface.setRunTimeOptions(
    (int)wxGetApp().m_FreeDV700txClip,
    (int)wxGetApp().m_FreeDV700txBPF);
Our suggestion would be to adjust the Options -> Modem dialog, provided our observation and code study are not wrong.
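For what it's worth, a hedged sketch of that override path at the API level (the freedv_api calls are real codec2 functions; the wrapper and the gui_clip / gui_bpf flags are placeholders for the m_FreeDV700txClip / m_FreeDV700txBPF settings):

#include "freedv_api.h"

/* Sketch only: whatever internal defaults codec2 picks at freedv_open() time,
   the values the application applies afterwards are what take effect. */
struct freedv *open_with_gui_settings(int mode, int gui_clip, int gui_bpf) {
    struct freedv *f = freedv_open(mode);   /* codec2 defaults set here */
    if (f == NULL)
        return NULL;
    freedv_set_clip(f, gui_clip);           /* GUI configuration wins */
    freedv_set_tx_bpf(f, gui_bpf);          /* GUI configuration wins */
    return f;
}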