jks-prv / Beagle_SDR_GPS

KiwiSDR: BeagleBone web-accessible shortwave receiver and software-defined GPS (archived)
http://kiwisdr.com

Individual S-meter Calibration to Signal S9=-73dBm (if attenuators/pre-amps are used) #55

Closed (by ON5KQ, 8 years ago)

ON5KQ commented 8 years ago

It would be nice if the user could adjust the S-meter to always show correct levels, with -73 dBm = S9. I found that a real input from a calibrated R&S source only shows S7-8 on the S-meter. I also found it necessary to use an additional 15 dB of attenuation to overcome overload from megawatt broadcasters in the evening. So in my case a real S9 from the antenna becomes roughly S5 on the display.

I think the S-meter should always show S9 for a signal that comes down the antenna cable at -73 dBm, no matter whether the signal has been attenuated inside the RX hardware (for linearity reasons) or not. The received signal itself has not changed, so the signal strength indicator should not change either.
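For concreteness, here is a minimal sketch of the standard HF convention being referred to (S9 = -73 dBm at the antenna, 6 dB per S-unit), with a user cal offset compensating for an external attenuator. The names are illustrative, not from the Kiwi code:

```c
/* Minimal sketch of the standard HF S-meter convention (S9 = -73 dBm,
 * 6 dB per S-unit); not the actual KiwiSDR code. */
#include <stdio.h>

static double dbm_to_s_units(double dbm) {
    return 9.0 + (dbm + 73.0) / 6.0;
}

int main(void) {
    double sig_dbm  = -73.0;  /* true antenna-side level: exactly S9 */
    double atten_db =  15.0;  /* external attenuator ahead of the Kiwi */
    double cal_db   =   0.0;  /* user cal offset; 0 = uncalibrated */

    printf("uncalibrated: S%.1f\n",
           dbm_to_s_units(sig_dbm - atten_db + cal_db));   /* S6.5 */

    cal_db = atten_db;        /* offset compensates for the attenuator */
    printf("calibrated:   S%.1f\n",
           dbm_to_s_units(sig_dbm - atten_db + cal_db));   /* S9.0 */
    return 0;
}
```

A 15 dB attenuator is 2.5 S-units, which is consistent with the S7-8 reading above dropping to roughly S5.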

Maybe we should show a green button next to the S-meter if the user has followed a calibration routine, so others can see whether his SDR represents S9 = -73 dBm or not.

Currently the KiwiSDR S-meter does NOT show S9 with a direct connection to a -73 dBm source - at least not with my unit.

Ulli, ON5KQ

P.S. I am currently testing my system in an unfortunately very noisy location - if I can manage the noise, I will bring it online. But if it is of no added value at the current location, I have to find a better location first before it will appear at SDR.hu. Thanks for your understanding.

jks-prv commented 8 years ago

Hi Ulli. Sure, S-meter calibration is always a problem when an antenna system of unknown gain is used (or with an inline preamp/attenuator, as you point out). The current S-meter calibration factor is a constant in the code. There is an open feature request to make it adjustable on the admin page. A long time ago I set the calibration, like you, based on input from an HP 8657B signal generator (and also checked the response linearity). But I have not checked it since, and it may be off now as a result of some other change. Let me try to get the S-meter adjustment feature into the next release.

There is also the issue of the waterfall dBFS/dBm display accuracy, which is directly related but adjusted in a different way (FFT output gain).

ON5KQ commented 8 years ago

John, many thanks for your excellent software support of this project - it is essential, and I would also like to thank you for doing it...

One suggestion to the S-meter issue:

The reason for wanting a precise S-meter is that the KiwiSDR is the only affordable remote spectrum analyser with 30 MHz of bandwidth on one screen that I know of... It is NOT ONLY a receiver for listening to a program, it can also be a receiver for measuring (if you have a calibrated antenna... which can be constructed!). Amateurs increasingly need to focus on noise levels rather than antenna gain for good system performance... So judging noise levels correctly is very important for overall success.

jks-prv commented 8 years ago

I think I see what you mean: 0 dBFS means full scale input to the ADC, i.e. +/-8k for our 14-bit ADC. So even if you're adjusting the cal to get correct dBm readings on the waterfall and spectrum you'd still have an indication of how far from ADC saturation your input is.
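A quick sketch of that dBFS arithmetic, assuming the 14-bit full scale of +/-8192 counts mentioned above (the function name is illustrative):

```c
/* dBFS relative to a 14-bit ADC full scale of +/-8192 counts.
 * 0 dBFS = saturation; real signals read negative. Sketch only. */
#include <math.h>
#include <stdlib.h>

#define ADC_FULL_SCALE 8192.0   /* 2^13 counts */

static double sample_dbfs(int sample) {
    /* note: returns -inf for sample == 0 */
    return 20.0 * log10(abs(sample) / ADC_FULL_SCALE);
}
/* sample_dbfs(8192) == 0 dBFS (clipping),
 * sample_dbfs(819)  ~= -20 dBFS, i.e. 20 dB of headroom left. */
```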

jks-prv commented 8 years ago

The v1.25 release about to go out has separate S-meter and waterfall/spectrum cal fields on the admin > config page. These values simply get added as an "offset" to the values shown by these UI elements. They are separate because I found that the defaults needed to be different and it also allows for separate tinkering.
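Since the cal values are described as plain dB offsets, a sketch of that scheme (variable names are mine; only the -13 dB S-meter default below is from this thread):

```c
/* The two admin > config cal values, applied as simple dB offsets to
 * what the UI shows. Sketch of the scheme described above, not the
 * actual source. */
double smeter_cal_db    = -13.0;  /* default found with the sig gen */
double waterfall_cal_db =   0.0;  /* default differs; placeholder value */

double smeter_shown_dbm(double audio_dbm)   /* audio DDC path */
    { return audio_dbm + smeter_cal_db; }

double waterfall_shown_dbm(double fft_dbm)  /* waterfall DDC path */
    { return fft_dbm + waterfall_cal_db; }
```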

The S-meter is of course derived from the audio channel, and it was very simple to find the correct default cal value (-13 dB) by feeding a signal generator directly to the Kiwi. The S-meter seems to show a linear response from my generator setup's noise floor (S3, -109 dBm) up to S9+60 (-13 dBm). I'd be curious what other people find.

The waterfall/spectrum case is a little more complicated because of the FFT involved. The code already has some scaling and offset correction applied to the FFT output based on the full-scale range of the FFT inputs. I don't quite understand all the subtle issues with this. For example, the spectrum level from the generator's single tone varies a bit over the zoom ranges. I think this has to do with the fact that there are more FFT bins than display pixels, and you have to decide how to deal with that. Right now I just take the peak value from the multiple bins, but I could see an argument for taking the sum or even an average. The CuteSDR code has a very complicated way of determining the FFT scale factor based on all these issues. I still don't fully understand how it works.
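To make the bin-to-pixel question concrete, here is a sketch of the peak reduction described above, with the sum/average alternatives noted. It is not the actual waterfall code; it assumes linear power values and an integer bins-per-pixel ratio:

```c
/* Reduce nbins FFT power bins to npix display pixels when nbins > npix.
 * Sketch only: assumes nbins is a multiple of npix and that values are
 * linear power (not dB). */
void bins_to_pixels_peak(const float *bins, int nbins, float *pix, int npix) {
    int per = nbins / npix;                 /* bins covered by each pixel */
    for (int p = 0; p < npix; p++) {
        float peak = 0.0f;
        for (int b = p * per; b < (p + 1) * per; b++)
            if (bins[b] > peak) peak = bins[b];
        pix[p] = peak;   /* alternatives: a sum, or sum / per (average) */
    }
}
```

Peak tends to keep a narrow carrier at the same displayed level no matter how many bins share a pixel, while sum or average make the reading depend on the bin-to-pixel ratio, which changes with zoom.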

ON5KQ commented 8 years ago

Hi John, many thanks for your continuous efforts in software development - great support!

With the latest software update (v1.25) I have a question:

Where can I actually see the remaining headroom/dynamic range, in other words the difference between the actual signal level (which has nothing to do with read-out calibration) and the maximum input signal before clipping (around 0 dBFS)? I cannot see that difference in the spectrum display.

Thanks for clarification. Ulli, ON5KQ

jks-prv commented 8 years ago

Hi Ulli, are you sure about that? I can park on 19.8 kHz, home of the Australian VLF powerhouse NWC at about -90 dBm, and then add +40 dB to the S-meter cal config setting. The S-meter jumps way up but the spectrum level stays the same, -90 dBm. Data for the S-meter comes from the audio DDC; for the spectrum it comes from the waterfall DDC. So one shouldn't affect the other, and that's why there are separate cal adjustments for each.

As we discussed before, I need to add a UI setting that lets you switch the spectrum scale between "dBm", reading the same value as the waterfall (hence taking the waterfall cal adjustment into account), and a "dBFS" scale where 0 dBFS is the maximum positive ADC value (unaffected by the waterfall cal value).
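A sketch of how that switch might look under the definitions above (the enum and parameter names are hypothetical):

```c
/* Spectrum scale switch: "dBm" tracks the waterfall (cal applied),
 * "dBFS" is referenced to the ADC full scale (cal ignored). Sketch only. */
enum spec_scale { SPEC_DBM, SPEC_DBFS };

double spectrum_value(double raw_dbfs, double dbfs_to_dbm_db,
                      double waterfall_cal_db, enum spec_scale s) {
    if (s == SPEC_DBFS)
        return raw_dbfs;    /* 0 dBFS = max positive ADC value */
    return raw_dbfs + dbfs_to_dbm_db + waterfall_cal_db;
}
```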

The ADC does provide an "overflow" signal. I already have Verilog firmware that brings this signal into an FPGA status register that can be read by the Beagle C code. This could be very helpful in detecting overload conditions. So the question now is how best to represent the signal in the UI. I'm thinking maybe a flashing red dot on the S-meter scale. The signal will have to have some persistence added since it is updated by the ADC for each sample (i.e. at 66.7 MHz). So we need some way to distinguish occasional overflows, caused by the "bad luck" of all the received signals summing at once, from a chronic overload caused by a few signals exceeding the ADC input range.
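One possible shape for that persistence logic, polled from the FPGA status register (the hold time, poll rate, and chronic threshold are all illustrative, not from the actual code):

```c
/* Stretch the per-sample ADC overflow flag so brief hits stay visible,
 * and separate occasional from chronic overload. Sketch only. */
#include <stdbool.h>

#define OV_HOLD_MS  250     /* keep the UI indicator lit this long */
#define OV_CHRONIC  0.10    /* >10% of polls overflowed => chronic */

static int ov_hold_left;    /* ms of hold time remaining */
static int ov_hits, ov_polls;

/* Call periodically (e.g. every 100 ms) with the latched OV bit. */
bool ov_update(bool ov_bit, int poll_ms) {
    ov_polls++;
    if (ov_bit) { ov_hits++; ov_hold_left = OV_HOLD_MS; }
    else if (ov_hold_left > 0) ov_hold_left -= poll_ms;
    return ov_hold_left > 0;        /* true => light the red dot */
}

bool ov_chronic(void) {
    return ov_polls > 0 && (double)ov_hits / ov_polls > OV_CHRONIC;
}
```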

jks-prv commented 8 years ago

Okay, the v1.27 release that just went out has a red "OV" indicator at the upper right of the S-meter that is wired to the ADC overflow pin in a simple way. OV gets lit going from -15 to -14 dBm on my external signal generator. -13 dBm is of course S9+60. In a few more hours it will be early evening in Europe and it will be interesting to see how the OV indicator reacts to those Kiwis that have obvious overload problems (I couldn't find any now at 0200Z).

Here are some screenshots:

[Screenshot: noise floor with sig gen connected but no output enabled.]

[Screenshot: sig gen at -15 dBm. Generator 20 MHz second harmonic at -50 dBm.]

[Screenshot: sig gen at -14 dBm. Spurious responses as a result of ADC overload.]

jks-prv commented 8 years ago

It's 0500Z now and I found that http://rz3dvp.ru:8073/ shows ADC overload because of a MW BCB signal on 612 kHz that is above S9+60 (the S-meter graph shows -10 dBm). Surprisingly, the rest of the spectrum looks okay.