f4exb / sdrangel

SDR Rx/Tx software for Airspy, Airspy HF+, BladeRF, HackRF, LimeSDR, PlutoSDR, RTL-SDR, SDRplay and FunCube
GNU General Public License v3.0

Spectrum calibration #1120

Closed · f4exb closed this 2 years ago

f4exb commented 2 years ago

Is your feature request related to a problem? Please describe.
One would like to see units in dBm on the spectrum display, which is not the case today.

Describe the solution you'd like
As previously described in the comments of #768, and to keep it relatively simple:

Describe alternatives you've considered
N/A; something new is needed.

Additional context
New button with ruler icon: [screenshot]
Calibration points dialog: [screenshot]

srcejon commented 2 years ago

Obviously, the first implementation doesn't need to do everything, but one point that comes to mind is how to link this up with device settings other than frequency, such as gain.

Some SDRs have reasonably linear gain settings, so this could perhaps just be handled with a getGain() function in the DeviceSampleSource API, that is then used to scale the value. But perhaps the calibration table would need to be 2D.
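
A minimal sketch of what I mean, purely illustrative (the method name, its placement on DeviceSampleSource and the NaN default are my invention, not existing SDRangel API):

```cpp
#include <limits>

// Illustrative sketch only: a hypothetical virtual getter added to the
// DeviceSampleSource interface. Each device plugin would override it to
// report the total configured gain of its receive chain.
class DeviceSampleSource
{
public:
    virtual ~DeviceSampleSource() = default;

    // ... existing virtual methods (center frequency, sample rate, ...) ...

    // Total gain in dB, or NaN if the device cannot report it.
    virtual double getGain() const
    {
        return std::numeric_limits<double>::quiet_NaN();
    }
};
```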

f4exb commented 2 years ago

There is no connection between the spectrum component and the device settings and there should not be.

Firstly, the spectrum component can be used in many places (devices, channels, features), so it should be autonomous with respect to the device. It does, however, know its center frequency and its sample rate or span.

Secondly, the gain cannot be made a generic setting through virtual getter and setter methods. There are in fact many gains, implemented or not depending on the device: LNA, mixer, VGA... so in the end this would amount to rewriting something like SoapySDR. Conversely, the center frequency is generic enough to deserve a virtual getter and setter. The only other parameters that are generic enough are the sample rate and decimation/interpolation. I found them missing when implementing control of the remote device from the remote input component through the remote sink of the remote instance; however, I gave that up since setting the sample rate from the remote input is not essential.

Again, there is some confusion between calibration and equalization, and what you are asking for is equalization. Calibration just makes it possible to show levels as real power levels instead of relative ones. It is an artifact of the spectrum component and nothing more. To do this, a correspondence between a relative level and an absolute level has to be established. It could be a single value, ignoring the frequency dimension; however, in practice one would calibrate with a reference generator at several spots over a frequency range much wider than the spectrum span. It is therefore interesting to add the frequency dimension to the calibration, and the center frequency is directly available in the spectrum component.

Correcting the gain vs frequency is essentially equalization, and this has to be implemented differently. It is not a display artifact: we want to change the amplitude of the I/Q signal for anything that sits behind it. This can be implemented as an equalizer channel plugin that can be internally connected to a local input device, like the local sink / local input plugin pair but with gain and without decimation. Initially I thought of the "equalizer" plugin as based on equalizing the noise floor, assuming a dummy load is connected at the input (thus only taking into account the internals of the system). This equalization is made over the spectrum span. However, a constant gain component could be added and, similarly to the calibration points, the "gain" points could cover a much wider range than the spectrum.
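
To sketch the constant gain component (illustrative names only, not a proposed implementation): a power correction in dB translates to a voltage ratio applied to the I/Q samples.

```cpp
#include <cmath>
#include <complex>
#include <vector>

// Illustrative sketch: apply a constant gain correction (in dB) to a block of
// I/Q samples. Since the samples are amplitudes, a power correction of gainDb
// corresponds to a voltage ratio of 10^(gainDb / 20).
void applyEqualizerGain(std::vector<std::complex<float>>& iq, double gainDb)
{
    const float g = static_cast<float>(std::pow(10.0, gainDb / 20.0));

    for (auto& sample : iq) {
        sample *= g;
    }
}
```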

Here I am referring to your comment in #768: "For radio astronomy (and other small signal measurements), it might be nice to be able to support calibration to flatten out the non-flat frequency response of some RF front-ends." In fact this is equalization, not calibration, because what you want is to insert a filter with variable gain vs frequency, i.e. an equalizer.

In fact the "equalizer" could have a constant gain derived from the wideband gain adjustment points and a local gain correction based on the response with a dummy load connected. One or the other could be used, or both at the same time.

srcejon commented 2 years ago

Let me try to clarify: the comment above wasn't referring to that prior comment (that's implemented now in the Radio Astronomy plugin).

All I meant here was to support adjusting the reported power value by the Device gain setting. E.g. you calibrate as you have mentioned and get a power figure of X dBm. If you adjust the Device gain setting, the reported figure will change. But what you typically want as a user is the input signal power, independent of this gain. The Spectrum could query the device to determine what that gain is, so it can account for it.
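
In other words, assuming the gain stages are linear, the correction is just the change in device gain since calibration:

$$ P_{\mathrm{in}} = P_{\mathrm{displayed}} - (G_{\mathrm{now}} - G_{\mathrm{cal}}) $$

with all quantities in dB, where $G_{\mathrm{cal}}$ is the total device gain at the time the calibration was made.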

Yes, setting gain for all the different SDRs isn't simple, for the reasons you say, but all that's needed here is a getter to allow the device to report the total gain.

The connection between the Spectrum and Device would only be made where appropriate.

f4exb commented 2 years ago

> The connection between the Spectrum and Device would only be made where appropriate.

This is bad design. The spectrum display component should not have this sort of backward connectivity to the originator of its input stream. It is a terminal component that cannot assume anything about the originator of the stream. It does have forward connectivity to set its center frequency and span, and it could also get a scaling factor. However, I am not sure this is desirable: it is expected that when you change the gain, the spectrum moves accordingly.

> but all that's needed here is a getter,

Well... to get what? How do you figure out the overall gain? Is it the sum (dB) or product (linear) of all gains? And if you have a getter, then eventually you will have the requirement for a setter. How do you manage the gain setting then? I think this is not practical anyway.
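
(Of course the two come to the same thing: gains multiply in linear units, which is the same as adding them in dB, since $G^{\mathrm{dB}} = 10 \log_{10} g$ and

$$ G_{\mathrm{total}}^{\mathrm{dB}} = \sum_i G_i^{\mathrm{dB}} \;\Longleftrightarrow\; g_{\mathrm{total}} = \prod_i g_i $$

The real question is whether every stage's gain is known at all.)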

The requirement to see dBm instead of dB is for instrumentation and also for experts. When you do instrumentation, you know you have to take into account the gain of your input chain, and that the reading will change whenever you change that input chain gain from the value it was set to when doing the calibration step.

f4exb commented 2 years ago

> I think this is not practical anyway.

Let alone the AGC case ...

srcejon commented 2 years ago

> The spectrum display component should not have this sort of backward connectivity to the originator of its input stream. It is a terminal component that cannot assume anything about the originator of the stream. It does have forward connectivity to set its center frequency and span, and it could also get a scaling factor.

Ok, that makes sense - I hadn't looked at the code in detail.

> It is expected that when you change the gain, the spectrum moves accordingly.

I mentioned this because every spectrum analyzer I've used automatically adjusts the measured power to account for the gain settings used. There's also usually a field where you can enter the gain of an external amplifier.

> Well... to get what? How do you figure out the overall gain? Is it the sum (dB) or product (linear) of all gains?

Well, there's an even more basic question. While Lime, USRP + SDRplay provide some typical gain measurements in their datasheets, for some other SDRs it seems the gain is a bit of an unknown. I think this is something SDRangel can help users better understand: I was planning to write another plugin, like the Noise Figure plugin, that would allow automated measurement of this. It may also be able to give us an idea of the compression point, which could be useful to display.

> And if you have a getter, then eventually you will have the requirement for a setter. How do you manage the gain setting then?

I'm not proposing any changes to how the gain is set. But if you were to do it, off the top of my head, I'd say max gain should be applied to the first stage, as the Friis formula shows that will result in the highest SNR. Certainly you wouldn't want to remove the option of setting it manually.
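
For reference, Friis' formula for the cascaded noise factor is

$$ F_{\mathrm{total}} = F_1 + \frac{F_2 - 1}{G_1} + \frac{F_3 - 1}{G_1 G_2} + \cdots $$

(noise factors $F_i$ and gains $G_i$ in linear units), so a large first-stage gain $G_1$ suppresses the noise contribution of every later stage.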

> The requirement to see dBm instead of dB is for instrumentation and also for experts. When you do instrumentation, you know you have to take into account the gain of your input chain, and that the reading will change whenever you change that input chain gain from the value it was set to when doing the calibration step.

For me, s/w is about making things easier though. If you can work it out by hand, it should be possible to automate it in s/w, making it easier in the future, and everyone can benefit from it.

Of course, we're doing this for fun; I don't expect you to agree with me on everything! I'm just throwing ideas out there.

> Let alone the AGC case ...

The AGC case is perhaps an example of why the Spectrum should have some additional knowledge about the device settings. When AGC is enabled, it probably shouldn't try to display a figure in dBm, as it will be unreliable. So the scale factor passed to it could indicate that.
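
For example (a sketch with invented names, not an actual SDRangel structure), the scale information pushed forward to the Spectrum could carry a validity flag:

```cpp
// Illustrative sketch only: scale information forwarded to the spectrum.
struct SpectrumScale {
    double offsetDb; // correction to apply to displayed levels, in dB
    bool absolute;   // false when e.g. AGC makes a calibrated reading unreliable
};
```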

f4exb commented 2 years ago

> Well, there's an even more basic question. While Lime, USRP + SDRplay provide some typical gain measurements in their datasheets, for some other SDRs it seems the gain is a bit of an unknown.

Yes, certainly. And there are other factors than just the gain. It is quite possible that the sample rate has an influence for some (or all?) devices. This makes it rather futile to rely on any parameter from the device settings to derive a gain correction. Therefore the simplest approach (already not so simple) is to store a calibration chart for a given configuration of the device. Note that you could also have an external LNA that cannot be automatically accounted for. The calibration chart is saved in the spectrum settings, which are persisted in the preset, so by loading a preset you load the device settings and the calibration chart together.

Frequency is different: we all know that gain depends on frequency, and frequency is both unambiguous and readily available. Also, you may want to change frequency over a wide range without re-running a calibration cycle. Linear interpolation is assumed between calibration points, which may be inaccurate; anyway, if more precision is required, more points can be entered.
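
A sketch of that interpolation (illustrative names, not the actual implementation): the shift applied to displayed levels is interpolated between the two calibration points bracketing the current center frequency, and clamped outside the calibrated range.

```cpp
#include <algorithm>
#include <iterator>
#include <vector>

// Illustrative sketch: a calibration point holds the correction (calibrated
// minus relative level, in dB) measured with a reference generator at a
// given frequency.
struct CalibrationPoint {
    double frequencyHz;
    double shiftDb;
};

// Linearly interpolate the calibration shift at the spectrum's center
// frequency. The table is assumed sorted by frequency, without duplicates.
double calibrationShift(const std::vector<CalibrationPoint>& points, double frequencyHz)
{
    if (points.empty()) {
        return 0.0;
    }
    if (frequencyHz <= points.front().frequencyHz) {
        return points.front().shiftDb;
    }
    if (frequencyHz >= points.back().frequencyHz) {
        return points.back().shiftDb;
    }

    // First point at or above the requested frequency; its predecessor is below it.
    auto hi = std::lower_bound(points.begin(), points.end(), frequencyHz,
        [](const CalibrationPoint& p, double f) { return p.frequencyHz < f; });
    auto lo = std::prev(hi);

    const double t = (frequencyHz - lo->frequencyHz) / (hi->frequencyHz - lo->frequencyHz);
    return lo->shiftDb + t * (hi->shiftDb - lo->shiftDb);
}
```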

> For me, s/w is about making things easier though. If you can work it out by hand, it should be possible to automate it in s/w, making it easier in the future,

For something like automatic calibration using VISA or similar, I would implement this as a feature plugin.

f4exb commented 2 years ago

There is a slight change concerning units. The point is calibration, and one cannot assume what the calibration is made against. Most of the time it would be derived from a measurement in dBm or mW, but this is not strictly necessary: one may assume W or dBW instead (particularly in the case of transmitters), and in fact it could be anything the user assumes. Therefore the calibrated value will stay in unitless dB, and we'll be talking about a "calibrated" value, not an "absolute" value.

f4exb commented 2 years ago

Deployed in v6.19.0