rtlsdrblog / rtl-sdr-blog

Modified Osmocom drivers with enhancements for RTL-SDR Blog V3 and V4 units.
https://www.rtl-sdr.com
GNU General Public License v2.0

wideband noise artefact above 1GHz in current HEAD #51

Open f00b4r0 opened 3 months ago

f00b4r0 commented 3 months ago

Hi,

Having just received my V4 (bought from the official store), I installed the driver (following the V4 setup guide) and set up the dongle for ADS-B tracking. The checked-out HEAD is 065e3d1a116f24c78622fd21c6c0d6efec35060b; the installed version (built with the dpkg-buildpackage recipe) is 0.7.0~rtlsdrblog1.

I immediately noticed degraded performance and a higher noise floor reported by dump1090 compared to my previous NESDR Mini2+ (R820T2-based), so I ran an rtl_power sweep: rtl_power -f 800M:1200M:100k -i 20 -c 50% -e 10m -F 9. This showed very strange wideband noise starting at exactly 1 GHz:

rtlsdrv4

I then plugged my Mini2+ back in, using the same driver, and while the performance was slightly better, it was nowhere near what it used to be. Again, I ran a sweep:

nesdr

Same oddity. I then reinstalled the system-provided rtl-sdr driver (Debian bookworm 0.6.0-4) and ran another sweep on the Mini2+, lo and behold:

nesdr-060

The wideband noise is gone and performance is back to optimal.

So I went back, checked out the first commit that supports the V4 (1261fbb), built and installed the resulting driver, and ran a final sweep back on the V4:

rtlsdrv4-1261fbb

The noise is gone.

Thus it seems to me the current drivers are introducing a digital artefact in the radio data.

In all honesty I didn't feel like bisecting, as I'm doing these experiments on an underpowered armhf device; I hope this bug report is useful enough. Let me know if more info is needed.

f00b4r0 commented 3 months ago

Did a little more digging starting with e47685e as referenced in #46, in case this helps:

In other words, 987d1e4 is AFAICT the last good commit (at least for ADS-B use - that's what I'm using now). HTH

rtlsdrblog commented 3 months ago

What you're seeing is the intended effect of the updates. https://github.com/rtlsdrblog/rtl-sdr-blog/commit/2499c9b64d40b1b68f629a0cbe9f3b18409e079b turns on an extra 3dB gain block in the tuner chip which was off by default before, but is on by default in other driver branches like librtlsdr.

https://github.com/rtlsdrblog/rtl-sdr-blog/commit/065e3d1a116f24c78622fd21c6c0d6efec35060b increases the VGA gain when the frequency is above 1 GHz. So this will show up as what looks like a noise floor rise in wideband scans, but is actually the intended behavior.

The +3dB gain block activation was added because in tests it gives a significant boost to SNR. I'm surprised that you are seeing an increase in 'bad' messages on ADS-B though. Is it possible that your input is already saturating and the extra 3dB caused it to saturate more? If that is the case, you could improve your reception by turning down the main gain setting and ideally using a filter.

Similarly increasing the VGA gain at frequencies above 1 GHz also helps increase SNR a bit. The VGA gain might be set a bit too aggressively though, I may look at reducing that soon.
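The frequency-dependent part of this can be sketched in miniature (a hypothetical illustration with made-up numbers, not the actual driver code; the real changes are in the commits linked above):

```python
# Hypothetical sketch of the behavior described above: extra VGA gain
# is applied only when tuned at or above 1 GHz, so a wideband rtl_power
# sweep shows a step in the apparent noise floor at exactly that
# frequency. The 3.5 dB figure and base level are illustrative only.

ONE_GHZ = 1_000_000_000

def apparent_noise_floor_db(freq_hz: int, base_db: float = -30.0) -> float:
    """Apparent noise floor seen by a scan at a given tuned frequency."""
    extra_vga_db = 3.5 if freq_hz >= ONE_GHZ else 0.0
    return base_db + extra_vga_db

# Below 1 GHz the floor is unchanged; above it, the step appears:
assert apparent_noise_floor_db(999_000_000) == -30.0
assert apparent_noise_floor_db(1_090_000_000) == -26.5
```

This is why the reporter's sweeps show the rise starting at exactly 1 GHz rather than drifting in gradually.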

f00b4r0 commented 3 months ago

Thanks for the reply. Quick disclaimer: I know very little about SDR and barely more about radio transmission. Willing to be guided through useful A/B tests though. Please bear with me :)

What you're seeing is the intended effect of the updates.

I figured as much :) What's not obvious to me is whether these intended effects are desirable. My reference point is before/after with the following dump1090 metrics: max range, valid/bad messages rates and loud decoded/undecoded, as these are easy to observe.

2499c9b turns on an extra 3dB gain block in the tuner chip which was off by default before, but is on by default in other driver branches like librtlsdr.

Well, this +3dB gain change raises the reported noise by twice as much, which seems weird. It also increases the rate of bad messages and seemingly reduces max range.

065e3d1 increases the VGA gain when the frequency is above 1 GHz. So this will show up as what looks like a noise floor rise in wideband scans, but is actually the intended behavior.

This completely drowns remote signals into the noise floor though: when using dump1090-fa's adaptive gain control, the software eventually reduces the gain so much (because the noise floor is so high and doesn't reduce linearly with the gain setting) that nearly no messages are received anymore.

The +3dB gain block activation was added because in tests it gives a significant boost to SNR.

Well if a +3dB signal boost raises the noise by +6dB, haven't we lowered the SNR by 3dB?

I'm surprised that you are seeing an increase in 'bad' messages on ADS-B though. Is it possible that your input is already saturating and the extra 3dB caused it to saturate more? If that is the case, you could improve your reception by turning down the main gain setting and ideally using a filter.

Bit of context: the receiver is located in a so-called "dead zone" in the countryside, with virtually no radio coverage: almost no FM, barely any GSM (visible on the provided sweeps, which the antenna probably catches because it's high enough; phones at ground level get no signal), and no usable terrestrial TV. It is very quiet. The "loudest" radio pollution would be the 2.4GHz outdoor wifi access points, one of which is close to the antenna mount (but would they matter at that frequency?). Furthermore, the outdoor antenna is connected to the receiver via 10m of LMR195 cable, so there's attenuation there already. I know this is not ideal, but it worked before the above commits.

In usual operation (up to 987d1e4, and similarly with rtl-sdr 0.6.0-4 on the NESDR Mini2+), the gain is set to max (58.6), adaptive dynamic is off (it stays at 58.6 when on, so I save CPU cycles by not turning it on) and adaptive burst is enabled. Most of the day the gain stays at max, the reported noise is -30dB, the reported max signal is -10dB, the bad message rate is very stable and the max range (limited mainly by hilly terrain) is about 100NM. Occasionally a couple of low-flying planes will bring the number of loud decoded/undecoded high enough to kick in the adaptive burst, which typically falls back to the next gain index (49.6) for a couple of minutes and rarely (and not for long when it happens) to the next two (48 or 44.5) while they pass. Interestingly, the reported noise floor doesn't move (stays at -30dB) when the gain is reduced.

Similarly increasing the VGA gain at frequencies above 1 GHz also helps increase SNR a bit. The VGA gain might be set a bit too aggressively though, I may look at reducing that soon.

Given that it completely disrupts dump1090-fa's adaptive gain control and more generally kills range, I'd say something is not right there :)

I hope this helps, feel free to let me know what to test next.

rtlsdrblog commented 2 months ago

Well if a +3dB signal boost raises the noise by +6dB, haven't we lowered the SNR by 3dB?

Only the SNR matters. You can't just look at the noise floor. If it raised the noise by 6dB, but the signal by 9dB, then you have an SNR improvement of 3dB.
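As a worked example of this dB bookkeeping (a sketch with illustrative figures, loosely based on the -10 dB signal / -30 dB noise levels reported above, not measured data):

```python
# Worked dB bookkeeping for the two scenarios in this exchange.
# All level figures are illustrative, not measurements.

def snr_db(signal_db: float, noise_db: float) -> float:
    """SNR in dB is simply signal level minus noise level."""
    return signal_db - noise_db

baseline = snr_db(-10.0, -30.0)  # 20 dB SNR before the gain change

# Maintainer's scenario: the gain stage lifts the signal by 9 dB and
# the noise by 6 dB, so SNR improves by 3 dB despite the higher floor:
improved = snr_db(-10.0 + 9.0, -30.0 + 6.0)  # 23 dB

# Reporter's reading: signal up only 3 dB while noise is up 6 dB,
# which would instead cost 3 dB of SNR:
degraded = snr_db(-10.0 + 3.0, -30.0 + 6.0)  # 17 dB
```

The disagreement in the thread is essentially about which of the last two lines matches reality on this setup.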

Not sure how dump1090-fa's adaptive gain control works, but maybe it is assuming a fixed gain setting.

I've just reverted the new VGA gain steps change, but not the +3dB gain block. Can you pull and see if there are any improvements?

f00b4r0 commented 2 months ago

Well if a +3dB signal boost raises the noise by +6dB, haven't we lowered the SNR by 3dB?

Only the SNR matters. You can't just look at the noise floor. If it raised the noise by 6dB, but the signal by 9dB, then you have an SNR improvement of 3dB.

I understand that. It's not immediately obvious to me how a +3dB gain could result in +9dB signal and +6dB noise though. It also doesn't match what is reported by dump1090.

Not sure how dump1090-fa's adaptive gain control works, but maybe it is assuming a fixed gain setting.

It's brushed over here: https://github.com/flightaware/dump1090/blob/master/README.adaptive-gain.md If I understand this well, it targets a given SNR (30dB by default) and adjusts the gain until the target is met.
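If that reading of the README is right, the dynamic-range part of the loop might look roughly like this (a hypothetical simplification, not dump1090-fa's actual code; `GAIN_STEPS` is a subset of the R820T gain table):

```python
# Hypothetical simplification of the adaptive-dynamic-range idea from
# dump1090-fa's README.adaptive-gain.md; NOT the real implementation.
# The decoder wants ~30 dB of headroom between the noise floor and
# full scale (0 dBFS) and steps the gain down while that is missing.

TARGET_DYNAMIC_RANGE_DB = 30.0
GAIN_STEPS = [44.5, 48.0, 49.6, 58.6]  # subset of the R820T gain table

def step_gain(idx: int, noise_floor_dbfs: float) -> int:
    """Return a new gain index given the measured noise floor."""
    dynamic_range = 0.0 - noise_floor_dbfs  # headroom up to full scale
    if dynamic_range < TARGET_DYNAMIC_RANGE_DB and idx > 0:
        return idx - 1  # noise floor too high: back the gain off
    return idx

# A driver change that raises the noise floor (say -30 -> -24 dBFS)
# shrinks the measured headroom, so the loop keeps backing off gain:
idx = GAIN_STEPS.index(58.6)  # start at max gain, as in the reports
idx = step_gain(idx, -24.0)   # 24 dB headroom < 30 dB target
```

Under this model, a noise floor that does not drop linearly with the gain setting would make the loop walk the gain all the way down, which matches the behavior I described.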

I've just reverted the new VGA gain steps change, but not the +3dB gain block. Can you pull and see if there are any improvements?

I just did. To make things a little more foolproof I slightly modified my setup to have a reference point: I added an extra raspi, using my old NESDR Mini2+ dongle with bookworm rtlsdr 0.6.0-4 drivers. Both are set to maximum gain (58.6) with adaptive and burst GC disabled (so static gain setting). The connection from the antenna is:

LMR195 -> SMA 1F-to-2M tee
           +> V4 SMA Input
           +> SMA to MCX 5cm RG316 pigtail -> NESDR input

Theoretically the NESDR is at a disadvantage due to insertion losses, but that's not the point: the idea is to be able to evaluate how driver changes affect the performance ratio (so that changes in the external environment, like the number of visible planes, don't skew the results). I hope this makes sense; I tried to be logical about this. (BTW I've had to wrestle a bit with the bias tee on the V4, which seems to turn on by default when the drivers are reloaded, despite not being enabled anywhere in software.)

I've been monitoring data from dump1090 "local" measurements on both systems. Below is a graph showing the before/after for noise and signal levels; solid lines are the V4, dashed are the NESDR, with the marker at the time of the restart switching the V4 from 987d1e4 to 240bd0e:

local

So it seems we've improved the SNR by 1.5dB, if I'm not mistaken, but since the signal decreases I suspect we have lowered the sensitivity by as much too, right? (I don't understand why both values decrease in absolute terms despite adding more gain, but as I said, I have no SDR-fu.)

In parallel I was keeping an eye on the dump1090 web interface on both devices, and noticed something I don't know how to interpret: on average, the V4 gets a higher message rate but fewer aircraft (less range), and for the aircraft common to both devices, the RSSI is always lower on the V4 (by about 5dB, give or take). It got worse after the switch to HEAD (it was about -2dB before that). These values also don't align with the numbers reported by dump1090.

Here's an example:

Screenshot 2024-06-17 at 17:10:40

(left is NESDR, right is V4)

HTH, let me know if you'd like other metrics.

f00b4r0 commented 2 months ago

In case this adds some useful input, I did the following experiment: I installed current HEAD on a different receiver which uses a "bog-standard" Chinese R820T2 dongle and was previously on the Debian bookworm drivers (0.6.0-4). This resulted in a drop of both the reported noise and signal, and a drop in reported Mode S messages (as well as a related drop in CPU time spent reading data from the receiver). I naively interpret this as an overall drop in sensitivity, with fewer messages being "audible" and read.

local

(switch to HEAD is the rightmost blue marker - other data masked to improve graph legibility - gain is unchanged at max)

cpu

rtlsdrblog commented 2 months ago

If you instead used a fixed gain setting in dump1090-fa, does the driver change still result in a difference?

f00b4r0 commented 2 months ago

If you instead used a fixed gain setting in dump1090-fa, does the driver change still result in a difference?

Sorry for not making this clear: this is a fixed gain setting (58.6, adaptive is disabled).

hdtvspace commented 2 months ago

Bit of context: the receiver is located in a so-called "dead zone" countryside: there is virtually no radio coverage there: almost no FM, barely any GSM (as visible on the provided sweeps, which the antenna probably catches because it's high enough - phones on ground level don't get signal), no useable terrestrial TV: it is very quiet. The "loudest" radio pollution would be the 2.4GHz outdoor wifi access points, one of which is close to the antenna mount (but would they matter at that frequency?). Furthermore, the outdoor antenna is connected to the receiver via 10m LMR195 cable, so there's attenuation there already. I know this is not ideal, but it worked before the above commits.

I’m sorry, but your setup for ADS-B is really not good. You should think about changing it: put the receiver in an outdoor box with PoE and short coax cables, or use USB extensions, or put a low-noise amp / preamp at the antenna with LMR400 low-loss cable. LMR195 was not developed for ADS-B.

With a thin 10 m LMR195 run you have too much signal loss: approx. 4 dB including the connectors!

You can find more information on an ideal ADS-B setup here: https://www.tuxrunner.de/2022/06/29/ads-b-luftraumueberwachung-mit-raspberry-pi/

f00b4r0 commented 2 months ago

I’m sorry, but your setup for ADS-B is really not good. [...]

I'm sorry, but does anything you said relate to the software bug described here?

hdtvspace commented 2 months ago

If you instead used a fixed gain setting in dump1090-fa, does the driver change still result in a difference?

Sorry for not making this clear: this is a fixed gain setting (58.6, adaptive is disabled).

This is also the wrong gain for ADS-B. Never use more than the 49.6 gain setting. The RTL stick oversaturates and overloads with ADS-B. Gain settings like AUTO, -60, -58, … were developed for DVB-T, not for ADS-B with aircraft flying around at changing distances.

With your too-long coax you need an LNA near the antenna, like a uputronics preamp or a wideband RTL-SDR LNA with a low-loss cavity filter. Then you get a better signal, more messages and more positions.

The V4 needs a higher gain than the V3, but not over 49.6. If you are near an airport and see many strong messages with RSSI always at -1.*, you should lower the gain: it blinds your RTL and you get bad reception.

I do not see a bug here with my V3 and V4 sticks at my AIS marine and ADS-B antennas.

f00b4r0 commented 2 months ago

Thank you for this wonderful off-topic lecture. Can we focus on the actual bug now?

hdtvspace commented 2 months ago

Thank you for this wonderful off-topic lecture. Can we focus on the actual bug now?

De nada. Which bug? Maybe there is a setup error in your system. Water in the connector / antenna, … many possibilities.

And sorry, the last tip for today: do not use thin RG316 pigtails to an RTL-SDR for ADS-B if you want optimal performance. These thin cables are meant for frequencies under 1000 MHz and are not usable for ADS-B. And you know it: high loss without an LNA.

f00b4r0 commented 2 months ago

nice to see trolls here.

hdtvspace commented 2 months ago

nice to see trolls here.

plonk!

hugovincent commented 4 weeks ago

Do not use thin RG316 pigtails to an RTL-SDR for ADS-B if you want optimal performance. These thin cables are meant for frequencies under 1000 MHz and are not usable for ADS-B.

That's not accurate – pigtails are short, and coax loss doesn't suddenly fall off a cliff at 1000 MHz; it gets gradually worse as frequency increases. You can check for yourself using any of the various coax loss calculators on the web: 5cm of RG316 has a loss of about 0.04dB at 1090MHz, and the SMA connectors each have an insertion loss of about 0.06dB at that frequency (assuming the impedance is well matched at each end). So the pigtail will cost well under 1dB, probably not measurable in practice without a lab-grade VNA. (However, if you meant don't use long runs of RG316, e.g. from a roof antenna into the house, then yes, I would agree.)
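For reference, the arithmetic behind these figures is just attenuation per metre times length; the dB/m ratings in this sketch are typical datasheet values, i.e. assumptions rather than measurements:

```python
# Rough check of the loss figures discussed above. The dB-per-metre
# ratings are typical datasheet values (assumptions, not measurements):
# RG316 is rated around 0.7 dB/m near 1 GHz; LMR195 around 0.32 dB/m.

def coax_loss_db(db_per_metre: float, length_m: float) -> float:
    """Coax attenuation scales linearly with length at a fixed frequency."""
    return db_per_metre * length_m

pigtail = coax_loss_db(0.7, 0.05)    # 5 cm RG316: ~0.035 dB
connectors = 2 * 0.06                # two SMA joints at ~0.06 dB each
feedline = coax_loss_db(0.32, 10.0)  # 10 m LMR195: ~3.2 dB

# The pigtail path totals well under 1 dB; the 10 m feedline dominates.
```

This is consistent with both the ~0.04dB pigtail figure above and the roughly 4dB quoted earlier in the thread for the 10m feedline plus connectors.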

FWIW I see what seem to be the same performance differences with a V4 for 065e3d1 and 2499c9b as the reporter (disclaimer, haven't attempted well controlled experiments); I see the maintainer has subsequently reverted 2499c9b with 83e7154.

by commented 4 weeks ago

I finally gave up here and moved over to https://osmocom.org/projects/rtl-sdr/wiki

mgomersbach commented 6 days ago

plonk

What does this even mean? Can we just be civil instead of this reddit bs?