antonioalbanesedev opened 1 year ago
I believe the tx gain setting on the bladeRF 2.0 cards doesn't do anything; you'll have to turn down the digital samples you're sending to the card. If you're going through libbladeRF on the bladeRF 2 cards, though, the tx_gain setting is effective.
Hi Paul, I just tried modifying this line in ofdm-transfer, which changes the amplitude of the samples. Setting it to a lower value brings down the whole spectrum but doesn't change its shape.
I am using a spectrum analyzer with a higher noise floor, so eventually I stop seeing the skirts for values < 0.15, but they come back as soon as I increase it. Note that by the time the skirts become visible again, my marker power has already dropped by 20 dB compared to the initial test. Still bad.
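For what it's worth, this behavior is consistent with a purely linear scale factor: here is a small numpy sketch (synthetic QPSK/IFFT samples, not the actual ofdm-transfer code) showing that scaling the IQ samples shifts every PSD bin down by the same 20·log10(scale) dB, leaving the spectral shape unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic OFDM-like symbol: QPSK on 1024 subcarriers, then an IFFT.
syms = (rng.choice([-1.0, 1.0], 1024) + 1j * rng.choice([-1.0, 1.0], 1024)) / np.sqrt(2)
x = np.fft.ifft(syms)

scale = 0.15  # the amplitude value mentioned above
psd_full = 20 * np.log10(np.abs(np.fft.fft(x)))
psd_low = 20 * np.log10(np.abs(np.fft.fft(scale * x)))

# A linear scale factor moves every bin down by -20*log10(0.15) ~ 16.5 dB,
# uniformly across the band, so the shape (including the skirts) survives.
offset = psd_full - psd_low
print(round(offset.mean(), 2), round(offset.std(), 6))
```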
I read that OFDM has a high peak-to-average power ratio, so is it possible that you aren't attenuating quite enough? Can you look at your input samples and see if they're saturating, or look at your RF output on a fast scope and see if the waveform is clipping?
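As a quick numerical check of this suggestion, a hedged numpy sketch that builds one synthetic OFDM symbol (a rough 20 MHz LTE-like grid: 1200 QPSK subcarriers in a 2048-point FFT; the layout and normalization are illustrative, not what srsRAN does) and computes its PAPR plus a simple saturation check:

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative OFDM symbol: 1200 occupied subcarriers in a 2048-point FFT.
n_fft, n_used = 2048, 1200
grid = np.zeros(n_fft, dtype=complex)
data = (rng.choice([-1.0, 1.0], n_used) + 1j * rng.choice([-1.0, 1.0], n_used)) / np.sqrt(2)
grid[1:n_used // 2 + 1] = data[:n_used // 2]   # positive-frequency half
grid[-n_used // 2:] = data[n_used // 2:]       # negative-frequency half
x = np.fft.ifft(grid) * np.sqrt(n_fft)

# Peak-to-average power ratio of the time-domain symbol.
papr_db = 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))
print(f"PAPR: {papr_db:.1f} dB")

# If the DAC full scale is 1.0, any sample above it would clip.
clipped = np.abs(x) > 1.0
print("samples over full scale:", clipped.sum())
```

With this normalization the average power sits at n_used/n_fft, so even a "reasonable looking" average level can still push the peaks past full scale.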
Sure, I will do this test later this week. In the meantime, I did a quick comparison between a bladeRF 2.0 xA4 and a USRP B210; they share the same radio front-end.
Here is the result. I can repeat these measurements with the better SA, but you get the idea.
bladeRF:
USRP:
I see no spikes in the spectrum produced by the USRP with the same configuration (I even had to raise the gain by 20 points on the USRP to get a comparable spectrum).
What are you using for your reference oscillator? I've noticed a difference when using a 10 MHz reference from an OCXO.
I made a mistake earlier about tx gain. If you are using libbladeRF, then the tx gain you set there is important: it seems to directly multiply the input samples. If your input samples have an amplitude of 1.0, then a gain of 60 dB will result in an RF output of about 0 dBm.
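Taking that anchor point at face value (amplitude 1.0 at 60 dB gain ≈ 0 dBm) and assuming everything scales linearly, a back-of-the-envelope estimate looks like this (`est_output_dbm` is a hypothetical helper, not a libbladeRF function, and real gain stages are not perfectly linear):

```python
import math

def est_output_dbm(sample_amplitude, tx_gain_db, ref_gain_db=60.0, ref_dbm=0.0):
    """Rough output-power estimate from the anchor point above:
    amplitude 1.0 at 60 dB tx gain ~ 0 dBm, assuming linear scaling."""
    return ref_dbm + (tx_gain_db - ref_gain_db) + 20 * math.log10(sample_amplitude)

print(est_output_dbm(1.0, 60))   # the anchor point itself: 0.0 dBm
print(est_output_dbm(0.15, 60))  # amplitude backed off to 0.15: about -16.5 dBm
```

This also shows why backing the samples off to 0.15 lines up with the roughly 20 dB marker drop reported earlier in the thread.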
I use libbladeRF for my tests, but I have tried SoapySDR too with the same results. I will try to get my hands on an OCXO and test again to see if that makes any difference for me.
Disclaimer: I don't know the bladeRF 2.0 at all, but this looks like aliasing in the AD9361 upconverter chain. Did you make sure the filter chain configured in the AD9361 matches the requirements of this waveform? ADI provides a MATLAB-based filter design tool for exactly this purpose, and the factory examples happen to be filters for the standard LTE channel bandwidths.
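As a toy illustration of that aliasing mechanism (pure numpy, no AD9361 specifics): zero-stuffing a tone by 4 without an adequate interpolation low-pass leaves images spaced by the original sample rate, which is exactly the kind of spur that lands in an adjacent channel:

```python
import numpy as np

n, up = 1024, 4
m = np.arange(n)
f0_bin = 64                          # tone at bin 64 of the low-rate grid
x = np.exp(2j * np.pi * f0_bin * m / n)

# Interpolation by zero-stuffing: without a proper low-pass afterwards,
# the tone is replicated at image frequencies spaced by the old rate.
y = np.zeros(n * up, dtype=complex)
y[::up] = x

spec = np.abs(np.fft.fft(y)) / n
# Wanted tone at bin f0_bin, first image at bin f0_bin + n:
# both come out at full amplitude, i.e. the image is as strong as the tone.
print(round(spec[f0_bin], 3), round(spec[f0_bin + n], 3))
```

A correctly configured FIR/half-band chain in the interpolator is what suppresses those images before the DAC.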
Hi!
I am experimenting with OFDM on a bladeRF 2.0 xA5 and a 2.0 xA4.
This is a 20 MHz LTE signal at 770 MHz with maximum tx gain.
As you can see, the skirts are pretty bad. The situation does not improve when changing the tx gain.
Here is an example with a 10 MHz LTE signal at the same carrier frequency.
I have repeated the measurements across the whole cellular spectrum from 700 MHz up to 2.65 GHz. The out-of-band emissions are very similar, with two strong spikes in the first adjacent channel.
This would prevent using the radio in scenarios with more stringent electromagnetic compatibility requirements.
Is this behavior due to a hardware limitation? If not, can it be solved with a new FPGA image?
Thank you!
You may reproduce this issue with either srsRAN or ofdm-transfer.
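If it helps to put a number on the skirts, adjacent-channel leakage can be estimated directly from captured IQ samples. This numpy sketch is illustrative only (synthetic band-limited noise, hard-clipped to mimic saturation-induced spectral regrowth; `aclr_db` and all parameters are made up for the example, and it is not a calibrated measurement):

```python
import numpy as np

def aclr_db(iq, fs_hz, bw_hz):
    """Rough leakage estimate: power in the first upper adjacent channel
    relative to the wanted channel, from baseband IQ at sample rate fs_hz."""
    n = len(iq)
    psd = np.abs(np.fft.fft(iq)) ** 2
    f = np.fft.fftfreq(n, d=1.0 / fs_hz)
    inband = psd[np.abs(f) <= bw_hz / 2].sum()
    adjacent = psd[(f > bw_hz / 2) & (f <= 3 * bw_hz / 2)].sum()
    return 10 * np.log10(adjacent / inband)

# Synthetic band-limited signal at fs = 4 * bw, then hard-clipped to
# mimic the regrowth that DAC/amplifier saturation would cause.
rng = np.random.default_rng(3)
n, fs, bw = 8192, 40e6, 10e6
occ = int(n * bw / fs / 2)                     # occupied bins per side of DC
grid = np.zeros(n, dtype=complex)
grid[:occ] = rng.standard_normal(occ) + 1j * rng.standard_normal(occ)
grid[-occ:] = rng.standard_normal(occ) + 1j * rng.standard_normal(occ)
x = np.fft.ifft(grid)

limit = 0.7 * np.abs(x).mean()                 # aggressive clip level
scale = np.minimum(1.0, limit / np.maximum(np.abs(x), 1e-30))
clipped = x * scale                            # hard envelope clipping

print(f"ACLR after clipping: {aclr_db(clipped, fs, bw):.1f} dB")
```

Running the same estimate on IQ captured at the bladeRF output (or on the samples fed into it) would separate digital clipping from an analog/filter-chain problem.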