Closed lhms1234 closed 9 months ago
Hey @lhms1234 - could you verify the RU is actually emitting a signal? If so, does the spectrum look fine?
Hi @andrepuschmann. Sure. I checked yesterday; it isn't emitting any signal.
Looking at the config again, I suggest you set only one UL antenna and 4 DL antennas, i.e.
nof_antennas_dl: 4
nof_antennas_ul: 1
Also, maybe try to play with the iq_scaling param and set it to 0.6. Those are just guesses; I have to assume that your cabling, etc., is correct and that the config values match the RU/DU. Please double-check this as well. Can you provide a PCAP of the OFH traffic?
Has this HW setup been tested before with another DU and is confirmed to be working fine?
About the antenna configuration: I configured it this way because of two fields in the RRHconfig_xran.xml file (the RU's configuration file), "RRH_EN_EAXC_ID" and "RRH_TRX_EN_BIT_MASK". They are correlated as follows:
<!-- RRH_EN_EAXC_ID: Enable using eAxC ID field defined in O-RAN spec. -->
<!-- When 0 is set, RU port ID=0,1,2,3 are used for PDSCH/PUSCH if RRH_TRX_EN_BIT_MASK = 0x0F -->
<!-- When 0 is set, RU port ID=4,5,6,7 are used for PRACH if RRH_TRX_EN_BIT_MASK = 0x0F -->
<!-- When 0 is set, RU port ID=0,1 are used for PDSCH/PUSCH if RRH_TRX_EN_BIT_MASK = 0x03 -->
<!-- When 0 is set, RU port ID=2,3 are used for PRACH if RRH_TRX_EN_BIT_MASK = 0x03 -->
RRH_EN_EAXC_ID = 0
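The port-mapping rule quoted in those comments can be sketched as follows. This is purely illustrative Python (the function name and return shape are my own, not RU firmware code), assuming the pattern generalizes as: the first N port IDs carry PDSCH/PUSCH and the next N carry PRACH, where N is the number of bits set in RRH_TRX_EN_BIT_MASK:

```python
def eaxc_ports(trx_en_bit_mask: int) -> dict:
    """Derive RU port IDs for PDSCH/PUSCH and PRACH from the TRX enable
    bitmask, following the RRH_EN_EAXC_ID = 0 convention described in the
    RRHconfig_xran.xml comments. Illustrative only, not RU firmware code."""
    n = bin(trx_en_bit_mask).count("1")  # number of enabled TRX chains
    return {
        "pdsch_pusch": list(range(0, n)),  # e.g. 0..3 for mask 0x0F
        "prach": list(range(n, 2 * n)),    # e.g. 4..7 for mask 0x0F
    }

print(eaxc_ports(0x0F))  # {'pdsch_pusch': [0, 1, 2, 3], 'prach': [4, 5, 6, 7]}
print(eaxc_ports(0x03))  # {'pdsch_pusch': [0, 1], 'prach': [2, 3]}
```

These derived lists match the prach_port_id / dl_port_id / ul_port_id entries used in the srsRAN config later in this thread.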
Thanks for the tip, here is the pcap. Only the Core and the PTP Grandmaster were successfully tested before with another RAN stack. The machine that hosts the srsRAN CU/DU wasn't tested this way before (with a PTP Grandmaster and RU). After changing iq_scaling many times, nothing changed. I've noticed in Wireshark that all the U-Plane packets are malformed.
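"Malformed" U-Plane packets in Wireshark often come down to the eCPRI payload-size field disagreeing with the actual frame length (the ignore_ecpri_payload_size option in the srsRAN config exists for exactly this mismatch). A minimal sketch of decoding the 4-byte eCPRI common header, to show which fields Wireshark is checking (the sample bytes are hypothetical, not taken from this pcap):

```python
import struct

def parse_ecpri_header(frame: bytes):
    """Parse the 4-byte eCPRI common header: 4-bit protocol revision plus
    flags in byte 0, 1-byte message type (0 = IQ data / U-Plane), and a
    2-byte big-endian payload size. If the payload size does not match the
    bytes actually present, dissectors flag the packet as malformed."""
    rev_c, msg_type, payload_size = struct.unpack("!BBH", frame[:4])
    return rev_c >> 4, msg_type, payload_size

# Hypothetical header: revision 1, message type 0 (IQ data), size 0x0100
rev, mtype, size = parse_ecpri_header(bytes([0x10, 0x00, 0x01, 0x00]))
print(rev, mtype, size)  # 1 0 256
```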
Hi @andrepuschmann. After some tests, it was discovered that the network card used in the machine described above apparently doesn't support PTP. But the problem occurs before that point. Testing srsRAN in another setup (with another RU of the same model) that was fully tested and works well with another RAN stack, the same error occurred: no signal was emitted. When the other stack runs without PTP, at least the SIBs are detected. Could you provide the RU configuration file (RRHconfig_xran.xml) used in your integration tests with the rpqn4800e RU, please?
Hi @lhms1234, I think you should use different window parameters and iq_scaling. Can you try this config?
ru_ofh:
  ru_bandwidth_MHz: 100
  t1a_max_cp_dl: 470
  t1a_min_cp_dl: 258
  t1a_max_cp_ul: 429
  t1a_min_cp_ul: 285
  t1a_max_up: 196
  t1a_min_up: 50
  is_prach_cp_enabled: true
  is_dl_broadcast_enabled: false
  ignore_ecpri_payload_size: true
  compr_method_ul: bfp
  compr_bitwidth_ul: 9
  compr_method_dl: bfp
  compr_bitwidth_dl: 9
  compr_method_prach: bfp
  compr_bitwidth_prach: 9
  enable_ul_static_compr_hdr: false
  enable_dl_static_compr_hdr: false
  iq_scaling: 1.6
  enable_dl_parallelization: true
  cells:
    - network_interface: eno8603np3
      ru_mac_addr: 6c:ad:ad:00:08:c4
      du_mac_addr: b4:45:06:ec:7f:b2
      vlan_tag: 2
      prach_port_id: [4, 5, 6, 7]
      dl_port_id: [0, 1, 2, 3]
      ul_port_id: [0, 1, 2, 3]
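As an aside, the bfp (block floating point) settings in the config determine how much fronthaul bandwidth each PRB of IQ samples consumes. A rough back-of-the-envelope sketch (illustrative Python, not srsRAN code), assuming the usual O-RAN BFP layout of one shared 1-byte exponent per PRB:

```python
def bfp_prb_bytes(bitwidth: int) -> int:
    """Bytes per PRB of IQ data with block floating point compression:
    12 subcarriers x 2 (I and Q) x bitwidth bits, plus one shared
    1-byte exponent per PRB (assumed layout)."""
    return (12 * 2 * bitwidth) // 8 + 1

uncompressed = 12 * 2 * 16 // 8                  # 48 bytes/PRB at 16-bit I/Q
print(bfp_prb_bytes(9))                          # 28 bytes/PRB with 9-bit BFP
print(f"{bfp_prb_bytes(9) / uncompressed:.2%}")  # 58.33% of uncompressed
```

So the 9-bit BFP used above roughly halves the U-Plane traffic compared to uncompressed 16-bit IQ.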
cell_cfg:
  dl_arfcn: 640000            # ARFCN of the downlink carrier (center frequency).
  band: 78                    # The NR band.
  channel_bandwidth_MHz: 100  # Bandwidth in MHz. The number of PRBs is derived automatically.
  common_scs: 30              # Subcarrier spacing in kHz used for data.
  plmn: "00101"               # PLMN broadcast by the gNB.
  tac: 7                      # Tracking area code (needs to match the core configuration).
  pci: 1
  nof_antennas_dl: 4
  nof_antennas_ul: 1
  prach:
    prach_config_index: 159
    prach_root_sequence_index: 1
    zero_correlation_zone: 0
    prach_frequency_start: 12
  pdsch:
    mcs_table: qam256
  tdd_ul_dl_cfg:
    dl_ul_tx_period: 10
    nof_dl_slots: 7
    nof_dl_symbols: 6
    nof_ul_slots: 2
    nof_ul_symbols: 0
expert_phy:
  nof_ul_threads: 4
  nof_pdsch_threads: 4
  nof_dl_threads: 4
  max_proc_delay: 4
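The t1a_* pairs in the suggested config define the O-RAN fronthaul transmission windows (in microseconds) for DL C-Plane, UL C-Plane, and U-Plane traffic; each min must sit below its max or the DU has no valid window in which to send. A small sanity-check sketch over the values above (helper and names are illustrative, not part of srsRAN):

```python
# (t1a_min, t1a_max) pairs in microseconds, taken from the config above.
windows = {
    "cp_dl": (258, 470),  # t1a_min_cp_dl .. t1a_max_cp_dl
    "cp_ul": (285, 429),  # t1a_min_cp_ul .. t1a_max_cp_ul
    "up":    (50, 196),   # t1a_min_up    .. t1a_max_up
}

for name, (t_min, t_max) in windows.items():
    assert t_min < t_max, f"{name}: empty transmission window"
    print(f"{name}: window width {t_max - t_min} us")  # 212, 144, 146
```

These windows must also be compatible with what the RU advertises; a mismatch is a common reason for packets arriving outside the RU's reception window and no signal being emitted.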
Hi @ismagom,
Sorry about the delay in replying. I was testing your configuration and found many synchronization problems in my setup. Now I'm able to connect to the network and generate traffic with a COTS UE (a Samsung Galaxy Tab S8 Plus tablet), but I've noticed three problems:
/root/melao/srsRAN_Project/include/srsran/support/async/async_event_source.h:45: srsran::async_event_source<T>::~async_event_source() [with T = mpark::variant<asn1::elementary_procedure_option<asn1::f1ap::ue_context_release_complete_ies_container>, srsran::no_fail_response_path, srsran::transaction_timeout>]:
Assertion `not has_subscriber()' failed - Observers must not outlive event sources.
Aborted
Could this be a gain calibration problem? I've noticed that RSRP and RSRQ are okay (-80 dBm and -11 dB, respectively).
Thanks for the feedback @lhms1234 , could you attach logs please?
Sure, here you are:
Can you try the following:
1) Add these options to the config file:
expert_phy:
  nof_ul_threads: 3
  nof_pdsch_threads: 4
  nof_dl_threads: 2
  max_proc_delay: 5
2) Generate DL UDP traffic from the machine running the core (instead of TCP)
and send gnb.log and gnb_terminal.log again, please. Thanks
You still have many RT faults. It's not easy to tune a system to do 100 MHz 4x4 MIMO. Maybe you can start with a lower BW and fewer antennas?
Hi @ismagom. Sorry for the late response. After many tests using an 80 MHz BW with MIMO 2x1, I've reached a throughput of ~200 Mbps DL and 25 Mbps UL. In my tests with MIMO 4x4, the throughput was considerably lower (both DL and UL), and MIMO 2x1 was more stable. In my tests with 100 MHz BW (including with MIMO 2x1), the DL throughput was close to 25 Mbps and 1-5 Mbps for UL. I'm attaching the logs and the configuration file of a test with MIMO 2x1 and 80 MHz BW here. I would really appreciate it if you could guide me through some configuration optimizations (on the srsRAN side) to obtain better performance using MIMO 4x4 or, at least, a 100 MHz BW.
Hi @lhms1234, can you try whether this branch works better? For this to work you need to comment out the entire expert_phy section. I would also recommend commenting out the min_ue_mcs parameter for pdsch and pusch.
Hi @lhms1234 , can you give us an update on the status of this issue, please?
Hi @ismagom. Of course. Unfortunately, I've made no further progress on my tests due to some setup-sharing issues. Once the NIC we bought arrives, I'll prepare a new setup and resume the tests.
Hey @lhms1234 - could you perhaps share the FW version running on your RPQN-7801E? We are currently trying an RPQN-7801I and experiencing issues with unusually high BLER as well. With DPD enabled in the FW it doesn't work at all at bandwidths larger than 20 MHz.
Hi @andrepuschmann. Sure, here you are:
root@arria10:~/test# cat /home/root/test/version.txt
b_branch: master
b_commit: ffa6dd30bdfd581dab5b3ac31e1933533d22bf50
s_commit: aa0fbb0efca175b6d281398d9bd0f03423bfb693
tag: v3.1.12q.551
build_time: 202304071650
Hey @lhms1234 - your tests with 80 MHz 2x1 actually look better than ours. But there definitely is an RF issue as well, since 20% BLER isn't normal. At least with our RPQN-4800 we see very little BLER, unless we are far away, of course, or have other impairments.
Have you changed anything in the RU config? Is it still the same as the one posted in the initial post? Specifically, do you still have RRH_RF_GENERAL_CTRL = 0x3, 0x1, 0x0, 0x0 in your config?
When you test with a lower bandwidth, do you change the RRH_MAX_PRB = 273 setting? Or do you just leave it as is and start with e.g. 80 MHz in the srsRAN config?
Hi @lhms1234,
Can you send gnb.log and console trace again in the MIMO 2x1 configuration?
Also, make sure you check out the latest code. From your config above, it looks like you are using a version before 23.10.
I would also suggest not using min_ue_mcs or max_ue_mcs, please.
Thanks
Hi @ismagom.
Maybe I made a mistake; it's been some time since my last test, as I mentioned previously. Yesterday the NIC we bought arrived, so I'm going to resume testing with the newest version of srsRAN as soon as possible.
Is there any update on this issue?
Hello, @ismagom. Yes, I have some good news. Using a newer repository version on the new setup made it possible to achieve better throughput. I've noted the following mean and peak values:
The commit I compiled is "55c984b55736d0dd2d2ee328f1ae8d9de97e3e19", with these configurations of the RU and gNB. The setup (considering the srsRAN version) proved to be much more stable and performant with MIMO 4x4, 30 kHz SCS, and 100 MHz BW on band n78 (using a Foxconn RPQN 7801E).
Thanks for the report. There is certainly room to improve the performance; if you are interested in proceeding, we can certainly help. Please create a new Discussion post if that's the case.
Thanks for all the feedback
Issue Description
I'm testing srsRAN's split 7.2 feature with a Foxconn RPQN 7801E RU. Following the tutorial, I was able to test the setup with successful synchronization of the RU and CU/DU; however, I couldn't detect any SIB message using the Amarisoft UE. I was wondering if it could be a configuration issue, and I hope someone can help at this point.
Setup Details
Expected Behavior
The Amarisoft UE can detect and connect to the network and run some traffic tests (for now, just a simple ping).
Actual Behaviour
The Amarisoft UE wasn't able to detect the network while the RU was synchronized and the srsRAN CU/DU was running.
ptp4l output:
phc2sys output:
srsRAN output:
srsRAN log (snippet repeated several times):
Steps to reproduce the problem
Additional Information
Here are the configuration files used: RRHconfig_xran.xml (RU configuration), default.cfg (linuxptp configuration), and gnb_ru_rpqn4800e_tdd_n78_20mhz.yml (srsRAN configuration).