Closed jhonzy0928 closed 1 year ago
Have you checked the statistics tables to see why you are dropping OTA frames? For the TDMA model, pay attention to RxSlotStatusTable and TxSlotStatusTable.
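As an aside for anyone following along: the slot status tables can be dumped at runtime with the emanesh client, e.g. something like `emanesh localhost get table nems mac RxSlotStatusTable` (check `emanesh --help` for the exact syntax on your version). A minimal sketch of turning those counters into a loss percentage is below; the counter names `rx.slot` and `rx.missed` are illustrative stand-ins, not the exact EMANE column headers.

```python
# Sketch: estimate RX slot loss from TDMA slot-status counters.
# Assumes you have already dumped RxSlotStatusTable via the emanesh
# client and collected the counts into a dict. The counter names used
# below ("rx.slot", "rx.missed") are illustrative stand-ins, not the
# exact EMANE column headers.

def rx_loss_percent(counters):
    """Return the percentage of RX slots that were missed.

    counters: dict mapping counter name -> integer count.
    """
    received = counters.get("rx.slot", 0)
    missed = counters.get("rx.missed", 0)
    total = received + missed
    if total == 0:
        return 0.0
    return 100.0 * missed / total

if __name__ == "__main__":
    sample = {"rx.slot": 700, "rx.missed": 300}
    print(rx_loss_percent(sample))  # 30.0
```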
Thank you for your reply. I agree with you, and I did notice this: as Figure 1 shows, TxSlotStatusTable reports a very low percentage of lost frames, whereas RxSlotStatusTable reports a very high percentage. Looking at the other statistics (Figures 2-3), there is no packet loss at all at the transport layer, but SlotErrorMissed is high at the MAC layer. The problem is that for the past few days I have been trying to build a network model with a very low packet loss rate using the TDMA model, but no matter how I modify the schedule and related parameters, the packet loss rate remains very high, even with the default TDMA configuration. Even in the example model (Demo-8) the packet loss rate can reach 30%, which is unacceptable for a reliable real-time communication network. What else can be done to reduce the packet loss rate?
Your system does not appear to be tuned correctly to run the experiment. Looking at your screenshot, it looks like you are running in a VM. Have you configured the virtualizer to make sure the VM has enough resources? Try running emane in LXC instances directly on your system.
I did run the sample model in a VM, and while searching for a solution I noticed that other developers had hit similar problems; your answer confirms this. Next, I will either allocate more resources to the VM as you suggested or run the example directly on the host system. If it works, I will report back with the good news. Thank you again for taking the time to reply.
I'm happy to report that when I run EMANE directly on my server, it works and the packet loss rate is very low. In the next stage I want to change the allocation of time slots under the TDMA model; for example, I would like to dynamically adjust the size of the time slots in the TDMA frame structure. Does EMANE support this, or can you give me some ideas?
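For what it's worth, the TDMA radio model does accept new schedules at runtime via TDMA schedule events, which are typically published from an XML schedule file with the `emaneevent-tdmaschedule` tool. A rough sketch of such a schedule is below; treat the element and attribute names as an approximation of the format in the EMANE TDMA documentation rather than a verified schema, and check the shipped demos for the exact syntax.

```xml
<!-- sketch of a TDMA schedule: one frame of 10 slots, node 1
     transmitting in the first half and node 2 in the second
     (element/attribute names approximate) -->
<emane-tdma-schedule>
  <structure frames='1' slots='10' slotoverhead='0'
             slotduration='1000' bandwidth='1M'/>
  <multiframe frequency='2.4G' power='0' class='0' datarate='1M'>
    <frame index='0'>
      <slot index='0:4' nodes='1'>
        <tx/>
      </slot>
      <slot index='5:9' nodes='2'>
        <tx/>
      </slot>
    </frame>
  </multiframe>
</emane-tdma-schedule>
```

Republishing a modified schedule file (e.g. `emaneevent-tdmaschedule schedule.xml ...` targeted at your event channel) changes the slot allocation on the fly, which is one way to get the dynamic behavior you describe.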
It occurred to me that if I could dynamically detect traffic congestion in the network, for example by monitoring the depth of the transmit queues, then I could publish new TDMA schedule events based on changes in network traffic. If this idea is feasible, how can I implement it in EMANE?
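A possible shape for that control loop, sketched under stated assumptions: periodically read each node's transmit-queue backlog (for example from the MAC queue statistics via emanesh), split the slots in a frame proportionally to backlog, regenerate the schedule XML, and republish it with the `emaneevent-tdmaschedule` tool. Everything around the loop (how you read queue depth, how you publish) is an assumption; the proportional-split logic below is the only part shown concretely.

```python
# Sketch of congestion-driven slot reallocation for a TDMA schedule.
# The backlog numbers are assumed to come from your own monitoring of
# the MAC queue statistics; this function only decides how to split a
# frame's slots among nodes given those backlogs.

def allocate_slots(backlogs, total_slots):
    """Split total_slots among nodes proportionally to their backlog.

    backlogs: dict node_id -> queued packets (non-negative ints).
    Returns dict node_id -> slot count; every node keeps at least one
    slot so idle nodes retain channel access, and all slots are used.
    """
    nodes = sorted(backlogs)
    alloc = {n: 1 for n in nodes}          # one guaranteed slot each
    remaining = total_slots - len(nodes)
    if remaining < 0:
        raise ValueError("fewer slots than nodes")
    total_backlog = sum(backlogs.values())
    if total_backlog == 0:
        # No congestion anywhere: spread the rest round-robin.
        for i in range(remaining):
            alloc[nodes[i % len(nodes)]] += 1
        return alloc
    # Proportional shares; largest fractional remainder gets leftovers.
    shares = {n: remaining * backlogs[n] / total_backlog for n in nodes}
    for n in nodes:
        alloc[n] += int(shares[n])
    leftover = total_slots - sum(alloc.values())
    for n in sorted(nodes, key=lambda n: shares[n] - int(shares[n]),
                    reverse=True):
        if leftover == 0:
            break
        alloc[n] += 1
        leftover -= 1
    return alloc

if __name__ == "__main__":
    # Node 1 heavily backlogged, node 2 nearly idle, 10 slots per frame.
    print(allocate_slots({1: 90, 2: 10}, 10))  # {1: 8, 2: 2}
```

From an allocation like this you would emit a new schedule file and republish it; how often to do so (and how much hysteresis to add so the schedule does not thrash) is a tuning decision.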
Why is it that when I build a network topology with the default TDMA model, the connectivity between the nodes is particularly poor? When I use the ping command to check the packet loss rate between nodes, I find that it can reach 20%-30%, which is exactly what we want to avoid. How can the packet loss rate of the TDMA model be reduced to below 5%?