Hello there,
I've been encountering unexpected performance issues in my simulations. I have implemented both a MecApp and a UEApp, and I am generating traffic from a UE. I've noticed that as I increase the number of resource blocks (numBands) available, I unexpectedly experience greater packet loss: while the end-to-end delay decreases (as expected), the packet loss rate increases.
Additionally, I have observed that the SINR decreases as more resource blocks become available. In particular, fading attenuation is roughly 10 times higher when I increase from 200 to 2000 available RBs, again accompanied by a rise in the packet loss rate.
This trend seems counterintuitive to me. I can provide more details if needed.
Kind regards,
Hakim