Closed stv0g closed 2 weeks ago
We could use Ansible to automate the tests. In my opinion, that only makes sense as soon as we want to test N > 2 sites (@mstevic maybe for the US setup).
A JSON configuration file, drag-n-drop and VILLASweb in general are not related to the network benchmarking tools.
Main open questions in my opinion:
In GitLab by @ghost on Jul 21, 2016, 11:27
In my view we should use VILLASnode itself for benchmarking; third-party tools can be used for verification or to test TCP, which is not supported by VILLASnode. Also, if we implement it based on the VILLASnode software, we can later use it to tune, for example, the sending rate during runtime. Markus can also use the output of this tool to show the current delay and packet loss rate between different nodes on the web interface.
In discussions with Marija we concluded:
Instead of a dedicated villas test utility, we should convert the tool into a new node-type.
This node-type would generate test data, send it, and compare it with the data that is received.
The new node-type has the advantage that we could easily benchmark any kind of node: network (socket), PCIe (fpga), etc.
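The core logic of such a benchmark node can be sketched as follows (a minimal illustration only; the class and method names are invented here and do not reflect the actual VILLASnode node-type API): stamp each outgoing sample with a sequence number and send timestamp, match echoed samples on reception, and derive RTT, loss and reordering per packet.

```python
# Hypothetical sketch of the proposed benchmark node-type logic.
# Not the VILLASnode API: names and structure are illustrative only.
import time

class BenchmarkNode:
    def __init__(self):
        self.next_seq = 0       # next sequence number to send
        self.sent = {}          # seq -> send timestamp of in-flight samples
        self.last_rx_seq = -1   # highest sequence number received so far
        self.rtts = []          # per-packet round-trip times
        self.reordered = 0      # packets that arrived out of order

    def send(self):
        """Generate one test sample: (sequence number, send timestamp)."""
        seq = self.next_seq
        self.next_seq += 1
        self.sent[seq] = time.monotonic()
        return (seq, self.sent[seq])

    def receive(self, sample):
        """Match an echoed sample against its send record."""
        seq, _ = sample
        self.rtts.append(time.monotonic() - self.sent.pop(seq))
        if seq < self.last_rx_seq:
            self.reordered += 1
        self.last_rx_seq = max(self.last_rx_seq, seq)

    def lost(self):
        """Samples sent but never echoed back count as lost."""
        return len(self.sent)

# Loopback demo: echo every sample except seq 2 (simulated loss).
node = BenchmarkNode()
samples = [node.send() for _ in range(5)]
for s in samples:
    if s[0] != 2:
        node.receive(s)

print("lost:", node.lost(), "reordered:", node.reordered)
```

In a real node-type, send() and receive() would map onto the node's write and read paths, so the same logic works over any underlying transport (socket, fpga, ...).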
For first tests, we only look at RTT, loss and reordering on a packet-per-packet basis. For future tests it might be interesting to study the impact on subsequent packets as well: usually not only a single packet is dropped or delayed. Packet loss usually occurs in bursts, and the length (number of packets) of such a burst is of interest.
Later on, such knowledge could be used to model the network with Markov processes to predict the communication latency and loss probability.
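Burst loss is commonly captured by a two-state Gilbert(-Elliott) Markov model: a "good" state where packets pass and a "bad" state where they are dropped, with the transition probabilities controlling the loss rate and the mean burst length. A rough sketch (parameter values are illustrative only):

```python
import random

def gilbert_loss(n, p_gb, p_bg, seed=0):
    """Simulate n packets through a two-state Gilbert loss model.

    p_gb: probability of moving good -> bad (entering a loss burst)
    p_bg: probability of moving bad -> good (ending a burst)
    Returns a list of booleans (True = packet lost).
    Mean burst length is 1/p_bg; stationary loss rate is p_gb/(p_gb + p_bg).
    """
    rng = random.Random(seed)
    bad = False
    lost = []
    for _ in range(n):
        # In the good state we enter a burst with p_gb;
        # in the bad state we stay in the burst with 1 - p_bg.
        bad = (rng.random() < p_gb) if not bad else (rng.random() >= p_bg)
        lost.append(bad)
    return lost

losses = gilbert_loss(100000, p_gb=0.01, p_bg=0.5)
print("loss rate: %.3f" % (sum(losses) / len(losses)))
```

Fitting p_gb and p_bg to measured burst-length statistics would give exactly the kind of predictive loss model mentioned above.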
In GitLab by @ghost on Jul 21, 2016, 12:00
We could use Ansible to automate the tests. In my opinion, that only makes sense as soon as we want to test N > 2 sites (@mstevic maybe for the US setup).
We should have a way to test RTT etc. based on the VILLASnode software, as our main goal is to assess our setup, not only the network. But Ansible would be good to use in an initial phase, to estimate RTT etc. between different nodes, which can help in deciding which simulation subsystems can be allocated to which nodes.
In GitLab by @ghost on Sep 26, 2016, 14:14
Note for jitter evaluation:
Maybe we should use the definition from VoIP applications, where the jitter calculation is based on the definition in IETF RFC 3550 (not IETF RFC 1889).
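For reference, RFC 3550 (section 6.4.1) defines interarrival jitter as a running estimate: D(i-1, i) = (R_i - R_{i-1}) - (S_i - S_{i-1}) is the change in transit time between consecutive packets, and the jitter is updated as J += (|D| - J) / 16. A small sketch of that computation:

```python
def rfc3550_jitter(send_ts, recv_ts):
    """Interarrival jitter per IETF RFC 3550, section 6.4.1.

    send_ts, recv_ts: per-packet send/receive timestamps (same units).
    D(i-1, i) is the difference in relative transit times of
    consecutive packets; jitter is the running average J += (|D| - J)/16.
    """
    j = 0.0
    for i in range(1, len(send_ts)):
        d = (recv_ts[i] - recv_ts[i - 1]) - (send_ts[i] - send_ts[i - 1])
        j += (abs(d) - j) / 16.0
    return j

# Packets sent every 20 ms; the third one arrives 5 ms late.
send = [0, 20, 40, 60]
recv = [100, 120, 145, 160]
print("jitter: %.3f ms" % rfc3550_jitter(send, recv))  # -> jitter: 0.605 ms
```

The 1/16 gain makes the estimate a smoothed average, which is why RFC 3550 jitter reacts gradually to a single delayed packet.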
Thanks to Dennis we now have a set of benchmarks. See: #207
Now we only need to include them in our CI tests.
Done as part of https://github.com/VILLASframework/node-testing
In GitLab by @ghost on Jul 17, 2016, 01:31
Develop a setup to benchmark parameters like packet loss, throughput, delay, etc. for different packet sizes and times of day/week. The output should show the data including outliers. The current histogram code can be reused.
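One way to keep outliers visible in such output (a sketch only; the existing histogram code in VILLASnode may work differently) is a fixed-range histogram that counts out-of-range values separately instead of clipping them into the edge buckets:

```python
def histogram(values, lo, hi, nbuckets):
    """Fixed-range histogram that keeps outliers countable.

    Values below lo / at or above hi are reported separately
    rather than silently merged into the edge buckets.
    """
    counts = [0] * nbuckets
    below = above = 0
    width = (hi - lo) / nbuckets
    for v in values:
        if v < lo:
            below += 1
        elif v >= hi:
            above += 1
        else:
            counts[int((v - lo) / width)] += 1
    return below, counts, above

rtts = [0.8, 1.1, 1.2, 1.3, 1.9, 2.5, 9.7]   # ms; 9.7 is an outlier
below, counts, above = histogram(rtts, lo=1.0, hi=3.0, nbuckets=4)
print(below, counts, above)  # -> 1 [3, 1, 0, 1] 1
```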
An initial idea is to use Ansible with Python to automate configuration file generation and test runs on remote machines. Define the test format (a user specification file) provided by the user, which can initially be a JSON file. Future improvements include a drag-and-drop GUI tool to generate the user specification file. Discuss with Marija, Steffen and Markus.
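A hypothetical example of what such a JSON user specification might look like (every field name here is invented for illustration; the actual schema is still to be defined):

```json
{
  "hosts": ["node-a.example.com", "node-b.example.com"],
  "tests": [
    {
      "metric": "rtt",
      "packet_sizes": [64, 256, 1024],
      "rate": 100,
      "duration": 60,
      "schedule": "hourly"
    }
  ],
  "output": { "format": "histogram", "keep_outliers": true }
}
```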