ktls / af_ktls

Linux Kernel TLS/DTLS Module
GNU General Public License v2.0

Test suite #47

lancerchao opened this issue 8 years ago

lancerchao commented 8 years ago

I wrote up a framework for what I think could be a good foundation for testing the AF_KTLS module. Check it out here: https://github.com/lancerchao/af_ktls-test

nmav commented 8 years ago

I'm bikeshedding here, but is there any advantage over cmocka? The latter does not require any C++ and is already used for unit testing the DTLS sliding window. https://cmocka.org/

fridex commented 8 years ago

Awesome, Lance! Thank you.

While I was implementing and benchmarking AF_KTLS, I used [1]. It is not the best possible solution, but it was/is useful (note that some errors are not handled yet - e.g. when a client does not connect to a TCP server, ...).

I would like to reorganize this git repo and make it easier for development/testing/benchmarks, as suggested by @nmav (https://github.com/fridex/af_ktls/issues/17). Could we introduce one tool for automated tests and benchmarks so the two do not diverge?

I don't have time to make a bigger review now, but a few notes:

Take a look at [1] to save some time and effort. Some parts could be reused.

[1] https://github.com/fridex/af_ktls-tool

nmav commented 8 years ago

Apart from my comment above on the framework choice, this can certainly be the foundation for unit testing af_ktls (see Frido's comments as well). Good work. Something that caught my eye: in gen_random(), is it intentional to test only with alphanumeric data? If not, you could use RAND_bytes() and avoid that function.
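
If you go that route, a gen_random() built on RAND_bytes() could look roughly like this (a sketch only; the std::vector interface and error handling are my assumptions, not the signature the framework currently uses):

```cpp
// Sketch of a gen_random() that produces arbitrary binary payloads via
// OpenSSL instead of restricting itself to alphanumeric characters.
#include <openssl/rand.h>
#include <stdexcept>
#include <vector>

std::vector<unsigned char> gen_random(size_t len) {
    std::vector<unsigned char> buf(len);
    // RAND_bytes() returns 1 on success, 0 or -1 on failure.
    if (RAND_bytes(buf.data(), static_cast<int>(buf.size())) != 1)
        throw std::runtime_error("RAND_bytes failed");
    return buf;
}
```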

About the //TODO: Not checking file content? comment: a wild idea would be, instead of checking the file content, to transmit a hash of the sent data (or send it out of band), so that the verifier can check the contents even of simple sends with random data.
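
For illustration, the digest comparison could be as small as this (a sketch using OpenSSL's SHA256(); how the digest is transmitted, in-band or out of band, is left open, and the helper names are made up):

```cpp
// Sketch: the sender computes a SHA-256 digest of the random payload and the
// verifier recomputes it over whatever it received, so the contents never
// need to be compared byte-by-byte. Names here are illustrative only.
#include <openssl/sha.h>
#include <vector>

std::vector<unsigned char> digest_of(const std::vector<unsigned char> &data) {
    std::vector<unsigned char> md(SHA256_DIGEST_LENGTH);
    SHA256(data.data(), data.size(), md.data());
    return md;
}

// Verifying side: recompute the digest over the received bytes and compare it
// against the digest that accompanied the payload.
bool payload_matches(const std::vector<unsigned char> &received,
                     const std::vector<unsigned char> &sent_digest) {
    return digest_of(received) == sent_digest;
}
```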

lancerchao commented 8 years ago

The server basically echoes the data from the client. Each client sends one (or more) strings and makes sure the same string(s) come back. That way we can make sure the data is transmitted correctly. It also helps that the reply is human-readable.
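
In Google Test terms, that check boils down to something like the following (a sketch only; a socketpair() stands in for the KTLS client socket and echo server so the snippet is self-contained, and none of these names come from the actual suite):

```cpp
// Sketch of the round-trip assertion: bytes written on one end of a
// socketpair() are read back from the other and compared against what was
// sent, which is the same check the real test makes on the echo server's
// reply over the AF_KTLS socket.
#include <gtest/gtest.h>
#include <sys/socket.h>
#include <unistd.h>
#include <string>

static std::string roundtrip(int send_fd, int recv_fd, const std::string &msg) {
    if (write(send_fd, msg.data(), msg.size()) !=
        static_cast<ssize_t>(msg.size()))
        return {};
    std::string reply(msg.size(), '\0');
    ssize_t n = read(recv_fd, &reply[0], reply.size());
    reply.resize(n > 0 ? static_cast<size_t>(n) : 0);
    return reply;
}

TEST(KtlsEcho, SmallStringRoundTrip) {
    int fds[2];
    ASSERT_EQ(socketpair(AF_UNIX, SOCK_STREAM, 0, fds), 0);
    const std::string msg = "hello over ktls";
    // A correct transmission path hands back exactly the string that was sent.
    EXPECT_EQ(roundtrip(fds[0], fds[1], msg), msg);
    close(fds[0]);
    close(fds[1]);
}
```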

I think it is perfectly fine for benchmarking and correctness tests to be separate; that way we can tackle the two independently. I have taken a look at your tool, and I think we can work on combining the two.

I found C++ preferable for testing because of the abundance of libraries and tools available, which was my main motivation for using Google Test.

Ignore the TODO. I fixed that by having the server reply.

nmav commented 8 years ago

I found C++ preferable for testing because of the abundance of libraries and tools available, which was my main motivation for using Google Test.

Could you be more specific on that?

lancerchao commented 8 years ago

Could you be more specific on that?

For example, the Boost libraries? Also, C code can be smoothly embedded in C++, so choosing C++ does not limit us in any way, apart from the dependency on a C++ compiler.