Open pberkland opened 8 years ago
I disagree; that format makes it impossible to run an individual test from inside an IDE (IntelliJ). As much as possible we should return JSON strings from our tests to simulate what would be returned over Toree to EclairJS-node. I understand this is more integration test than unit test, but that is mostly what EclairJS-nashorn is (we are just wrapping and converting Java to JavaScript). We also need to continue to test ML; I understand that the values can change on each run, but we can check for successful completion. We also need to include our examples in the testing: we have found in the past that changes made to EclairJS-nashorn have broken our examples. It is trivial to add an example to the test cases, and this was a huge help when we made the changes to support module loading.
I understand that the examples are necessary, but IMHO we have to find a way to make these tests work on different machines. How should we proceed?
Agreed. Which tests are failing on your machine, and why? Some of the ML example tests just check for successful completion (MLTest.LDAExample, for example). I am unaware of anyone else on the team having issues with the tests failing, so we need to understand why they are failing on your machine before we change the test cases.
I have created PR #244; I think MLTest should simply check for successful completion of each test. I work on Ubuntu 14.04.3 LTS on VirtualBox.
Currently, creating a unit test for JavaScript is almost twice as much work as it needs to be, because both a Java and a JavaScript function must be written. The tests are also not easily understandable, because the code being tested is in JavaScript while the validation/assertions are in Java.
We should have a mechanism where tests are defined purely in JavaScript but are still runnable via JUnit, so that they can be run from an IDE or from Maven.
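A minimal sketch of what such a pure-JavaScript test definition could look like (the `test` helper, the `tests` registry, and the runner loop below are all hypothetical; a thin JUnit adapter, not shown here, would evaluate the file with Nashorn and surface each registered test as a separate JUnit case):

```javascript
// Registry of test functions keyed by name, so a JUnit adapter can
// enumerate and run them individually.
var tests = {};

function test(name, fn) {
  tests[name] = fn;
}

test('parallelize returns expected count', function () {
  // Hypothetical EclairJS-style call, replaced with a plain array here
  // so the sketch stays self-contained.
  var rdd = [1, 2, 3, 4];
  if (rdd.length !== 4) {
    throw new Error('expected 4 elements, got ' + rdd.length);
  }
});

// Standalone runner: execute every registered test; a thrown Error marks
// a failure. A JUnit adapter would map each entry to one test method.
Object.keys(tests).forEach(function (name) {
  tests[name]();
});
```

With this shape, both the code under test and the assertions live in JavaScript, while the JUnit side stays a generic, one-time wrapper, so tests remain runnable from an IDE or from Maven.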