avaer opened 4 years ago
One idea for progressing on this would be to create a test page that can be accessed from the browser, running either Mocha or Jest, and then perhaps (it would be very nice) running that page through GitHub Actions automation on each push.
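A minimal sketch of what that automation could look like (the workflow name, action versions, and the `npm test` entry point are all assumptions, not this repo's actual setup):

```yaml
# Hypothetical GitHub Actions workflow; script names and layout are assumed.
name: tests
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
      - run: npm ci
      - run: npm test   # assumed to launch the headless browser test page
```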
Testing would probably also be a good path to officially documenting the APIs, while encouraging the documentation to stay in sync with the code and tests.
Is using Mocha or Jest something you'd definitely prefer, or would using something like AVA (https://github.com/avajs/ava) be okay? An example of using AVA with something like Puppeteer is at https://github.com/avajs/ava/blob/master/docs/recipes/puppeteer.md
I'm pretty indifferent to the test framework.
We've been adding a bunch of tests recently but were a little unsure about some of the tests you mentioned! Could we just clarify a few things below?
Hopefully these make sense! 🙂
What type of bad data did you have in mind?
Also, should bad data be handled by returning null or by throwing an error?
If it prevents the functionality from executing, probably throw an error. For promise-based APIs this translates to a promise rejection.
Do you have any suggestions on where we could start with stress testing for memory leaks and measuring performance?
Load + unload a few packages many times to test for memory leaks.
For stress testing, we can dynamically add/remove/manage packages randomly and look for crashes.
Is there an example of using multiple XRPackageEngine instances in the same page?
Not yet.
Do you have certain flows/areas in mind for testing race conditions?
Anything to do with changing package states: adding, removing, setting the matrix, wearing, adding files. I'm pretty sure there are going to be bugs revealed here.
We currently don't have unit tests for XRPackage and XRPackageEngine, but we should.