In a CPython buildout, if I run bin/zope-testrunner --test-path=sources/nti.dataserver/src I get 1049 tests, 0 failures, 0 errors. This takes 10 minutes on my laptop.
However, if I run bin/compattest-nti.dataserver, which only includes the declared and transitive dependencies, I get 1049 tests, 13 failures, 13 errors and 10 skipped[1]. This takes 3 minutes.
This indicates that we're still missing dependencies needed for those tests to run. It's not obvious off the bat which dependencies those might be; the errors are all over the place: a 404 instead of a 403 when accessing a community, getting too many items back, a missing account creation link, etc.
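One way to narrow that down might be to dump the distributions each runner actually sees and diff the two lists. A minimal sketch follows; the script name is just illustrative, and the idea is to run it under each environment's interpreter/working set:

# compare_deps.py -- hypothetical helper: run it under the working set of
# each test script (full buildout vs. compattest) and diff the two outputs
# to spot distributions missing from the isolated run.
import sys
import pkg_resources

def dump_working_set(out=sys.stdout):
    # Print every distribution visible on sys.path, one per line,
    # in a stable order so the outputs are easy to diff.
    for dist in sorted(pkg_resources.working_set,
                       key=lambda d: d.project_name.lower()):
        out.write("%s==%s\n" % (dist.project_name, dist.version))

if __name__ == '__main__':
    dump_working_set()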
FWIW, some profiling suggests that a lot of the extra time may be spent in SQLAlchemy, which in turn suggests that merely having nti.analytics around to be imported/configured slows the tests down by up to a factor of 3.
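To double-check that, something along these lines could measure the import/ZCML-configuration cost of nti.analytics in isolation; this is only a rough sketch, and the assumption that the package has a top-level configure.zcml is mine:

# profile_analytics_import.py -- rough sketch, not part of the buildout:
# time what importing and configuring nti.analytics alone costs.
import cProfile
import pstats

def import_and_configure():
    import nti.analytics                      # pulls in SQLAlchemy et al.
    from zope.configuration import xmlconfig
    # Assumes nti.analytics ships a configure.zcml at its top level.
    xmlconfig.file('configure.zcml', package=nti.analytics)

profiler = cProfile.Profile()
profiler.enable()
import_and_configure()
profiler.disable()
pstats.Stats(profiler).sort_stats('cumulative').print_stats(25)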
[1] The test report for isolated tests: