For those interested in unit testing, unittest2 just got a facelift!
In particular, the output has been cleaned up to give more space to failures and less space to successes, which makes room for timing information and other bells and whistles. Failures, for example, are repeated at the end of the test run so they don't get lost in the immense amount of success spam that the current version prints.
We're also experimenting with a two-phase mode where tests are run separately after a discovery phase. Apart from allowing progress indicators, nice output alignment and so on, this paves the way for better test scheduling in the future, including running tests in separate / isolated processes.
unittest2 is "broadly" compatible with std/unittest, but works around many of its limitations - in particular, each test is instantiated in a separate proc meaning that test suites can be arbitrarily large rather than being limited by nim's limit on the number of global symbols - this also helps with with stack space and a few other technical issues that prevented us from using std/unittest at scale.
Beyond that, it also comes with JUnit integration and some other bells and whistles - give it a try if you haven't already.
Here's what it looks like:
[ 20/25] HTTP client testing suite ................s..... (14.2s)
[ 21/25] Asynchronous process management test suite ....................F. (14.5s)
===========================
/home/arnetheduck/status/nimbus-eth2/vendor/nim-chronos/build/testall 'Asynchronous process management test suite::File descriptors leaks test'
---------------------------
/home/arnetheduck/status/nimbus-eth2/vendor/nim-chronos/tests/testproc.nim(465, 27): Check failed: getCurrentFD() == markFD
getCurrentFD() was 3
markFD was 5
[FAILED ] ( 0.00s) File descriptors leaks test
Each test gets a . unless it failed or was skipped, and we have timing info for each suite to identify slow runners.
Failure information is retained and printed "later" so that in CI and in terminals you can actually find it without scrolling.
As part of this refresh, the parallel test execution features had to go: technical issues in the current implementation made them too unstable for practical use. They might reappear in the future, but that would require a reimplementation from the ground up, so it was easier to just remove them for now.
Give it a go :)
As someone who has mostly used std/unittest and a bit of testament for unit testing (when I needed to test the same thing with different compiler flags), and who thus has very little knowledge of JUnit outside of a Java testing context:
What tooling does JUnit have that integrates with this and benefits you here? The question arose from your mention of it, and from this line on GitHub:
> JUnit-compatible XML test reports for tooling integration
> What tooling does JUnit have that integrates with this and benefits you here?
The obvious ones are CI tools like GitHub Actions and Jenkins, which have plugins for tracking statistics and producing graphical / easy-to-read failure reports where you get HTML links to each failure and its output, pointing directly at the code - see https://github.com/marketplace/actions/junit-report-action and https://plugins.jenkins.io/junit/
Ditto IDEs, which can do similar tricks.
We use it for PR reports like so: https://github.com/status-im/nimbus-eth2/pull/5375#issuecomment-1700382577
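For context, the file these tools consume is the de-facto JUnit XML report format - roughly along these lines (a hand-written illustration of the format, not actual unittest2 output):

```xml
<testsuites>
  <testsuite name="Asynchronous process management test suite" tests="22" failures="1" skipped="0" time="14.5">
    <testcase name="File descriptors leaks test" time="0.00">
      <failure message="Check failed: getCurrentFD() == markFD"/>
    </testcase>
  </testsuite>
</testsuites>
```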