Description
This is an idea for a feature to help alleviate problems with forgetting to remove `.skip()` and `.only()` in tests.
Problem
It is common practice to run tests during development, but even the most diligent of us (and all of our fancy automation) can be fooled into thinking everything is okay when it isn't, by a simple human error: forgetting to remove `.only()` / `.skip()` from a test. It is easy to end up shipping broken code because the tests technically succeeded, yet many tests were never actually run and the broken code was never exercised at all.
Currently, AVA prints a reminder to the console when you use these methods, which is useful if you happen to notice it. In some cases, though, you can't. For example, a common way to run `npm test` is via git hooks, but my favorite GUI for git, Tower, only shows output from hooks when they fail. I like it that way in general, but it means AVA's logs cannot be seen, so it is even less obvious that some tests were not run.
Solution
One of the better solutions I have seen so far for catching leftover skip modifiers is to `grep` for them inside a pre-push hook, as documented by @jegtnes here:
https://jegtnes.com/blog/use-git-pre-push-hooks-to-stop-test-suite-skips/
That got me thinking: AVA could make this nearly foolproof, because it actually knows whether things are skipped, far more reliably than `grep`!
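For reference, a minimal pre-push hook in the spirit of the linked post might look like the following sketch (the `test/` directory and the regex are assumptions; adjust them to your project's layout):

```shell
#!/bin/sh
# .git/hooks/pre-push sketch: reject the push if any test file
# still contains a .only( or .skip( call. The test/ path and the
# pattern are assumptions about the project layout.
if grep -rnE '\.(only|skip)\(' test/ 2>/dev/null; then
  echo 'Found .only()/.skip() in tests; aborting push.' >&2
  exit 1
fi
exit 0
```

As the post notes, this is text matching, so it can be fooled by comments or unusual formatting, which is exactly why AVA doing the check itself would be more reliable.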
I propose a CLI command / flag that exits non-zero if `.skip()` or `.only()` is used:

```
ava --check-skips
```
You would then use that for your pre-push hook. If the tests pass and neither `.skip()` nor `.only()` was called, git will push your commits; otherwise it will display a meaningful message from AVA, even in apps like Tower. All AVA has to do is provide this mode and exit with an error when appropriate.
An alternative way to enable the mode could be an environment variable, to better support existing `npm` scripts:

```
AVA_CHECK_SKIPS=true npm test
```
If this functionality were to be implemented, we could then automate it across all collaborators on a project with tools like husky. Fewer broken PRs, anyone? :)
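For example, assuming the proposed flag existed, a husky (v4-style) setup could wire it up in `package.json` roughly like this (`--check-skips` is the flag proposed above, not an existing AVA option):

```json
{
  "scripts": {
    "test": "ava"
  },
  "husky": {
    "hooks": {
      "pre-push": "ava --check-skips"
    }
  }
}
```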