- Summary
- Motivation
- Configuration
- Scala Backends
- JVM
- Scala.js
- Scala Native
- Testing
- Test Frameworks
- Test Framework Issues
- IntelliJ Idea Integration
- Implementation Notes
- Linking and Running Scala.js and Scala Native
- Testing
- Testing on Scala.js and Scala Native
- Node.js
- Test Run Data
- Test Events
- Test Ids
- Test Output
- Testing the Tests
- Dynamic Dependencies
- Scala 2.12
- Test Tagging
- JUnit4 for Scala.js and Scala Native
- Test Detection
- Nested Tasks and Test Cases
- Gradle Internals
- Mixing Backends
- Building for Multiple Scala Versions
- AsciiDoc
- Credits
This is a Gradle plugin that enhances the Gradle Scala plugin by adding support for:
- testing with framework(s) that implement the sbt test interface (sbt.testing.Framework);
- including sources specific to the version of Scala used;
- building for a specific version of Scala;
- compiling, running and testing Scala code using the Scala.js and Scala Native backends;
- code sharing between backends;
- accessing data about the Scala backend and Scala version in the build script.
Plugin integrates with:
- Gradle test task configuration, test filtering, tagging, logging and reporting;
- IntelliJ test running and reporting;
- IntelliJ Scala plugin to correctly handle code shared between backends.
Plugin works with:
- Gradle 8.14.1;
- Scala.js 1.x;
- Node.js 16.x;
- Scala Native 0.5.x.
Plugin:
- includes sources specific to the version of Scala used;
- builds for a specific version of Scala;
- adds necessary backend-specific dependencies;
- adds necessary backend-specific Scala compiler plugins (for main and test code);
- adds necessary backend-specific Scala compiler parameters;
- for Scala.js and Scala Native, adds link tasks;
- for Scala.js, retrieves and installs the configured version of Node.js;
- for Scala.js, installs the configured Node.js modules using npm;
- exposes, via the scalaBackend extension, data about the Scala backend and Scala version for use in the build script;
- augments the test task to work with sbt-enabled test frameworks;
- configures the Scala compiler to include code shared between backends;
- configures project artifacts to include shared code when needed;
- sets the project artifact appendices in accordance with the accepted conventions.
Plugin is written in Scala 3, but the project that the plugin is applied to can use Scala 3, 2.13 or 2.12; however, plugin is not compatible with Gradle plugins written in Scala 2.12.
Gradle build file snippets below use the Groovy syntax, not the Kotlin one.
Accompanying example project that shows off some of the plugin’s capabilities is available: cross-compile-example.
I dislike untyped languages, so if I have to write JavaScript, I want to be able to do it in my preferred language - Scala; thanks to Scala.js, this is possible.
I dislike sbt - the official build tool of Scala.js, used via the Scala.js sbt plugin; I want to be able to use my preferred build tool - Gradle.
Existing Scala.js Gradle plugin seems to be no longer maintained.
Hence, this plugin.
For years, I used the Gradle ScalaTest plugin to run my ScalaTest tests. Since my plugin integrates with Gradle - and through it, with IntelliJ Idea - some of the issues that that plugin has, mine does not: Test events were not received, ASCII Control Characters Printed.
I never tried an alternative ScalaTest integration, scalatest-junit-runner; if you need JUnit5, that is probably the way to go, since my plugin does not support JUnit5 (it does support Scala.js and Scala Native though :)).
Plugin is published on the Gradle Plugin Portal; to apply it to a Gradle project:
plugins {
id 'org.podval.tools.scalajs' version '0.8.1'
}
Plugin will automatically apply the Scala plugin to the project, so there is no need to manually list id 'scala' in the plugins block - but there is no harm in it either.
Project using the plugin has to specify a version of Scala for the Scala Gradle plugin to use. One way to do it is to add the Scala library dependency explicitly, and let the Scala plugin infer the Scala version from it:
final String scalaVersion = '3.7.0'
dependencies {
implementation "org.scala-lang:scala3-library_3:$scalaVersion"
}
Another way is to set the Scala version on the Scala plugin’s extension scala, and let the Scala plugin add the appropriate Scala library dependency automatically:
final String scalaVersion = '3.7.0'
scala.scalaVersion = scalaVersion
In the examples below, the latter approach is used exclusively because:
- it is cleaner;
- it is the future: the old, inference-based approach is going away (slowly; deprecated in Gradle 9);
- some of the plugin’s functionality depends on the Scala version being set on the scala extension.
Alongside the usual Scala source root scala, as in src/main/scala and src/test/scala, plugin includes sources from Scala source roots specific to the Scala version in use; for Scala version x.y.z, the additional Scala source roots are:
- scala-x.y.z;
- scala-x.y;
- scala-x.
This applies to Scala sources shared between the backends too. Additional sources are included both in Scala compilation and in archives that package Scala sources.
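For illustration, with the Scala version set to 2.13.16 (a hypothetical choice), sources would be picked up from the following roots:

```
src/main/scala
src/main/scala-2.13.16
src/main/scala-2.13
src/main/scala-2
src/test/scala
src/test/scala-2.13.16
src/test/scala-2.13
src/test/scala-2
```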
When property org.podval.tools.scalajs.scalaVersion is set, e.g.:
$ ./gradlew -Porg.podval.tools.scalajs.scalaVersion=2.13.16
plugin builds the project for the specified version of Scala and puts the build output under build/scala-<scala version>.
Plugin exposes data about the Scala version and Scala backend in use via the scalaBackend extension that it creates. This can be used to simplify writing build scripts, e.g.:
import org.podval.tools.scalajsplugin.ScalaBackendExtension
final ScalaBackendExtension scalaVersion = scalaBackend
final String scalaJsVersion = '1.19.0'
dependencies {
testImplementation "org.scala-js:scalajs-junit-test-runtime_${scalaVersion.binary2}:$scalaJsVersion"
}
Plugin automatically adds certain dependencies to various Gradle configurations if they are not added explicitly.
Unless you want to override a version of some dependency that the plugin adds, the only dependencies you need to add to the project are the test framework(s) that you use.
As usual, artifact names have suffixes corresponding to the Scala version: _3, _2.13 or _2.12. For artifacts compiled by the non-JVM backends, another suffix indicating the backend is inserted before the Scala version suffix: for Scala.js - _sjs1, for Scala Native - _native0.5.
In the examples below, the latest versions of all dependencies are used.
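For example, the ScalaTest artifact (an illustrative choice) resolves to different names depending on the backend and Scala version:

```
org.scalatest:scalatest_3               // JVM, Scala 3
org.scalatest:scalatest_2.13            // JVM, Scala 2.13
org.scalatest:scalatest_sjs1_3          // Scala.js, Scala 3
org.scalatest:scalatest_native0.5_2.13  // Scala Native, Scala 2.13
```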
Plugin does not (yet?) support building for multiple Scala versions using only Gradle (unlike the Scala Multi-Version Plugin).
Plugin does provide enough functionality (Sources Specific to a Scala Version, Scala Version Override, Scala Backend Extension) to help automate building for multiple Scala versions using a script.
Plugin can be applied to:
- a JVM-only project (JVM);
- a Scala.js or Scala Native project (Scala.js / Scala Native);
- a mixed-backend project with some code shared between the backends (Mixed Backends).
Plugin, its name notwithstanding, provides benefits even if applied to a project that uses only Scala, without Scala.js or Scala Native, namely - the ability to use any test framework(s) that supports the sbt test interface.
For the list of test frameworks supported by the plugin, see Test Frameworks.
To use the plugin in such a way, the build.gradle file for the project, in addition to applying the plugin and setting the Scala version, needs to list the test framework(s) used in dependencies.testImplementation. Configuration of the test task cannot have useJUnit.
Any Gradle plugins providing integration with specific test frameworks must be removed from the project: plugin itself provides integration with test frameworks, in some cases - better than the dedicated test-framework-specific plugins ;)
Sources under src are processed with one specific backend; the backend used is selected by the project property org.podval.tools.scalajs.backend. The value of this property is treated as case-insensitive. This property must be set in the gradle.properties file of the project that applies the plugin: setting it in build.gradle does not work.
If this property is set to Scala.js or js, the Scala.js backend is used.
If this property is set to Scala Native or native, the Scala Native backend is used.
If this property is set to JVM or not set at all, the JVM backend is used, making this setup equivalent to the JVM one.
For example, to use the Scala.js backend for the project, put the following into the gradle.properties file of the project:
org.podval.tools.scalajs.backend=js
Plugin supports using multiple backends in the same project with some source files shared between them.
Backend-specific sources reside in backend-specific subprojects, and if a directory with the shared sources exists, the shared sources are included in each backend-specific compilation together with the backend-specific sources.
This mode is triggered when at least one of the backend-specific directories js, jvm, native exists. Not all backends have to be used all the time; with only one backend used, this setup is equivalent to the Scala.js / Scala Native one (and if that backend is jvm - to the JVM one).
Backend-specific directories must also be included as subprojects in the settings.gradle file; the directory shared does not have to be included as a subproject in settings.gradle for the Gradle build to work correctly, but it must be for the shared sources to be recognized in IntelliJ.
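For a single-module project, a minimal settings.gradle sketch (assuming all four backend-specific and shared directories exist) could look like:

```groovy
// settings.gradle: include the backend-specific subprojects;
// 'shared' is included so that IntelliJ recognizes the shared sources.
include 'shared'
include 'js'
include 'jvm'
include 'native'
```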
For multi-module projects, including every subdirectory of every module using the plugin in multi-backend mode is neither pretty nor modular:
include 'module'
include 'module:shared'
include 'module:js'
include 'module:jvm'
include 'module:native'
A better approach seems to be to create a separate settings-includes.gradle file in the module:
include 'module:shared'
include 'module:js'
include 'module:jvm'
include 'module:native'
and apply it in the overall settings.gradle file:
include 'module'
apply from: 'module/settings-includes.gradle'
For convenience, plugin writes this file automatically ;)
Gradle project names of the subprojects can be changed, but the directory names (js, jvm, native, shared) cannot: plugin looks up the subprojects by their directory names, not by their project names.
Build script for the overall project is where:
- the plugin is applied,
- the Scala version is set,
- any build logic that applies to the overall project resides.
Build scripts in the backend-specific directories are where:
- backend-specific dependencies (including test frameworks) are added,
- backend-specific tasks (including link and test) are configured,
- any build logic that applies only to a specific backend resides.
There is no need (or point) to add a build.gradle file to the shared directory: it is just a container for the code shared between the backends. There is no need (or point) to have an overall src directory, since backend-specific sources reside in the backend-specific subprojects, and sources shared between backends - in shared.
In this mode, plugin:
- applies itself to each of the backend-specific subprojects (so there is no need to apply it manually in the backend-specific build.gradle);
- propagates the Scala version set in the overall project’s build.gradle to each of the backend-specific subprojects (so there is no need to set it manually in the backend-specific build.gradle);
- configures the appropriate backend for each of the backend-specific subprojects (so there is no need to set the property org.podval.tools.scalajs.backend manually in the backend-specific gradle.properties);
- disables all source tasks and unregisters all Scala sources in the overall subproject;
- applies the scala plugin to the shared subproject;
- disables all tasks in the shared subproject.
Project layout for such setup is:
project (6)
+--- settings.gradle (1)
+--- build.gradle (2)
+--- shared
| \--- src (4)
+--- js
| +--- build.gradle (3)
| \--- src (5)
+--- jvm
| +--- build.gradle (3)
| \--- src (5)
\--- native
+--- build.gradle (3)
\--- src (5)
(1) settings file where backend-specific and shared subprojects are included
(2) build script of the overall project
(3) build scripts of the backend-specific projects
(4) sources shared between backends
(5) sources specific to a backend
(6) there are no sources in the overall project
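A minimal sketch of the build scripts for this layout; the Scala version and the test framework chosen are illustrative:

```groovy
// build.gradle of the overall project: apply the plugin and set the Scala version;
// both are propagated to the backend-specific subprojects automatically.
plugins {
  id 'org.podval.tools.scalajs' version '0.8.1'
}
scala.scalaVersion = '3.7.0'

// js/build.gradle (similarly for jvm/ and native/): only backend-specific
// dependencies and task configuration; the plugin is applied automatically.
dependencies {
  testImplementation 'org.scalatest:scalatest_sjs1_3:3.2.19'
}
```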
When running on JVM, plugin adds the SBT Test Interface org.scala-sbt:test-interface:1.0 to the testRuntimeOnly configuration: it is used by the plugin to run the tests, and is normally brought in by the test frameworks themselves, but since ScalaTest does not bring it in, plugin adds it.
dependencies {
testRuntimeOnly 'org.scala-sbt:test-interface:1.0'
}
If the org.scala-js:scalajs-library dependency is specified explicitly, plugin uses its version for all the Scala.js dependencies that it adds. Plugin creates the scalajs configuration for the Scala.js dependencies used by the plugin itself. The table below lists what is added to what configurations.
| Name | group:artifact | Backend | Configuration | Notes |
|---|---|---|---|---|
| Compiler Plugin | org.scala-js:scalajs-compiler | JVM Scala 2 | scalaCompilerPlugins | only for Scala 2 |
| JUnit Compiler Plugin | org.scala-js:scalajs-junit-test-plugin | JVM Scala 2 | testScalaCompilerPlugins | only for Scala 2 and only if JUnit4 for Scala.js is used |
| Linker | org.scala-js:scalajs-linker | JVM Scala 2 | scalajs | |
| Node.js Environment | org.scala-js:scalajs-env-jsdom-nodejs | JVM Scala 2 | scalajs | |
| Test Adapter | org.scala-js:scalajs-sbt-test-adapter | JVM Scala 2 | scalajs | |
| Scala Library for Scala.js | org.scala-lang:scala3-library | Scala.js | implementation | only for Scala 3 |
| Library | org.scala-js:scalajs-library | JVM Scala 2 | implementation | |
| DOM Library | org.scala-js:scalajs-dom | Scala.js | implementation | |
| Test Bridge | org.scala-js:scalajs-test-bridge | JVM Scala 2 | testRuntimeOnly | |
The following Gradle build script fragment manually adds all Scala.js dependencies that the plugin adds automatically:
import org.podval.tools.scalajsplugin.ScalaBackendExtension
final ScalaBackendExtension scalaVersion = scalaBackend
final String scalaJsVersion = '1.19.0'
dependencies {
implementation "org.scala-js:scalajs-library_${scalaVersion.binary2}:$scalaJsVersion" // (1)
implementation "org.scala-js:scalajs-dom_sjs1_${scalaVersion.binary}:2.8.0"
if (scalaVersion.scala3) {
implementation "org.scala-lang:scala3-library_sjs1_${scalaVersion.binary}:${scalaVersion.version}"
}
scalajs "org.scala-js:scalajs-linker_${scalaVersion.binary2}:$scalaJsVersion"
scalajs "org.scala-js:scalajs-sbt-test-adapter_${scalaVersion.binary2}:$scalaJsVersion"
scalajs "org.scala-js:scalajs-env-jsdom-nodejs_${scalaVersion.binary2}:1.1.0"
if (!scalaVersion.scala3) {
scalaCompilerPlugins "org.scala-js:scalajs-compiler_${scalaVersion.version}:$scalaJsVersion"
}
if (!scalaVersion.scala3) {
testScalaCompilerPlugins "org.scala-js:scalajs-junit-test-plugin_${scalaVersion.version}:$scalaJsVersion" // (2)
}
testRuntimeOnly "org.scala-js:scalajs-test-bridge_${scalaVersion.binary2}:$scalaJsVersion"
}
(1) if added manually, sets the scalaJsVersion used for other automatically added dependencies
(2) only if JUnit4 for Scala.js is in the testImplementation configuration
To support Scala.js, the Scala compiler needs to be configured to produce both the class and sjsir files. If the project uses Scala 3, all it takes is to pass the -scalajs option to the Scala compiler, since the Scala 3 compiler has Scala.js support built in:
tasks.withType(ScalaCompile) {
scalaCompileOptions.with {
additionalParameters = [ '-scalajs' ]
}
}
Plugin automatically adds this option to the main and test Scala compilation tasks if it is not present.
If the project uses Scala 2, Scala.js compiler plugin dependency needs to be declared:
dependencies {
scalaCompilerPlugins "org.scala-js:scalajs-compiler_$scalaVersion:1.19.0"
}
Plugin does this automatically unless a dependency on org.scala-js:scalajs-compiler is declared explicitly.
If the project uses Scala 2 and JUnit 4 for Scala.js, a JUnit Scala compiler plugin is also needed (JUnit4 for Scala.js and Scala Native):
dependencies {
testScalaCompilerPlugins "org.scala-js:scalajs-junit-test-plugin_$scalaVersion:1.19.0"
}
Plugin also adds this automatically. There is no need to add -Xplugin: Scala compiler parameters for the compiler plugins.
For linking of the main code, plugin adds the link task of type org.podval.tools.scalajsplugin.scalajs.ScalaJSLinkMainTask; all tasks of this type automatically depend on the classes task. For linking of the test code, plugin adds the testLink task of type org.podval.tools.scalajsplugin.scalajs.ScalaJSLinkTestTask; all tasks of this type automatically depend on the testClasses task.
Link tasks expose a property JSDirectory that points to the directory with the resulting JavaScript, so that it can be, for example, copied where needed:
link.doLast {
project.sync {
from link.JSDirectory
into jsDirectory
}
}
Link tasks have a number of properties that can be used to configure linking. Configurable properties with their defaults are:
link {
optimization = 'Fast' // one of: 'Fast', 'Full'
moduleKind = 'NoModule' // one of: 'NoModule', 'ESModule', 'CommonJSModule'
moduleSplitStyle = 'FewestModules' // one of: 'FewestModules', 'SmallestModules'
prettyPrint = false
}
Setting optimization to Full enables:
- Semantics.optimized;
- checkIR;
- Closure Compiler (unless moduleKind is set to ESModule).
For ScalaJSLinkMainTask tasks, a list of module initializers may also be configured:
moduleInitializers {
main {
className = '<fully qualified class name>'
mainMethodName = 'main'
mainMethodHasArgs = false
}
}
Name of the module initializer ('main' in the example above) becomes the module id.
Plugin adds the run task for running the main code (if it is an application and not a library); this task automatically depends on the link task. Additional tasks of type org.podval.tools.scalajsplugin.scalajs.ScalaJSRunMainTask can be added manually; their dependency on a corresponding ScalaJSLinkMainTask task must be set manually too.
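A sketch of adding such a task manually; the task name runDev is hypothetical, and the wiring to the default link task is an assumption about the desired setup:

```groovy
import org.podval.tools.scalajsplugin.scalajs.ScalaJSRunMainTask

// Hypothetical additional run task; its dependency on a
// ScalaJSLinkMainTask must be declared explicitly.
tasks.register('runDev', ScalaJSRunMainTask) {
  dependsOn tasks.named('link')
}
```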
For running Scala.js code and tests, plugin uses Node.js. In Scala.js mode, plugin adds the node extension to the project. This extension can be used to specify the version of Node.js to use and the Node modules to install:
node {
version = '22.15.1'
modules = ['jsdom']
}
If the Node.js version is not specified, plugin uses the "ambient" Node.js - the one installed on the machine where it is running - or, if none is available, installs the default version (22.15.1). If the Node.js version is specified, plugin installs that version.
Node.js is installed under ~/.gradle/nodejs. If no Node modules to install are listed, plugin installs the jsdom module, which is required for org.scala-js:scalajs-env-jsdom-nodejs. To get better stack traces, one can add the source-map-support module. Node modules for the project are installed in the node_modules directory in the project root. If the package.json file does not exist, plugin runs npm init private.
Plugin adds tasks node and npm for executing node and npm commands using the same version of Node.js that is used by the plugin; those tasks can be used from the command line like this:
./gradlew npm --npm-arguments 'version'
./gradlew node --node-arguments '...'
If the org.scala-native:scala3lib (for Scala 3) or org.scala-native:scalalib (for Scala 2) dependency is specified explicitly, plugin uses its version for all the Scala Native dependencies that it adds. Plugin creates the scalanative configuration for the Scala Native dependencies used by the plugin itself. The table below lists what is added to what configurations.
| Name | group:artifact | Backend | Configuration | Notes |
|---|---|---|---|---|
| Compiler Plugin | org.scala-native:nscplugin | JVM | scalaCompilerPlugins | |
| JUnit Compiler Plugin | org.scala-native:junit-plugin | JVM | testScalaCompilerPlugins | only if JUnit4 for Scala Native is used |
| Linker | org.scala-native:tools | JVM | scalanative | |
| Test Adapter | org.scala-native:test-runner | JVM | scalanative | |
| Library | org.scala-native:scala3lib | Scala Native | implementation | only for Scala 3 |
| Library | org.scala-native:scalalib | Scala Native | implementation | only for Scala 2 |
| Test Bridge | org.scala-native:test-interface | Scala Native | testRuntimeOnly | |
| Native Library | org.scala-native:nativelib | Scala Native | implementation | |
| C Library | org.scala-native:clib | Scala Native | implementation | |
| Posix Library | org.scala-native:posixlib | Scala Native | implementation | |
| Windows Library | org.scala-native:windowslib | Scala Native | implementation | |
| Java Library | org.scala-native:javalib | Scala Native | implementation | |
| Aux Library | org.scala-native:auxlib | Scala Native | implementation | |
The following Gradle build script fragment manually adds all Scala Native dependencies that the plugin adds automatically:
import org.podval.tools.scalajsplugin.ScalaBackendExtension
final ScalaBackendExtension scalaVersion = scalaBackend
final String scalaNativeVersion = '0.5.7'
dependencies {
if (scalaVersion.scala3) {
implementation "org.scala-native:scala3lib_native0.5_${scalaVersion.binary}:${scalaVersion.version}+$scalaNativeVersion" // (1)
} else {
implementation "org.scala-native:scalalib_native0.5_${scalaVersion.binary}:${scalaVersion.version}+$scalaNativeVersion" // (1)
}
implementation "org.scala-native:nativelib_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
implementation "org.scala-native:javalib_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
implementation "org.scala-native:clib_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
implementation "org.scala-native:posixlib_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
implementation "org.scala-native:windowslib_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
implementation "org.scala-native:auxlib_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
scalanative "org.scala-native:tools_${scalaVersion.binary}:$scalaNativeVersion"
scalanative "org.scala-native:test-runner_${scalaVersion.binary}:$scalaNativeVersion"
scalaCompilerPlugins "org.scala-native:nscplugin_${scalaVersion.version}:$scalaNativeVersion"
testScalaCompilerPlugins "org.scala-native:junit-plugin_${scalaVersion.version}:$scalaNativeVersion" // (2)
testRuntimeOnly "org.scala-native:test-interface_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
testImplementation "org.scala-native:junit-runtime_native0.5_${scalaVersion.binary}:$scalaNativeVersion"
}
(1) if added manually, sets the scalaNativeVersion used for other automatically added dependencies
(2) only if JUnit4 for Scala Native is in the testImplementation configuration
To support Scala Native, the Scala compiler needs to be configured to produce both the class and nir files. The Scala Native compiler plugin dependency needs to be declared:
dependencies {
scalaCompilerPlugins "org.scala-native:nscplugin_$scalaVersion:0.5.7"
}
Plugin does this automatically unless a dependency on org.scala-native:nscplugin is declared explicitly.
If the project uses JUnit 4 for Scala Native, a JUnit Scala compiler plugin is also needed (JUnit4 for Scala.js and Scala Native):
dependencies {
testScalaCompilerPlugins "org.scala-native:junit-plugin_$scalaVersion:0.5.7"
}
Plugin also adds this automatically. There is no need to add -Xplugin: Scala compiler parameters for the compiler plugins.
For linking of the main code, plugin adds the link task of type org.podval.tools.scalajsplugin.scalanative.ScalaNativeLinkMainTask; all tasks of this type automatically depend on the classes task. For linking of the test code, plugin adds the testLink task of type org.podval.tools.scalajsplugin.scalanative.ScalaNativeLinkTestTask; all tasks of this type automatically depend on the testClasses task.
Link tasks expose a property NativeDirectory that points to the directory with the Scala Native linker output, so that it can be copied where needed.
Link tasks have a number of properties that can be used to configure linking. Configurable properties with their defaults are:
TODO verify optimize default value
link {
mode = 'debug' // one of: 'debug', 'release-fast', 'release-size', 'release-full'
lto = 'none' // one of: 'none', 'thin', 'full'
gc = 'immix' // one of: 'none', 'boehm', 'immix', 'commix'
optimize = false
}
If not set explicitly, properties are set from the environment variables:
- mode - SCALANATIVE_MODE
- lto - SCALANATIVE_LTO
- gc - SCALANATIVE_GC
- optimize - SCALANATIVE_OPTIMIZE
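For example, the linking mode and garbage collector could be selected per invocation from the shell (the values chosen here are illustrative):

```shell
SCALANATIVE_MODE=release-fast SCALANATIVE_GC=commix ./gradlew link
```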
For ScalaNativeLinkMainTask tasks, the property mainClass may also be configured; this is the class that will be run.
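A minimal sketch, with a hypothetical class name:

```groovy
link {
  mainClass = 'com.example.Main' // hypothetical fully qualified main class
}
```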
Plugin adds the run task for running the main code (if it is an application and not a library); this task automatically depends on the link task. Additional tasks of type org.podval.tools.scalajsplugin.scalanative.ScalaNativeRunMainTask can be added manually; their dependency on a corresponding ScalaNativeLinkMainTask task must be set manually too.
Test task added by the plugin is derived from the normal Gradle test task, and can be configured in the traditional way - with some limitations:
- plugin applies its own Gradle test framework (useSbt) to each test task; re-configuring the Gradle test framework (via useJUnit, useTestNG or useJUnitPlatform) is not supported;
- isScanForTestClasses must be at its default value true;
- Scala.js and Scala Native tests must run in the same JVM where they are discovered, so they are not forked, and forking configuration is ignored.
Dry run (test.dryRun=true or the --test-dry-run command line option) is supported.
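For example, to list the tests that would run, without actually executing them:

```shell
./gradlew test --test-dry-run
```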
Test filtering and tagging are supported to the extent that the individual test frameworks support them; see Test Filtering, Test Tagging and Test Frameworks.
If there is a need to have test runs with different configurations, more testing tasks can be added manually.
For JVM, the type of the test task is org.podval.tools.scalajsplugin.jvm.JvmTestTask. Any such task will automatically depend on the testClasses task (and testRuntimeClassPath).
For Scala.js, the type of the test task is org.podval.tools.scalajsplugin.scalajs.ScalaJSTestTask. Such test tasks have to depend on an org.podval.tools.scalajsplugin.scalajs.ScalaJSLinkTestTask task. The test task added by the plugin does this automatically; for manually added tasks this dependency has to be added manually.
For Scala Native, the type of the test task is org.podval.tools.scalajsplugin.scalanative.ScalaNativeTestTask. Such test tasks have to depend on an org.podval.tools.scalajsplugin.scalanative.ScalaNativeLinkTestTask task. The test task added by the plugin does this automatically; for manually added tasks this dependency has to be added manually.
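A sketch of adding a second Scala.js test task manually; the task name integrationTest and the filter pattern are hypothetical:

```groovy
import org.podval.tools.scalajsplugin.scalajs.ScalaJSTestTask

// Hypothetical additional test task; for manually added tasks the
// dependency on a ScalaJSLinkTestTask must be declared explicitly.
tasks.register('integrationTest', ScalaJSTestTask) {
  dependsOn tasks.named('testLink')
  filter {
    includeTestsMatching '*IntegrationTest'
  }
}
```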
Gradle uses three sets of patterns to filter tests by name; two of them - includeTestsMatching and excludeTestsMatching - are set in the Gradle build file:
test {
filter {
includeTestsMatching "org.podval.tools.test.SomeTestClass.success"
includeTestsMatching "org.podval.tools.test.SomeTestClass.failure"
excludeTestsMatching "OtherTestClass"
}
}
The third one is set via the command-line option --tests.
Inclusion rules are:
- if both build file and command line inclusions are specified, to be included, a test must match both;
- if no inclusions nor exclusions are specified, all tests are included;
- if only inclusions are specified, only tests matching one of them are included;
- if only exclusions are specified, only tests not matching any of them are included;
- if both inclusions and exclusions are specified, only tests matching one of the inclusions and not matching any of the exclusions are included.
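For example, a command-line inclusion can be combined with the build-file filter (the class and method names are illustrative):

```shell
./gradlew test --tests 'org.podval.tools.test.SomeTestClass.success'
```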
Gradle inclusion/exclusion patterns can contain wildcards "*"; semantics of matching against those patterns is complicated, sometimes surprising and difficult (for me) to understand; that is why I followed Gradle implementation as closely as possible. Plugin implements test class inclusion/exclusion itself, but individual test case inclusion/exclusion is handled by the test framework used.
SBT test interface that the plugin uses to communicate with the test frameworks has means of expressing that a test case with specific name is to be included (TestSelector) and that test cases whose names contain a specific string are to be included (TestWildcardSelector); it does not have any means of expressing which test cases are to be excluded.
Plugin does not have access to the list of test case names (which are framework-dependent), so, even though I try to translate Gradle filtering to the SBT test interface filtering as close as possible, when test case filtering is involved, this translation can in general case lose fidelity. My immediate goal was to make sure the filtering scenarios that are used in practice work as intended; turns out, infidelities in the implementation of test case filtering in specific test frameworks make even that impossible in some cases, as is detailed below.
The following patterns specify test classes to run:
- "*": all tests, just as if no includes are specified;
- "*IntegrationTest": classes whose names end with "IntegrationTest";
- "Scala*": classes whose names start with "Scala";
- "org.podval.tools.test.Scala*": classes in the specified package whose names start with "Scala";
- "org.podval.tools.test.*": tests in the specified package (used by IntelliJ Idea, see IntelliJ Idea Integration);
- "org.podval.tools.test.ScalaTest": tests in the specified class (used by IntelliJ Idea, see IntelliJ Idea Integration).
All these patterns work as intended.
The following patterns specify test cases to run:
- "org.podval.tools.test.SomeTestClass.success": the specified test case in the specified class (used by IntelliJ Idea, see IntelliJ Idea Integration);
- "org.podval.tools.test.SomeTestClass.succ*": test cases whose names start with "succ" in the specified class.
With these patterns, what actually happens depends on the fidelity with which the test framework used implements even the restricted test case selection means of the SBT test interface.
Names of the tags to include and exclude in the run are specified in:
test {
useSbt {
includeCategories = ["itag1", "itag2"]
excludeCategories = ["etag1", "etag2"]
}
}
Inclusion rules are:
- if no inclusions nor exclusions are specified, all tests are included;
- if only inclusions are specified, only tests tagged with one of them are included;
- if only exclusions are specified, only tests not tagged with any of them are included;
- if both inclusions and exclusions are specified, only tests tagged with one of the inclusions and not tagged with any of the exclusions are included.
When running some test methods explicitly included by a filter, I do not want to see skipped methods mentioned in the test report, just as I do not want to see other skipped test classes there. I do want to see tests explicitly ignored in code (e.g., in ScalaTest, or JUnit4’s falsified assumptions). During a dry run, though, I want to see everything that was skipped, including test classes that were skipped entirely; for such classes, a test case named dry run is reported as skipped.
Some test frameworks have a notion of nested test suites, where a nesting test class aggregates nested test classes. Plugin supports such a scenario and, when the test framework involved provides sufficient information about the tests run, attributes test cases from the nested suites to them: the test report will have no test cases for the nesting class; instead, test cases will be reported for the nested classes they belong to.
Plugin replaces the test task with one that supports running sbt-compatible test frameworks; multiple test frameworks can be used at the same time.
TestNG is not supported: its SBT interface is long since abandoned.
JUnit5 is not supported, since it insists on using its own test discovery mechanism. Both Gradle and IntelliJ Idea support JUnit5 out of the box, and since there is no JUnit5 for Scala.js, there is not much the plugin can add anyway.
Framework-specific information for the frameworks that are supported follows.
| Name | group:artifact | Backends | Version | Notes |
|---|---|---|---|---|
| JUnit4 | com.github.sbt:junit-interface | jvm | 0.13.3 | Java |
| JUnit4 for Scala.js | org.scala-js:scalajs-junit-test-runtime | js | 1.19.0 | Scala 2, no backend |
| JUnit4 for Scala Native | org.scala-native:junit-runtime | native | 0.5.7 | |
| MUnit | org.scalameta:munit | jvm, js, native | 1.1.1 | |
| ScalaCheck | org.scalacheck:scalacheck | jvm, js, native | 1.18.1 | |
| ScalaTest | org.scalatest:scalatest | jvm, js, native | 3.2.19 | |
| specs2 | org.specs2:specs2-core | jvm, js | 5.6.3 | latest for Scala 2: 4.20.9 |
| uTest | com.lihaoyi:utest | jvm, js, native | 0.8.5 | |
| ZIO Test | dev.zio:zio-test-sbt | jvm | 2.1.18 | see issues/37 |
The following Gradle build script fragment adds all test framework dependencies that fit the Scala version and backend:
import org.podval.tools.scalajsplugin.ScalaBackendExtension
final ScalaBackendExtension backend = scalaBackend
final ScalaBackendExtension scalaVersion = scalaBackend // the same extension also exposes Scala version data
final String scalaJSVersion = '1.19.0'
final String scalaNativeVersion = '0.5.7'
dependencies {
if (backend.jvm) {
testImplementation "com.github.sbt:junit-interface:0.13.3"
}
if (backend.js) {
testImplementation "org.scala-js:scalajs-junit-test-runtime_${scalaVersion.binary2}:$scalaJSVersion"
}
if (backend.native) {
testImplementation "org.scala-native:junit-runtime${backend.suffix}_${scalaVersion.binary}:$scalaNativeVersion"
}
testImplementation "org.scalameta:munit${backend.suffix}_${scalaVersion.binary}:1.1.1"
testImplementation "org.scalacheck:scalacheck${backend.suffix}_${scalaVersion.binary}:1.18.1"
testImplementation "org.scalatest:scalatest${backend.suffix}_${scalaVersion.binary}:3.2.19"
testImplementation "com.lihaoyi:utest${backend.suffix}_${scalaVersion.binary}:0.8.5"
if (backend.jvm) {
testImplementation "dev.zio:zio-test-sbt${backend.suffix}_${scalaVersion.binary}:2.1.18"
}
if (backend.jvm || backend.js) {
if (scalaVersion.scala3) {
testImplementation "org.specs2:specs2-core${backend.suffix}_${scalaVersion.binary}:5.6.3"
} else {
testImplementation "org.specs2:specs2-core${backend.suffix}_${scalaVersion.binary}:4.20.9"
}
}
}
JUnit4's SBT interface (com.github.sbt:junit-interface)
is a separate project from JUnit4 itself;
the SBT interface dependency brings in the underlying framework dependency
junit:junit
transitively;
its version can be overridden in the Gradle build script.
-
test filtering: works fine;
-
ignoring a test: not supported;
-
assumptions: if falsified, result in a test being skipped:
org.junit.Assume.assumeTrue(false)
;
Tag tests with classes or traits
that do not have to be derived from anything JUnit4
-specific;
in the Gradle build file, excludeCategories
and includeCategories
list fully-qualified names of tagging classes or traits:
trait IncludedTest
trait ExcludedTest
@org.junit.experimental.categories.Category(Array(
classOf[org.podval.tools.test.IncludedTest],
classOf[org.podval.tools.test.ExcludedTest]
))
@Test def excluded(): Unit = ()
JUnit4 uses an annotation on the nesting suite to indicate that it contains nested suites:
@org.junit.runner.RunWith(classOf[org.junit.runners.Suite])
and another annotation that lists the nested suites:
@org.junit.runners.Suite.SuiteClasses(Array(
classOf[JUnit4Nested]
))
For example, JUnit4Nesting
contains JUnit4Nested
:
@org.junit.runner.RunWith(classOf[org.junit.runners.Suite])
@org.junit.runners.Suite.SuiteClasses(Array(
classOf[JUnit4Nested]
))
class JUnit4Nesting {
}
import org.junit.Test
import org.junit.Assert.assertTrue
final class JUnit4Nested {
@Test def success(): Unit = assertTrue("should be true", true)
@Test def failure(): Unit = assertTrue("should be true", false)
}
By default, JUnit4
's sbt
framework
ignores the
org.junit.runners.Suite
runner; plugin supplies appropriate
arguments to JUnit4
to enable it.
JUnit4 for Scala.js is a framework distinct from JUnit4: it is a partial translation/re-implementation of JUnit4 circa 2015 and has different capabilities.
-
test filtering: does not support test case selectors and runs all test cases in the class;
-
test tagging: not supported;
-
nested suites: not supported;
-
ignoring tests: not supported;
-
assumptions: not supported;
JUnit4 for Scala Native is a framework distinct from JUnit4: it is a port of the JUnit4 for Scala.js, which is a partial translation/re-implementation of JUnit4 circa 2015 and has different capabilities.
-
test filtering: does not support test case selectors and runs all test cases in the class;
-
test tagging: not supported;
-
nested suites: not supported;
-
ignoring tests: not supported;
-
assumptions: not supported;
MUnit uses JUnit internally, and brings in the underlying framework dependency transitively:
-
on JVM - junit:junit;
-
on Scala.js - org.scala-js:scalajs-junit-test-runtime;
-
on Scala Native - org.scala-native:junit-runtime.
Its version can be overridden in the Gradle build script.
-
test filtering: works fine on
JVM
; onScala.js
, does not support test case selectors and runs all test cases in the class. -
nested suites: not supported;
-
assumptions: not supported;
-
ignoring a test
test("test".ignore) {}
;
MUnit is based on JUnit4, so it supports the Category
-based exclusion and inclusion;
since on Scala.js MUnit uses JUnit4 for Scala.js
,
which does not support this mechanism,
MUnit does not support it either.
Plugin does not use Category
-based mechanism;
MUnit provides a different, Tag
-based mechanism,
and that is what plugin uses.
Tag tests with values that are instances of munit.Tag
:
val include = new munit.Tag("org.podval.tools.test.IncludedTest")
val exclude = new munit.Tag("org.podval.tools.test.ExcludedTest")
test("excluded".tag(include).tag(exclude)) {}
When the tagging classes used for inclusion/exclusion are not available,
MUnit crashes with a ClassNotFoundException
.
-
test filtering functionality is not available: issue;
-
test tagging: not supported, but if it is used via another test framework - like
ScalaTest
orspecs2
- test tagging mechanisms provided by that framework can be used; -
assumptions: not supported;
-
ignoring a test: not supported;
In ScalaCheck, nesting is accomplished by using
org.scalacheck.Properties.include()
:
object ScalaCheckNesting extends org.scalacheck.Properties("ScalaCheckNesting") {
include(ScalaCheckNested)
}
object ScalaCheckNested extends org.scalacheck.Properties("ScalaCheckNested") {
property("success") = org.scalacheck.Prop.passed
property("failure") = org.scalacheck.Prop.falsified
}
With ScalaCheck, nested test cases are attributed to the nesting suite - and there is nothing that can be done about it, since ScalaCheck itself does not keep information about which class a property belongs to; see typelevel/scalacheck#1107.
-
test filtering: works fine;
-
assumptions: not supported;
-
ignoring a test:
ignore should "be ignored" in {}
;
Tag tests with objects that extend org.scalatest.Tag
:
object Include extends org.scalatest.Tag("org.podval.tools.test.IncludedTest")
object Exclude extends org.scalatest.Tag("org.podval.tools.test.ExcludedTest")
"excluded" should "not run" taggedAs(Include, Exclude) in { true shouldBe false }
-
test filtering: works fine;
-
nested suites: not supported;
-
assumptions: not supported;
-
ignoring a test: not supported;
-
test filtering: does not support test case selectors and runs all test cases in the class.
-
test tagging: not supported;
Issues identified and fixed: bug.
Currently not supported on Scala.js or Scala Native because of a bug: issues/37.
-
test filtering: treats specific test case inclusions as wildcards, and instead of running just the named test cases runs all whose names contain the specified string, because the only test case name-based filtering that ZIO Test supports is "search terms", which work as wildcards;
-
ignoring a test:
test("ignored") { … } @@ zio.test.TestAspect.ignore
; -
assumption:
test("assumption") { … } @@ zio.test.TestAspect.ifProp("property")(string ⇒ false)
Tag tests with tag names using TestAspect.tag
:
test("tagged") { ... } @@ TestAspect.tag(
"org.podval.tools.test.IncludedTest",
"org.podval.tools.test.ExcludedTest"
)
import zio.test._
object ZIOTestNesting extends ZIOSpecDefault {
override def spec: Spec[TestEnvironment, Any] = suite("ZIOTestNesting")(
ZIOTestNested.spec
)
}
object ZIOTestNested extends ZIOSpecDefault {
override def spec: Spec[TestEnvironment, Any] = suite("ZIOTestNested")(
test("success") { assertTrue(1 == 1) },
test("failure") { assertTrue(1 == 0) },
)
}
It is assumed that the IDE is set up to use Gradle to run tests etc.
On JVM, whatever you can run from Idea you can also debug; Scala.js code runs on Node.js, so there is no debugging it - breakpoints have no effect; nor do they on Scala Native.
As with any other Gradle project imported into Idea, you can run Gradle tasks.
IntelliJ lets you run objects with main methods using either:
-
object node in the project tree or
-
gutter icon in the object’s file
On Scala.js or Scala Native, objects can not be run this way:
the code needs to be compiled and linked for the appropriate backend.
This is what the run
task added by the plugin is for.
As usual, when you run tests:
-
results are displayed in tree form
-
test counts are displayed.
As usual, you can run all tests from the project tree using any of the nodes:
<root>
src
test
scala
As usual, you can run all tests from a package using the package’s node in the project tree. Idea supplies Gradle test filter "selected.package.*".
As usual, you can run individual test for the frameworks Idea recognizes using either:
-
test’s node in the project tree or
-
gutter icon in the test’s file
Idea supplies Gradle test filter "fully.qualified.TestClass".
From the test frameworks this plugin supports, Idea recognizes:
-
JUnit4 (for some reason, tests cannot be run from the project tree)
-
JUnit4 for Scala.js
-
JUnit4 for Native
-
MUnit
Scala plugin for Idea recognizes (but does not reflect the results of the previous run in the gutter icon of the test):
-
ScalaTest
-
Specs2
-
uTest
Not recognized are:
-
ScalaCheck
-
ZIO Test
Since ZIO Test
tests are objects with a main method,
they can be run from Idea (when not using Scala.js),
but there is no test result tree nor test counts displayed,
and since Gradle is not involved, no test reports.
For JUnit4
, JUnit4 for Scala.js
, and JUnit4 for Scala Native
Idea also recognizes individual test methods within a test class;
they can be run using their gutter icons.
Idea supplies Gradle test filter "fully.qualified.TestClass.testMethod".
For MUnit
, only the first test method gets a gutter icon,
but Idea supplies test filter "fully.qualified.TestClass",
so that icon runs the whole class, not the test method it is for.
For ScalaTest
, every test method gets a gutter icon,
but Idea supplies test filter "fully.qualified.TestClass",
so that icon runs the whole class, not the test method it is for.
For specs2
and uTest
, there are no gutter icons for individual test methods.
It is reasonably easy, if repetitive, to configure the Scala compiler and add needed Scala.js dependencies by hand; what really pushed me to build this plugin is the difficulty and ugliness involved in manually setting up Scala.js linking in a Gradle build script.
For Scala.js, I perused:
For Scala Native, I perused:
-
Scala Native sbt plugin;
-
Mill (a little).
To figure out how sbt
itself integrates with testing frameworks, I had to untangle some sbt
code, including:
-
sbt.Defaults
-
sbt.Tests
-
sbt.TestRunner
-
sbt.ForkTests
-
org.scalajs.sbtplugin.ScalaJSPluginInternal
Turns out, internals of sbt
are a maze of twisted (code) passages,
all alike, where pieces of code are stored in key-value maps,
and addition of such maps is used as an override mechanism.
What a disaster!
There are two testing interfaces in org.scala-sbt:test-interface:1.0
;
I use the one used by the Scala.js sbt plugin - presumably the "new" one ;)
Just being able to run the tests with no integration with
Gradle or IntelliJ Idea seemed suboptimal,
so I decided to look into proper integrations of things like
org.scala-js:scalajs-sbt-test-adapter
and
org.scala-sbt:test-interface.
I perused:
This took by far the most time (and takes up more than 3/4 of the plugin code), and uncovered a number of surprises.
IntelliJ Idea instruments Gradle test task with its IJTestEventLogger
-
but only if the task is of type org.gradle.api.tasks.testing.Test
,
so that is what I derive my test task from.
Once I worked out how to integrate tests on Scala.js with Gradle and IntelliJ Idea, it was reasonably easy to re-use this integration to run tests using sbt-compatible frameworks without any Scala.js involved - in plain Scala projects.
Scala.js and Scala Native tests must be run in the same JVM
where their frameworks were instantiated
(see
org.scalajs.sbtplugin.ScalaJSPluginInternal,
scala.scalanative.sbtplugin.ScalaNativePluginInternal
).
TestExecuter
makes sure that the tests are not forked,
and TestTask
overrides
org.gradle.api.tasks.testing.Test.getMaxParallelForks()
to return 1
on Scala.js
to prevent MaxNParallelTestClassProcessor
from forking.
On JVM, exceptions are serialized in Gradle’s org.gradle.internal.serialize.ExceptionPlaceholder
, which contains lots of details;
on Scala.js, org.scalajs.testing.common.Serializer.ThrowableSerializer
turns them all into org.scalajs.testing.common.Serializer$ThrowableSerializer$$anon$3
;
since source mapping is used only on Scala.js,
there is no point trying to preserve the original exception:
it is already lost;
so just wrap what remains in TestExecutionException
.
Node.js
support that the plugin provides
is heavily inspired by (read: copied and reworked from :))
gradle-node-plugin.
That plugin is not used directly because its tasks are not reusable unless the plugin is applied to the project, and I do not want to apply Node Gradle plugin to every project that uses my Scala.js Gradle plugin.
Also, I want to be able to run npm
from within my code without creating tasks.
Also, I would like to be able to use Node available via GraalVM’s polyglot support.
My simplified Node support is under 300 lines.
Test detection produces more information than just the class name:
-
framework that recognized the test
-
fingerprint
-
selectors
I need to deliver this additional information to forked test processors.
For a while, I used a modified serializer for this; of course, the serializer is hard-coded in the Gradle code, so to use mine I had to modify three Gradle files…
I even made a pull request to add flexibility in this regard to Gradle - but then I realized that I can encode additional information I need to get to the worker in the test class name!
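The encoding trick can be sketched like this (the separator and field order are made up for illustration; the plugin's actual encoding may differ):

```scala
// Illustrative sketch: smuggle the framework name, fingerprint and selector
// information through the only string Gradle hands to the worker - the "class name".
object TestClassNameCodec:
  private val Separator: String = "#"

  def encode(framework: String, fingerprint: String, className: String): String =
    s"$framework$Separator$fingerprint$Separator$className"

  def decode(encoded: String): (String, String, String) =
    encoded.split(Separator, 3) match
      case Array(framework, fingerprint, className) => (framework, fingerprint, className)
      case _ => throw new IllegalArgumentException(s"malformed encoded test class name: $encoded")
```

The worker decodes the string back before instantiating the framework, so no changes to Gradle's serializers are needed.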
Turns out that IntelliJ Idea integration only works when all the calls to the IJ listener happen from the same thread (it probably uses some thread-local variable to set up cross-process communications). Since some of the calls are caused by the call-back from the sbt testing interface’s event handler, I get "Test events were not received" in the Idea test UI. It would have been nice if this fact was documented somewhere :( I coded an event queue with its own thread, but then discovered that:
-
Gradle provides a mechanism that ensures that all the calls are made from the same thread:
Actor.createActor.getProxy
; -
when tests are forked,
MaxNParallelTestClassProcessor
is used, which already does that, so I do not need to; -
when running on
Scala.js
everything is single-threaded anyway.
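The hand-rolled event queue mentioned above - made redundant by Gradle's Actor.createActor.getProxy - amounted to something like this (illustrative sketch, not the plugin's code):

```scala
import java.util.concurrent.Executors

// Illustrative: funnel all listener call-backs through one thread, so that
// the thread-sensitive IntelliJ listener always hears from the same thread.
final class SingleThreadDispatcher:
  private val executor = Executors.newSingleThreadExecutor()
  def dispatch(callback: Runnable): Unit = executor.execute(callback)
  def shutdown(): Unit = executor.shutdown()
```

Every call-back from the sbt testing interface's event handler would be wrapped in a Runnable and handed to dispatch, instead of being delivered on whatever thread the framework happened to use.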
org.gradle.internal.remote.internal.hub.DefaultMethodArgsSerializer
seems to make a decision which serializer registry to use based on the
outcome of the SerializerRegistry.canSerialize()
call
for the class of the first parameter of a method;
test id is the first parameter of the TestResultProcessor.output()
, completed()
and failure()
calls.
Without some tricks like registering a serializer for AnyRef
and disambiguating
in the SerializerRegistry.build()
call,
neither null
nor String
are going to work as ids.
This is probably the reason why Gradle:
-
makes all test ids
CompositeIdGenerator.CompositeId
-
registers a
Serializer[CompositeIdGenerator.CompositeId]
inTestEventSerializer
.
Gradle just wants to attract attention to its TestEventSerializer
,
so it registers serializers for the types
of the first parameters of all methods - including the test ids ;)
And since the minimum number of composed ids is two, Gradle uses test ids that are composites of two Longs.
AbstractTestTask installs StateTrackingTestResultProcessor
which keeps track of all tests that are executing in any TestWorker
.
That means that test ids must be scoped per TestWorker
.
Each TestWorker
has an idGenerator
which it uses to generate WorkerTestClassProcessor.workerSuiteId
;
that same idGenerator
can be used to generate sequential ids
for the tests in the worker,
satisfying the uniqueness requirements - and resulting in the test ids always being
a composite of exactly two Longs!
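A sketch of the resulting id scheme (names are illustrative; the real code uses Gradle's idGenerator and CompositeIdGenerator.CompositeId):

```scala
import java.util.concurrent.atomic.AtomicLong

// Illustrative: per-worker sequential ids; every test id is then
// a composite of exactly two Longs - (workerId, sequence number).
final class WorkerScopedIdGenerator(workerId: Long):
  private val next = new AtomicLong(0)
  def generate(): (Long, Long) = (workerId, next.incrementAndGet())
```

Scoping the sequence to the worker keeps ids unique across all TestWorkers tracked by StateTrackingTestResultProcessor.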
Because tests are scoped by the workers, it does not seem possible to group test results by framework.
Since I can not use the real rootTestSuiteId
that DefaultTestExecuter
supplies to the TestMainAction
- because it is a String
-
and I am not keen on second-guessing what it is anyway,
I use an idPlaceholder
in WorkerTestClassProcessor
and change it to the real one in FixUpRootTestOutputTestResultProcessor
.
Gradle controls the formatting of the test output:
-
indenting is hard-coded in the TestEventLogger.onOutput();
-
addition of the test name and the name of the output stream at the top of each indented batch (output of the same test) is hard-coded in the AbstractTestLogger.logEvent().
IntelliJ Idea, in addTestListener.groovy
:
-
suppresses the output and error events and
-
adds its own test and output listener IJTestEventLogger that does no batching, indenting or adding.
I coded a neat way to test the plugin itself and various features of the various frameworks and their support by the plugin: Feature, Fixture, ForClass, GroupingFunSpec, SourceFile, TestProject.
I coded a neat way to add dependencies dynamically;
code to do this is in org.podval.tools.build. It can:
-
detect versions of Scala and specific dependencies;
-
add dependencies to configurations;
-
expand the classpath.
This allows the plugin to add dependencies with correct versions, built for the correct version of Scala, which may differ from the one the plugin itself uses, so that Scala 2.12 can be supported.
Classpath expansion allows the plugin to use classes from dependencies that are added dynamically, but since they become available only after classpath is expanded, they can only be used indirectly; that is why such classes are only mentioned by name in dedicated intermediate classes.
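The indirection can be sketched like this (the class name below is hypothetical, purely for illustration):

```scala
// Illustrative: a class from a dynamically-added dependency can not be
// referenced at compile time; after the classpath is expanded, it is
// loaded by name and used via reflection (or via a known interface).
def loadDynamically(classLoader: ClassLoader): AnyRef =
  classLoader
    .loadClass("org.example.hypothetical.LinkerImpl") // name only - no compile-time reference
    .getDeclaredConstructor()
    .newInstance()
    .asInstanceOf[AnyRef]
```

This is why such classes are only mentioned by name in dedicated intermediate classes: a direct reference would fail to load before the classpath is expanded.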
When running on JVM (and not on Scala.js), tests are forked into a separate JVM. Code involved in this is running on the project’s, not the plugin’s, version of Scala.
If the project uses Scala 2.13, Scala 3 classes like scala/runtime/LazyVals$
are missing; this is remedied by adding Scala 3 library to the
worker’s implementation classpath in TestFramework
.
If that version is 2.12, any use of 2.13-exclusive features breaks the code, so I wrote it defensively, to support 2.12 even though the code was compiled by Scala 3. Essentially, I use arrays and my own implementations of the array operations (see Scala212Collections).
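A sketch of the kind of helper this entails (illustrative; the actual Scala212Collections code may differ), assuming reference element types:

```scala
// Illustrative: an array "map" that avoids 2.13-only collection methods,
// so bytecode compiled by Scala 3 still runs against the 2.12 library.
def arrayMap[A, B](as: Array[A], f: A => B): Array[B] =
  val result = new Array[AnyRef](as.length)
  var i = 0
  while i < as.length do
    result(i) = f(as(i)).asInstanceOf[AnyRef]
    i += 1
  result.asInstanceOf[Array[B]]
```

Plain array indexing compiles to direct array access, so nothing from the 2.13 collections library is needed at runtime.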
Some of the issues:
-
java.lang.NoClassDefFoundError: scala/collection/StringOps$
-
java.lang.NoClassDefFoundError: scala/collection/IterableOnce
-
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps()
-
java.lang.NoSuchMethodError: scala.Predef$.wrapRefArray()
-
java.lang.NoSuchMethodError: scala.collection.immutable.Map.updated()
Among affected are: FrameworkDescriptor, OptionStyle, DryRunSbtTask, RunTestClassProcessor, Fingerprints, Selectors, TaskDefs, TestClassRunForking…
Some of the affected code runs even when using Scala.js, and it works without those compatibility changes; this is probably because within the JVM running Gradle, Scala 2.13 library is on the classpath, even if the project uses Scala 2.12…
I’d rather uglify my code a little than fight with classpath though ;)
Although it is tempting to help the test frameworks out by
filtering tests based on their tags
returned by the test framework in task.tags
, it is:
-
unnecessary, since all tagging-capable test frameworks that the plugin supports accept arguments that allow them to do the filtering internally;
-
destructive, since none of the test frameworks the plugin supports populate
task.tags
, so with explicit tag inclusions, none of the tests run!
Turns out, JUnit4 for Scala.js
and JUnit4 for Scala Native
assume existence of a bootstrapper
in every test class - apparently, because test discovery for JUnit4
is based on annotations, and reflection on Scala.js
and Scala Native
is not powerful enough, so tests are pre-discovered at compile time,
and JUnit4-specific bootstrappers are generated for them.
Without bootstrappers, we get errors like:
Error while loading test class ... failed:
java.lang.ClassNotFoundException: Cannot find ...$scalajs$junit$bootstrapper$
For Scala.js
on Scala 3, bootstrappers are generated by the Scala.js
compiler;
for Scala.js
on Scala 2, and always for Scala Native
,
to get the bootstrappers generated,
a dedicated Scala compiler plugin has to be added:
for Scala.js - org.scala-js:scalajs-junit-test-plugin
,
for Scala Native - org.scala-native:junit-plugin
.
This compiler plugin can only be added when JUnit4
is actually on the classpath - or Scala compiler breaks ;)
It thus must be added only to the test Scala compilation and not to the main one;
since plugins added to the scalaCompilerPlugins
configuration affect both
the test and the main Scala compilations,
plugin creates a separate configuration testScalaCompilerPlugins
just for this one plugin
(even when the JVM backend, which does not need it, is used) ;)
Plugin needs to associate a test framework and a fingerprint with each test class, so it uses its own test detector.
This is why file-name based test scan is not supported
(isScanForTestClasses
must be at its default value true
):
name of the test class is not sufficient to determine which test framework
the class belongs to.
This is also why JUnit5
is not supported:
it insists on discovering the tests itself, as a comment on the JupiterTestFingerprint.annotationName()
says:
return The name of this class. This is to ensure that SBT does not find any tests so that we can use JUnit Jupiter’s test discovery mechanism.
Well, mission accomplished: my test detector does not find any tests either.
Originally, I coded a test detection mechanism that used
analysis file generated by the Scala compiler.
This code was later replaced with a traditional mechanism
based on scanning the class files,
similar to the mechanism used by Gradle for test detection with JUnit4
and TestNG
.
If a class file is recognized by more than one framework
(e.g. MUnit
tests, which are also JUnit4
tests),
it is attributed to the framework whose fingerprint is closer to
the test class in the hierarchy (e.g. MUnit
).
If a test class is encountered with more than one framework claiming it at the same distance in the hierarchy (which does not happen naturally, but can be constructed), mistake is assumed, a warning is issued, and the class is ignored.
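The distance computation can be sketched via runtime reflection (a simplification: the detection actually works on class files, but the idea is the same):

```scala
import scala.annotation.tailrec

// Illustrative: distance (in superclass hops) from a candidate test class
// to a framework's fingerprinted base class; the smallest distance wins.
def distanceTo(clazz: Class[?], fingerprintBaseName: String): Option[Int] =
  @tailrec def loop(c: Class[?], distance: Int): Option[Int] =
    if c == null then None
    else if c.getName == fingerprintBaseName then Some(distance)
    else loop(c.getSuperclass, distance + 1)
  loop(clazz, 0)
```

For an MUnit test class, the MUnit fingerprint sits closer in the hierarchy than the JUnit4 one, so MUnit claims the class.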
On Scala.js
, annotations are not available at runtime
(Scala.js compiler does not add RuntimeVisibleAnnotations
to the class file),
so this mechanism alone does not detect tests that are marked as such
using annotations.
Currently, the only test framework that marks tests as tests using annotations
is JUnit4 for Scala.js
.
When JUnit4 for Scala.js
is on the classpath,
for each test class candidate
plugin looks for the bootstrapper left behind by the Scala.js compiler
(or, on Scala 2, Scala compiler plugin that generates bootstrappers).
Presence of a bootstrapper TestClass$scalajs$junit$bootstrapper$
is treated as a presence of the @Test
annotation on TestClass
,
which marks it as a test belonging to the JUnit4 for Scala.js
test framework.
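In code, the check amounts to little more than a name lookup (a sketch; the real detector works on the set of classes found on the test classpath):

```scala
// Illustrative: the Scala.js compiler (or the compiler plugin on Scala 2)
// emits a bootstrapper object for every JUnit4 test class;
// its presence marks the class as a JUnit4 for Scala.js test.
def hasJUnit4Bootstrapper(classNamesOnClasspath: Set[String], testClassName: String): Boolean =
  classNamesOnClasspath.contains(testClassName + "$scalajs$junit$bootstrapper$")
```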
sbt
test interface allows test framework to return nested tasks
when executing a task;
of the test frameworks supported by the plugin,
only ScalaCheck
uses this mechanism:
it returns test cases of the test class being executed
as nested tasks (with TestSelector
).
All other frameworks run the test cases directly and report the results via event handler; what selector is reported depends on the test framework:
-
most test frameworks use
TestSelector
; -
uTest
usesNestedTestSelector
; -
ScalaTest
usesNestedTestSelector
for test cases from the nested suites; -
JUnit4
,JUnit4 for Scala.js
andMUnit
useTestSelector
even for test cases from the nested suites, but they prepend the name of the class to the test case name (both in the selector and in the event’sfullyQualifiedName
); plugin makes sure to attribute test cases to the correct test classes.
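Stripping the prepended class name can be sketched as (illustrative only):

```scala
// Illustrative: JUnit4-style frameworks report "fully.qualified.Class.testName";
// strip the known class-name prefix to recover the bare test case name.
def bareTestName(reported: String, className: String): String =
  if reported.startsWith(className + ".")
  then reported.substring(className.length + 1)
  else reported
```

Once the prefix is recognized, the test case can be attributed to the nested class it actually belongs to.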
To stop tests from being forked - which is needed to run tests on Scala.js -
I had to fork org.gradle.api.internal.tasks.testing.detection.DefaultTestExecuter
(see DefaultTestExecuter).
This is suboptimal, since I now have to track changes to the forked class.
My proposal to expose an extension point that would allow to avoid the fork was rejected:
32666,
32656;
that made it pretty clear that other modifications to Gradle that would make my code
cleaner would be rejected too, so I did not even bother;
here are examples of resulting ugliness:
-
to add to the implementation class path of
WorkerProcessBuilder
, I had to use reflection in SbtTestFramework; -
to set test framework on the test task, I had to use reflection in TestTask;
-
to set options on the test framework, I copied
org.gradle.api.tasks.testing.Test.options
: it is private and too short to bother with reflection; -
to call
ForkedTestClasspath.getApplicationClasspath()
I had to use reflection, since it returnsorg.gradle.internal.impldep.com.google.common.collect.ImmutableList
, which is not accessible from the plugin and results injava.lang.NoSuchMethodError
; -
since Gradle’s internal copy of
org.ow2.asm:asm
is underimpldep
and is not accessible to the plugin, I had to add an explicit dependency onorg.ow2.asm:asm
; -
org.gradle.api.tasks.testing.Test.testsAreNotFiltered()
callsTest.noCategoryOrTagOrGroupSpecified()
, which recognizes only the test frameworks explicitly supported by Gradle (JUnit
andTestNG
); since I can not override it, I just useorg.gradle.api.tasks.testing.junit.JUnitOptions
asSbtTestFrameworkOptions
.
My original approach was to use Gradle’s features to scope source sets and tasks
belonging to different backends within the same project;
this was implemented in the unpublished version 0.7.9
.
This approach was deemed too complicated to use (and too fragile an implementation) and was replaced with the current approach where backend-specific entities are scoped by backend-specific subprojects.
TODO describe the "duplicate content roots" issue and my solution; speculate about the approach of the IntelliJ Scala plugin:
https://youtrack.jetbrains.com/issue/IDEABKL-6745/Cannot-define-two-identical-content-roots-in-different-module-within-a-single-project - Scala Plugin for IntelliJ IDEA; - org.jetbrains.sbt.project.sources.SharedSourcesModuleType - its description: "During compilation, dependency to a shared sources module mixes in module sources rather than module output"
GitHub stupidly disables AsciiDoc includes in README; see the discussion.
One include (of the versions.adoc
in README.adoc
)
is not enough to bother with AsciiDoctor Reducer,
so I just patch the README.adoc…
I also write versions to gradle.properties
and use them in build.gradle
.
I want to thank the maintainers of:
-
sbt test framework implementation for JUnit4;
I want to thank:
-
maiflai for the ScalaTest Gradle plugin;
-
gtache for the existing Scala.js Gradle plugin;
-
srs for the Node.js Gradle Plugin;
-
gzm0 for the Stack Overflow answer that was extremely helpful for understanding how the Scala.js linker should be called;
-
zstone1 for the encouragement and for requesting basic testing functionality;
-
machaval for the encouragement, for requesting support for Scala 2.12 and for helping me understand the limits of such support;
-
qwqawawow for a bug report;
-
a01fe for a bug report;
-
kyri-petrou for helping me fix ZIO Test’s treatment of test wildcards and test name matching;
-
hearnadam for approving my fix to ZIO Test’s test name matching;
-
sjrd for the helpful text Implementing Scala.JS Support for Scala 3 and for working with me on fixing issues with JUnit4 for Scala.js reporting of test failure throwable and test duration;
-
cheeseng for helping me understand the problem with running nested ScalaTest suites using my plugin;
-
tgodzik for accepting my fix for MUnit’s reporting of test duration;
-
etorreborre for accepting my fix for specs2’s treatment of test wildcards;
-
ekrich for encouraging my fix for JUnit4 for Scala Native reporting of test failure throwable and duration for Scala Native;
-
WojciechMazur for pointing me towards Mill code for Scala Native, for accepting my fix to the test failure throwable and duration for Scala Native, for adding
scala.scalanative.build.Build.buildCachedAwait()
method upon my request, and for accepting my fix to a spurious dependency issue; -
LeeTibbert for encouraging my typo fixes to Scala Native;