Simplify creations of separate benchmark source sets in multiplatform projects #291


Open
fzhinkin opened this issue Feb 11, 2025 · 7 comments
Labels
enhancement New feature or request

Comments

@fzhinkin
Collaborator

The documentation describes a way to place benchmarks in separate source sets. That works when you need a single source set associated with a particular target. However, creating a separate common benchmark source set whose benchmarks can be executed on each and every target is quite difficult.

It would be nice to provide an extension function on Project that does this automatically, instead of delegating this tedious work to users.
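
To make the request concrete, such an extension could look roughly like the sketch below. This is only an illustration: the function name registerSharedBenchmarkSourceSet and its defaults are invented, not part of the plugin, and the wiring mirrors the manual configuration discussed later in this thread.

```kotlin
import org.gradle.api.Project
import org.jetbrains.kotlin.gradle.dsl.KotlinMultiplatformExtension

// Hypothetical plugin API sketch; registerSharedBenchmarkSourceSet is an
// invented name, not something kotlinx-benchmark provides today.
fun Project.registerSharedBenchmarkSourceSet(name: String = "benchmarkMain") {
    val kotlin = extensions.getByType(KotlinMultiplatformExtension::class.java)
    val shared = kotlin.sourceSets.maybeCreate(name)
    kotlin.targets.matching { it.name != "metadata" }.all { target ->
        val main = target.compilations.getByName("main")
        target.compilations.create("benchmark") { benchmarkCompilation ->
            // Benchmarks can see the target's main code...
            benchmarkCompilation.associateWith(main)
            // ...and share the common benchmark sources.
            benchmarkCompilation.defaultSourceSet.dependsOn(shared)
        }
        // The matching "<target>Benchmark" entries would still need to be
        // registered in the benchmark { targets { ... } } block,
        // or automated in a similar fashion.
    }
}
```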

@fzhinkin fzhinkin added the enhancement New feature or request label Feb 11, 2025
@fzhinkin
Collaborator Author

This is what the "manual" configuration looks like now:

kotlin {
    sourceSets {
        // A shared source set holding cross-platform benchmark code.
        val benchmarkMain by creating {
            dependencies {
                implementation("org.jetbrains.kotlinx:kotlinx-benchmark-runtime:0.4.13")
            }
        }
    }

    val bmm = sourceSets.getByName("benchmarkMain")

    // For every real target (the "metadata" target is skipped), create a
    // "benchmark" compilation that can see the target's main code via
    // associateWith and pulls in the shared benchmark sources.
    targets.matching { it.name != "metadata" }.all {
        compilations.create("benchmark") {
            associateWith(this@all.compilations.getByName("main"))
            defaultSourceSet {
                dependencies {
                    implementation("org.jetbrains.kotlinx:kotlinx-benchmark-runtime:0.4.13")
                }
                dependsOn(bmm)
            }
        }
    }

    // Register each new compilation with the benchmark plugin;
    // "<target>Benchmark" resolves to the "benchmark" compilation of <target>.
    targets.matching { it.name != "metadata" }.all {
        benchmark.targets.register("${name}Benchmark")
    }
}

Yes, it could be cleaned up and simplified. Still, it's hard to come up with something like this unless you've spent enough time writing and debugging build scripts.

@lppedd

lppedd commented Feb 18, 2025

A good default for the plugin would be to create benchmark source sets based on the active platforms.
For example, with js() and jvm():

commonMain
commonTest
commonBenchmark
jsMain
jsTest
jsBenchmark
jvmMain
jvmTest
jvmBenchmark
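
Assuming that layout, the dependsOn hierarchy could be wired as sketched below. Note this is an assumption about how the plugin would organize things: in practice the <target>Benchmark source sets would likely be created automatically as the default source sets of "benchmark" compilations, and visibility into each target's main code would come from associating those compilations with the main compilations, as in the manual snippet earlier in the thread.

```kotlin
kotlin {
    js()
    jvm()
    sourceSets {
        // commonBenchmark is the shared ancestor of all benchmark source sets.
        val commonBenchmark by creating
        // Each platform benchmark source set depends on the shared one,
        // mirroring the commonMain -> jsMain/jvmMain relationship.
        val jsBenchmark by creating { dependsOn(commonBenchmark) }
        val jvmBenchmark by creating { dependsOn(commonBenchmark) }
    }
}
```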

@cubuspl42

cubuspl42 commented Apr 23, 2025

I understand the idea of the word "benchmark" appearing in the second slot of this magical tuple: commonMain is the primary code, jvmMain is JVM-specific primary code (depends on commonMain), commonBenchmark is cross-platform benchmarking code, jvmBenchmark is JVM-specific benchmarking code (depends on commonBenchmark), and so on.

But what's the role of benchmarkMain? Isn't commonBenchmark a good-enough greatest common ancestor of all benchmarking source sets?

@fzhinkin
Collaborator Author

Yep, <srcset name>Benchmark seems like a much better alternative to benchmarkMain.

@cubuspl42

That's what you mentioned in the last comment. Do you know how we could "polyfill" exactly this setup?

@cubuspl42

cubuspl42 commented Apr 23, 2025

I found an official example of shared cross-platform benchmark code, but it lives in commonMain; it doesn't show how to separate the benchmarking code into its own source set.

@fzhinkin
Collaborator Author

Do you know how we could "polyfill" exactly this setup?

@cubuspl42, I guess something like #291 (comment) but with different source set names.
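
Concretely, that could look like the untested sketch below. The per-target default source sets of a "benchmark" compilation are already named jvmBenchmark, jsBenchmark, and so on, so only the shared source set needs renaming from benchmarkMain to commonBenchmark:

```kotlin
kotlin {
    sourceSets {
        // commonBenchmark replaces benchmarkMain as the shared ancestor.
        val commonBenchmark by creating {
            dependencies {
                implementation("org.jetbrains.kotlinx:kotlinx-benchmark-runtime:0.4.13")
            }
        }
    }

    targets.matching { it.name != "metadata" }.all {
        compilations.create("benchmark") {
            associateWith(this@all.compilations.getByName("main"))
            // This compilation's default source set is "<target>Benchmark"
            // (e.g. jvmBenchmark), matching the naming proposed above.
            defaultSourceSet.dependsOn(sourceSets.getByName("commonBenchmark"))
        }
        benchmark.targets.register("${name}Benchmark")
    }
}
```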
