
Conversation

andy-hf-kwok
Contributor

Which issue does this PR close?

Partially closes #2255

Rationale for this change

Fix the Scala compiler warnings reported under strict warning flags, as tracked in #2255.

What changes are included in this PR?

  • Disable -Xlint:_ and -Ywarn-dead-code, since satisfying them would require a large refactor / package relocation.
  • Address all numeric-widen issues.
  • Reorder imports.
  • Replace map.put with update so the call returns Unit (side effect only); see the sketch after this list.
  • Replace references to MapStatus with AnyRef to avoid directly referencing a package-private member.
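A minimal sketch of the put → update change (the map and key here are hypothetical): mutable.Map.put returns the previous binding as an Option, which can be reported as a discarded value under strict warning settings, while update returns Unit.

```scala
import scala.collection.mutable

val metrics = mutable.Map.empty[String, Long]

// put returns the previous binding as Option[Long]; ignoring that value is
// what value-discard style checks flag.
// val previous: Option[Long] = metrics.put("rows", 42L)

// update returns Unit, so the side effect carries no discarded return value.
metrics.update("rows", 42L)
```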

How are these changes tested?

mvn install -pl :comet-spark-spark3.5_2.12 -Pstrict-warnings

And verify that mvn produces the artifact with all tests passing.

Signed-off-by: Andy HF Kwok <[email protected]>

inputs: Seq[Attribute],
binding: Boolean,
conf: SQLConf): Option[ExprOuterClass.AggExpr] = {
@unused conf: SQLConf): Option[ExprOuterClass.AggExpr] = {
Member

If this is unused, can we just remove it?
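
For reference, a minimal sketch of the @unused pattern shown in the diff above, assuming scala.annotation.unused is available in the project's Scala version; the method and parameter below are hypothetical. Keeping an annotated parameter instead of deleting it matters when the signature has to stay uniform across similar methods.

```scala
import scala.annotation.unused

// @unused suppresses the unused-parameter warning (e.g. -Wunused / -Ywarn-unused)
// while leaving the method signature unchanged for callers.
def convert(@unused conf: Map[String, String], value: Int): Int =
  value * 2
```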

partitionBuilder.addPartitionedFile(fileBuilder.build())
})
nativeScanBuilder.addFilePartitions(partitionBuilder.build())
nativeScanBuilder.addFilePartitions(partitionBuilder.build()); ()
Member

This seems incomprehensible.
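
For reference: the trailing () presumably discards the builder returned by addFilePartitions, so that the last expression of the enclosing Unit method does not trip a value-discard check such as -Ywarn-value-discard. A minimal self-contained sketch under that assumption (the builder class and names are hypothetical):

```scala
import scala.collection.mutable.ListBuffer

// Hypothetical builder whose add method returns `this` for chaining.
final class PartitionBuilder {
  private val partitions = ListBuffer.empty[String]
  def addFilePartition(p: String): PartitionBuilder = { partitions += p; this }
}

// Without the trailing `()`, the non-Unit builder value would be the last
// expression of this Unit method and would be reported as a discarded value.
def register(builder: PartitionBuilder, partition: String): Unit = {
  builder.addFilePartition(partition); ()
}
```

An arguably more readable way to get the same effect is val _ = builder.addFilePartition(partition).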

ExprOuterClass.Expr
.newBuilder()
.setBound(boundExpr)
.build()))
Contributor

Not sure if this formatting error is intended?

<arg>-Xlint:_</arg>
UnsupportedException being thrown on SparkPlan default.
<arg>-Ywarn-dead-code</arg>
-->
Contributor

Not sure if this comment is necessary, or maybe I am missing something here?

def isSparkVersionAtLeast355: Boolean = {
VersionUtils.majorMinorPatchVersion(SPARK_VERSION_SHORT) match {
case Some((major, minor, patch)) => (major, minor, patch) >= (3, 5, 5)
case Some((major, minor, patch)) => (major, minor, patch) >= ((3, 5, 5))
Contributor

Unintended formatting issue?
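
One possible reading of the extra parentheses: with a single pair, the three version numbers would be auto-tupled into the single tuple argument that >= expects, an adaptation that -Xlint:adapted-args reports, whereas ((3, 5, 5)) passes the tuple explicitly. A minimal sketch, with the Ordering import as an assumption about how tuple comparison is brought into scope here:

```scala
import scala.math.Ordering.Implicits._

// The explicit tuple argument avoids the argument adaptation (auto-tupling)
// that -Xlint:adapted-args would otherwise warn about.
def isAtLeast355(major: Int, minor: Int, patch: Int): Boolean =
  (major, minor, patch) >= ((3, 5, 5))
```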


import org.apache.hadoop.fs.Path

import org.apache.spark.sql.SparkSession
Contributor

Unintended formatting issue?

logInfo(s"Setting $extensionKey=$extensionClass")
conf.set(extensionKey, extensionClass)
conf.set(extensionKey, extensionClass); ()
} else {
Contributor

Perhaps an unintended ()?

override protected def doPrepare(): Unit = {
// Materialize the future.
relationFuture
relationFuture; ()
Contributor

Perhaps an unintended ()?

var mapStatus: MapStatus = _
// Store MapStatus opaquely as AnyRef,
// to avoid private[spark] visibility issues; cast back when needed.
var mapStatus: AnyRef = _
Contributor

Any reason why we are replacing the MapStatus type annotation with AnyRef?
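
For reference, a hedged sketch of the pattern described by the code comment in the diff above: the field is declared as AnyRef so its type does not name the private[spark] MapStatus, and callers cast back to the concrete type where they have the visibility to do so. The class and member names below are hypothetical.

```scala
final class ShuffleWriteState {
  // Held opaquely so the declared field type does not reference the
  // package-private class.
  var mapStatus: AnyRef = _

  def setStatus(status: AnyRef): Unit = { mapStatus = status }

  // The cast is unchecked: a little compile-time safety is traded away to
  // avoid naming the restricted type in this signature.
  def statusAs[T]: T = mapStatus.asInstanceOf[T]
}
```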

@coderfender
Contributor

@andy-hf-kwok, thank you for the PR. I would recommend running make format and make release to sort out the unwanted formatting / compile issues.

Thank you

@codecov-commenter

codecov-commenter commented Oct 13, 2025

Codecov Report

❌ Patch coverage is 70.96774% with 9 lines in your changes missing coverage. Please review.
✅ Project coverage is 41.79%. Comparing base (f09f8af) to head (7003584).
⚠️ Report is 596 commits behind head on main.

Files with missing lines Patch % Lines
...park/src/main/scala/org/apache/spark/Plugins.scala 0.00% 2 Missing ⚠️
...e/spark/sql/comet/CometBroadcastExchangeExec.scala 0.00% 2 Missing ⚠️
...pache/spark/sql/comet/CometColumnarToRowExec.scala 0.00% 2 Missing ⚠️
...n/scala/org/apache/comet/rules/CometScanRule.scala 0.00% 1 Missing ⚠️
...apache/spark/sql/comet/CometCollectLimitExec.scala 0.00% 1 Missing ⚠️
...ark/sql/comet/CometTakeOrderedAndProjectExec.scala 0.00% 1 Missing ⚠️
Additional details and impacted files
@@              Coverage Diff              @@
##               main    #2558       +/-   ##
=============================================
- Coverage     56.12%   41.79%   -14.33%     
- Complexity      976     1105      +129     
=============================================
  Files           119      147       +28     
  Lines         11743    13642     +1899     
  Branches       2251     2369      +118     
=============================================
- Hits           6591     5702      -889     
- Misses         4012     6974     +2962     
+ Partials       1140      966      -174     

☔ View full report in Codecov by Sentry.

@andy-hf-kwok
Contributor Author

Hi @coderfender @wForget, thanks for the review.
After going through your comments, I realized that fixing all warnings in a single PR might not be the best approach, since it could get quite complex.
Instead, I'm planning to split this effort into multiple PRs, each addressing a specific warning option individually.
Below is the first one, which aims to address the numeric-widen issue:
#2588

Thanks,
