
[AURON #2013] Auto-lock Scala version in Spark profiles.#2014

Open
slfan1989 wants to merge 1 commit into apache:master from slfan1989:auron-2013

Conversation


@slfan1989 slfan1989 commented Feb 17, 2026

Which issue does this PR close?

Closes #2013

Rationale for this change

Currently, when developing in IntelliJ IDEA, users need to manually select both a Spark profile (e.g., spark-3.5) and a Scala profile (e.g., scala-2.12) to ensure correct compilation. This two-step configuration is error-prone and can lead to compatibility issues if mismatched.

This change simplifies the workflow by automatically locking the Scala version and compiler configuration when a Spark profile is selected.

What changes are included in this PR?

  1. Scala version properties:
  • Spark 3.x profiles: scalaVersion=2.12, scalaLongVersion=2.12.18
  • Spark 4.x profiles: scalaVersion=2.13, scalaLongVersion=2.13.17
  2. scala-maven-plugin configuration:
  • Spark 3.x: Added semanticdb-scalac + paradise compiler plugins
  • Spark 4.x: Added -Ymacro-annotations args + semanticdb-scalac plugin (without paradise)
  • Used combine.self="override" to prevent configuration merging with base settings
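As a sketch, a Spark 3.x profile after this change takes roughly the following shape (illustrative only; the exact versions and plugin list follow the description above, and the real diff appears later in this page):

```xml
<!-- Illustrative sketch of a Spark 3.x profile after this change -->
<profile>
  <id>spark-3.5</id>
  <properties>
    <scalaVersion>2.12</scalaVersion>
    <scalaLongVersion>2.12.18</scalaLongVersion>
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <configuration>
          <!-- combine.self="override" replaces, rather than merges with,
               the base plugin configuration -->
          <compilerPlugins combine.self="override">
            <compilerPlugin>
              <groupId>org.scalameta</groupId>
              <artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
              <version>${semanticdb.version}</version>
            </compilerPlugin>
            <compilerPlugin>
              <groupId>org.scalamacros</groupId>
              <artifactId>paradise_${scalaLongVersion}</artifactId>
              <version>${scalamacros.paradise.version}</version>
            </compilerPlugin>
          </compilerPlugins>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>
```

Selecting such a profile in IDEA now pins both the Scala version properties and the compiler-plugin set in one step.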

Are there any user-facing changes?

Yes. Users now only need to select a Spark profile (e.g., spark-3.5) in IDEA; the corresponding Scala version is configured automatically, with no need to manually select a separate scala-2.12 or scala-2.13 profile.

How was this patch tested?

Verified in IntelliJ IDEA by:

  1. Selecting only spark-3.5 profile → Scala 2.12.18 correctly configured
  2. Selecting only spark-4.0 profile → Scala 2.13.17 correctly configured
  3. No need to select separate scala-2.12 or scala-2.13 profiles

Signed-off-by: slfan1989 <slfan1989@apache.org>
@github-actions github-actions bot added the build label Feb 17, 2026
@slfan1989 (author) commented:

  • When we select Spark 3.5, IntelliJ IDEA automatically switches to the matching Scala configuration. [screenshot]
  • When we select Spark 4.1, IntelliJ IDEA automatically switches to the matching Scala configuration. [screenshot]

Copilot AI left a comment
Pull request overview

This PR simplifies the IntelliJ IDEA development workflow by auto-locking Scala versions within Spark profiles, eliminating the need to manually select separate Scala profiles. When a Spark profile is activated, the corresponding Scala version and compiler configuration are now set automatically.

Changes:

  • Added scalaVersion and scalaLongVersion properties to all Spark profiles (3.0-3.5 use Scala 2.12.18; 4.0-4.1 use Scala 2.13.17)
  • Configured scala-maven-plugin in each Spark profile with appropriate compiler plugins (semanticdb + paradise for Spark 3.x; semanticdb with -Ymacro-annotations for Spark 4.x)
  • Used combine.self="override" to prevent configuration merging with base settings


Comment on lines +1027 to +1040
<args combine.self="override">
<arg>-Ymacro-annotations</arg>
<arg>-Wconf:cat=deprecation:wv,any:e</arg>
<arg>-Wconf:cat=other-nullary-override:s</arg>
<arg>-Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s</arg>
<arg>-Wconf:msg=Auto-application to \`\(\)\` is deprecated:s</arg>
<arg>-Wconf:msg=object JavaConverters in package collection is deprecated:s</arg>
<arg>-Wconf:cat=unchecked&amp;msg=outer reference:s</arg>
<arg>-Wconf:cat=unchecked&amp;msg=eliminated by erasure:s</arg>
<arg>-Wconf:cat=unused-nowarn:s</arg>
<arg>-Wconf:msg=early initializers are deprecated:s</arg>
<arg>-Wconf:cat=other-match-analysis:s</arg>
<arg>-Wconf:cat=feature-existentials:s</arg>
</args>
Copilot AI commented Feb 25, 2026

The args configuration in the spark-4.0 profile is missing several compiler arguments that are present in the scala-2.13 profile (lines 1227-1250). Specifically missing:

  • -deprecation
  • -feature
  • -Ywarn-unused
  • -Xfatal-warnings
  • -Wconf:msg=method newInstance in class Class is deprecated:s
  • -Wconf:msg=class ThreadDeath in package lang is deprecated:s

These should be included to maintain consistency with the scala-2.13 profile that would have been used previously.
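A sketch of what a consolidated args block could look like if the flags listed above were carried into the Spark 4.x profiles (this merges the missing flags with the block quoted from this diff; whether -Xfatal-warnings belongs in the IDE profile is a project decision, and the existing -Wconf entries are elided here):

```xml
<args combine.self="override">
  <arg>-deprecation</arg>
  <arg>-feature</arg>
  <arg>-Ywarn-unused</arg>
  <arg>-Xfatal-warnings</arg>
  <arg>-Ymacro-annotations</arg>
  <!-- ...the -Wconf entries already present in the profile... -->
  <arg>-Wconf:msg=method newInstance in class Class is deprecated:s</arg>
  <arg>-Wconf:msg=class ThreadDeath in package lang is deprecated:s</arg>
</args>
```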

Comment on lines +1098 to +1111
<args combine.self="override">
<arg>-Ymacro-annotations</arg>
<arg>-Wconf:cat=deprecation:wv,any:e</arg>
<arg>-Wconf:cat=other-nullary-override:s</arg>
<arg>-Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s</arg>
<arg>-Wconf:msg=Auto-application to \`\(\)\` is deprecated:s</arg>
<arg>-Wconf:msg=object JavaConverters in package collection is deprecated:s</arg>
<arg>-Wconf:cat=unchecked&amp;msg=outer reference:s</arg>
<arg>-Wconf:cat=unchecked&amp;msg=eliminated by erasure:s</arg>
<arg>-Wconf:cat=unused-nowarn:s</arg>
<arg>-Wconf:msg=early initializers are deprecated:s</arg>
<arg>-Wconf:cat=other-match-analysis:s</arg>
<arg>-Wconf:cat=feature-existentials:s</arg>
</args>
Copilot AI commented Feb 25, 2026

The args configuration in the spark-4.1 profile is missing several compiler arguments that are present in the scala-2.13 profile (lines 1227-1250). Specifically missing:

  • -deprecation
  • -feature
  • -Ywarn-unused
  • -Xfatal-warnings
  • -Wconf:msg=method newInstance in class Class is deprecated:s
  • -Wconf:msg=class ThreadDeath in package lang is deprecated:s

These should be included to maintain consistency with the scala-2.13 profile that would have been used previously.

Comment on lines +790 to +802
<compilerPlugins combine.self="override">
<compilerPlugin>
<groupId>org.scalameta</groupId>
<artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
<version>${semanticdb.version}</version>
</compilerPlugin>
<compilerPlugin>
<groupId>org.scalamacros</groupId>
<artifactId>paradise_${scalaLongVersion}</artifactId>
<version>${scalamacros.paradise.version}</version>
</compilerPlugin>
</compilerPlugins>
</configuration>
Copilot AI commented Feb 25, 2026

The scala-2.12 profile (lines 1177-1207) will automatically activate when scalaVersion=2.12 due to its activation condition. Since all Spark 3.x profiles now set this property, both the Spark profile and the scala-2.12 profile will be active simultaneously.

While the use of combine.self="override" in the Spark profiles should prevent configuration merging, this creates a potential maintenance issue and could lead to unexpected behavior if the override attribute is removed or if Maven's profile merging behavior changes. Consider either:

  1. Removing the scala-2.12 and scala-2.13 profiles entirely if they are no longer needed
  2. Modifying their activation conditions to prevent conflicts with Spark profiles

Comment on lines 784 to 977
@@ -789,7 +813,31 @@
<sparkVersion>3.1.3</sparkVersion>
<shortSparkVersion>3.1</shortSparkVersion>
<nettyVersion>4.1.51.Final</nettyVersion>
<scalaVersion>2.12</scalaVersion>
<scalaLongVersion>2.12.18</scalaLongVersion>
</properties>
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<compilerPlugins combine.self="override">
<compilerPlugin>
<groupId>org.scalameta</groupId>
<artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
<version>${semanticdb.version}</version>
</compilerPlugin>
<compilerPlugin>
<groupId>org.scalamacros</groupId>
<artifactId>paradise_${scalaLongVersion}</artifactId>
<version>${scalamacros.paradise.version}</version>
</compilerPlugin>
</compilerPlugins>
</configuration>
</plugin>
</plugins>
</build>
</profile>

<profile>
@@ -800,7 +848,31 @@
<sparkVersion>3.2.4</sparkVersion>
<shortSparkVersion>3.2</shortSparkVersion>
<nettyVersion>4.1.68.Final</nettyVersion>
<scalaVersion>2.12</scalaVersion>
<scalaLongVersion>2.12.18</scalaLongVersion>
</properties>
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<compilerPlugins combine.self="override">
<compilerPlugin>
<groupId>org.scalameta</groupId>
<artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
<version>${semanticdb.version}</version>
</compilerPlugin>
<compilerPlugin>
<groupId>org.scalamacros</groupId>
<artifactId>paradise_${scalaLongVersion}</artifactId>
<version>${scalamacros.paradise.version}</version>
</compilerPlugin>
</compilerPlugins>
</configuration>
</plugin>
</plugins>
</build>
</profile>

<profile>
@@ -811,7 +883,31 @@
<sparkVersion>3.3.4</sparkVersion>
<shortSparkVersion>3.3</shortSparkVersion>
<nettyVersion>4.1.74.Final</nettyVersion>
<scalaVersion>2.12</scalaVersion>
<scalaLongVersion>2.12.18</scalaLongVersion>
</properties>
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<compilerPlugins combine.self="override">
<compilerPlugin>
<groupId>org.scalameta</groupId>
<artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
<version>${semanticdb.version}</version>
</compilerPlugin>
<compilerPlugin>
<groupId>org.scalamacros</groupId>
<artifactId>paradise_${scalaLongVersion}</artifactId>
<version>${scalamacros.paradise.version}</version>
</compilerPlugin>
</compilerPlugins>
</configuration>
</plugin>
</plugins>
</build>
</profile>

<profile>
@@ -822,7 +918,31 @@
<sparkVersion>3.4.4</sparkVersion>
<shortSparkVersion>3.4</shortSparkVersion>
<nettyVersion>4.1.87.Final</nettyVersion>
<scalaVersion>2.12</scalaVersion>
<scalaLongVersion>2.12.18</scalaLongVersion>
</properties>
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<compilerPlugins combine.self="override">
<compilerPlugin>
<groupId>org.scalameta</groupId>
<artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
<version>${semanticdb.version}</version>
</compilerPlugin>
<compilerPlugin>
<groupId>org.scalamacros</groupId>
<artifactId>paradise_${scalaLongVersion}</artifactId>
<version>${scalamacros.paradise.version}</version>
</compilerPlugin>
</compilerPlugins>
</configuration>
</plugin>
</plugins>
</build>
</profile>

<profile>
@@ -833,7 +953,31 @@
<sparkVersion>3.5.8</sparkVersion>
<shortSparkVersion>3.5</shortSparkVersion>
<nettyVersion>4.1.96.Final</nettyVersion>
<scalaVersion>2.12</scalaVersion>
<scalaLongVersion>2.12.18</scalaLongVersion>
</properties>
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<compilerPlugins combine.self="override">
<compilerPlugin>
<groupId>org.scalameta</groupId>
<artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
<version>${semanticdb.version}</version>
</compilerPlugin>
<compilerPlugin>
<groupId>org.scalamacros</groupId>
<artifactId>paradise_${scalaLongVersion}</artifactId>
<version>${scalamacros.paradise.version}</version>
</compilerPlugin>
</compilerPlugins>
</configuration>
Copilot AI commented Feb 25, 2026

There is significant duplication of the scala-maven-plugin configuration across all six Spark 3.x profiles (spark-3.0 through spark-3.5). Each profile contains identical compiler plugin configurations (semanticdb-scalac and paradise).

Consider extracting this common configuration into a shared parent profile or property to reduce duplication and simplify maintenance. For example, create a spark-3.x parent profile that all Spark 3.x profiles inherit from.
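Maven profiles cannot inherit from one another, so a "spark-3.x parent profile" is not directly expressible; one common alternative (a sketch only, not part of this PR) is to hoist the shared configuration into the base build's pluginManagement, parameterized by the properties each Spark profile already sets:

```xml
<!-- Sketch (not in this PR): shared scala-maven-plugin defaults declared
     once in the base <build>; each Spark profile's properties fill in
     ${scalaLongVersion}, so the per-profile <plugin> blocks shrink -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <configuration>
          <compilerPlugins>
            <compilerPlugin>
              <groupId>org.scalameta</groupId>
              <artifactId>semanticdb-scalac_${scalaLongVersion}</artifactId>
              <version>${semanticdb.version}</version>
            </compilerPlugin>
          </compilerPlugins>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
```

One caveat: the paradise compiler plugin applies only to the Scala 2.12 profiles, so either the shared block holds just the semanticdb plugin (as sketched) with paradise added per 3.x profile, or the 2.13 profiles must override the plugin list.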
