[SPARK-28939][SQL][FOLLOWUP] Fix JDK11 compilation due to ambiguous reference #25738


Closed · wants to merge 1 commit

Conversation

@dongjoon-hyun (Member) commented Sep 9, 2019

### What changes were proposed in this pull request?

This PR aims to recover the JDK11 compilation with a workaround.
For now, the master branch is broken as follows, due to a Scala bug (scala/bug#10418) which is fixed in `2.13.0-RC2`.

```
[ERROR] [Error] /spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SQLExecutionRDD.scala:42: ambiguous reference to overloaded definition,
both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
match argument types (java.util.Map[String,String])
```
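
For context, here is a minimal standalone sketch of the ambiguity and of the workaround this PR applies; the `sqlConfigs` value below is a stand-in for the real executor-side configuration map, not the actual Spark code.

```scala
import java.util.Properties

object PutAllAmbiguity {
  def main(args: Array[String]): Unit = {
    // Stand-in for the SQL configs propagated to executors.
    val sqlConfigs: Map[String, String] =
      Map("spark.sql.shuffle.partitions" -> "200")

    val props = new Properties()

    // On JDK11 with Scala 2.12, the direct call does not compile: JDK 9+
    // overrides putAll in Properties itself, and scalac (scala/bug#10418)
    // reports it as ambiguous with the inherited Hashtable.putAll:
    //   props.putAll(sqlConfigs.asJava)   // ambiguous reference on JDK11

    // Workaround used by this PR: copy entries one by one through the
    // single, non-overloaded setProperty(String, String).
    sqlConfigs.foreach { case (k, v) => props.setProperty(k, v) }
    println(props)
  }
}
```

Because `setProperty` takes two `String` arguments and has no overloads, overload resolution never comes into play, so the same source compiles on both JDK8 and JDK11.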

### Why are the changes needed?

This workaround recovers JDK11 compilation.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

Manual build with JDK11, because this is a JDK11 compilation fix.

  • Jenkins builds with JDK8 and tests with JDK11.
  • GitHub Actions will verify this after merging.

@dongjoon-hyun (Member, Author) commented Sep 9, 2019

cc @mgaido91, @cloud-fan, @hvanhovell, @tgravescs, @viirya, @srowen.

@SparkQA commented Sep 10, 2019

Test build #110377 has finished for PR 25738 at commit 54b44c7.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dongjoon-hyun (Member, Author)

Thank you for the review and approval, @srowen, @viirya, @HyukjinKwon, @maropu, @cloud-fan.
Merged to master.

@dongjoon-hyun dongjoon-hyun deleted the SPARK-28939 branch September 10, 2019 03:35
```diff
@@ -39,7 +39,7 @@ class SQLExecutionRDD(
   private val sqlConfigs = conf.getAllConfs
   private lazy val sqlConfExecutorSide = {
     val props = new Properties()
-    props.putAll(sqlConfigs.asJava)
+    sqlConfigs.foreach { case (k, v) => props.setProperty(k, v) }
```
Contributor

I know it is a bit late, but why are we not setting the configurations directly?

Contributor

@mgaido91 any idea?

Contributor

The reason why I created the Properties object was to have it in a synchronized block, but since we are in a lazy val it is probably useless...
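
A small sketch of the point being made here: a Scala lazy val initializer is guaranteed to run at most once, even under concurrent first access, so the locking that Properties inherits from the synchronized Hashtable buys nothing during initialization. The names below are illustrative, not Spark code.

```scala
object LazyValOnce {
  @volatile private var inits = 0

  // The initializer is guarded by Scala's built-in lazy-val lock, so it
  // runs exactly once even when many threads race on first access.
  lazy val settings: Map[String, String] = {
    inits += 1
    Map("k" -> "v")
  }

  def main(args: Array[String]): Unit = {
    val threads = (1 to 8).map(_ => new Thread(() => { settings; () }))
    threads.foreach(_.start())
    threads.foreach(_.join())
    println(s"initializer ran $inits time(s)") // prints: initializer ran 1 time(s)
  }
}
```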

Contributor

@mgaido91 can you send a follow-up PR to clean it up? Now we can't set the SQL confs in a batch, and I don't think it's useful to have this Properties object.
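
For illustration, a hedged sketch of the direction suggested here: build the executor-side SQLConf directly and skip the Properties object entirely. `withLocalConfs` is a hypothetical helper, not the actual follow-up code, and the sketch assumes the `SQLConf.setConfString` and `SQLConf.withExistingConf` APIs from `org.apache.spark.sql.internal`.

```scala
import org.apache.spark.sql.internal.SQLConf

object LocalConfSketch {
  // Hypothetical helper: run `body` with the given SQL configs visible
  // through SQLConf.get, with no java.util.Properties object involved.
  def withLocalConfs[T](sqlConfigs: Map[String, String])(body: => T): T = {
    val conf = new SQLConf
    // setConfString stores each (key, value) pair, validating known entries.
    sqlConfigs.foreach { case (k, v) => conf.setConfString(k, v) }
    // withExistingConf makes `conf` the thread-local SQLConf while body runs.
    SQLConf.withExistingConf(conf)(body)
  }
}
```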

Member Author

@hvanhovell and @mgaido91, feel free to make a follow-up if we need one. Also, please apply that change in the backporting PR, #25734, before merging, for consistency.

Contributor

Sure @cloud-fan, I am doing that. Thanks for the reminder @dongjoon-hyun; I'll update the backport PR once this is stable. Thanks.

PavithraRamachandran pushed a commit to PavithraRamachandran/spark that referenced this pull request Sep 15, 2019
[SPARK-28939][SQL][FOLLOWUP] Fix JDK11 compilation due to ambiguous reference

### What changes were proposed in this pull request?

This PR aims to recover the JDK11 compilation with a workaround.
For now, the master branch is broken as follows, due to a [Scala bug](scala/bug#10418) which is fixed in `2.13.0-RC2`.
```
[ERROR] [Error] /spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SQLExecutionRDD.scala:42: ambiguous reference to overloaded definition,
both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
match argument types (java.util.Map[String,String])
```

- https://github.com/apache/spark/actions (JDK11 build monitoring)

### Why are the changes needed?

This workaround recovers JDK11 compilation.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

Manual build with JDK11, because this is a JDK11 compilation fix.
- Jenkins builds with JDK8 and tests with JDK11.
- GitHub Actions will verify this after merging.

Closes apache#25738 from dongjoon-hyun/SPARK-28939.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>