[FLINK-5005] WIP: publish scala 2.12 artifacts #3703
Conversation
|
@DieBauer thanks for taking this on! I haven't been using Flink with Scala, but I think this will be important to have for the May release. The required changes for type inference are interesting; I'm puzzled why this would regress. Also, if developers are writing against 2.10 then these issues will not manifest until integration tests are run (the same problem you are experiencing). One other thought: since Scala 2.12 requires Java 8, is it still necessary to specify the jdk8 profile? Flink Forward starts Monday, so developer activity will be low this week. @StephanEwen thoughts when you have the chance? |
|
I'm running into an issue with ASM in, for example, the flink-scala module when compiling with 2.12. Since the ClosureCleaner was initially copied from Spark, I looked there and found an issue (apache/spark#9512) regarding ASM 5 and Java 8; ours comes from org.ow2.asm:asm. There is something going on in the shade plugin configuration in the parent pom with regard to relocating the ASM dependency, but I'm not sure how that all works out, so for now I'm a bit puzzled why we get this error. @greghogan you're right, the jdk8 profile only enables the module with the Java 8 examples. But since those are also compiled in the scala-2.11 case, I thought we'd want to have them? We can drop it of course. |
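For reference, a minimal sketch of the underlying constraint (not from this PR; AsmVersionProbe is a made-up name, and it assumes an org.ow2.asm:asm jar on the classpath): ASM only accepts Java 8 class files (major version 52) from version 5 onwards, so any ASM 4.x that leaks in, shaded/relocated or not, fails as soon as it reads Java 8 bytecode.

```scala
import org.objectweb.asm.{ClassReader, ClassVisitor, Opcodes}

object AsmVersionProbe {
  def probe(className: String): Unit = {
    // With an asm 4.x jar on the classpath, this constructor throws
    // IllegalArgumentException for class files newer than major version 51;
    // with asm 5.x it accepts Java 8 bytecode.
    val reader = new ClassReader(className)
    reader.accept(new ClassVisitor(Opcodes.ASM5) {}, ClassReader.SKIP_DEBUG)
  }

  def main(args: Array[String]): Unit =
    probe("java.util.function.Function") // a class compiled with Java 8 (version 52)
}
```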
|
Ok, I started looking into the issue a bit more, starting from the Scala 2.12.0-M3 release notes (https://github.com/scala/scala/releases/tag/v2.12.0-M3).
Our ClosureCleaner uses the class name for instantiating the ClassReader, which is used later on. However, since Scala 2.12 no longer generates anonymous classes for lambdas, the class file isn't found (null) and we get a ClassNotFoundException, which makes sense now. We have to look into how to handle this new way of generating 'lambdas'. A small technical example: the test class that threw the exception behaves differently when compiled/executed with Scala 2.11 than with Scala 2.12. Concluding, in my view the current ClosureCleaner doesn't suffice for the new Scala 2.12 lambda style. Any thoughts on how to proceed? |
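To make the failure mode concrete, here is a minimal sketch (not Flink's actual ClosureCleaner; classReaderFor is a made-up helper mimicking the Spark-style lookup by class name, and it assumes org.ow2.asm:asm on the classpath):

```scala
import org.objectweb.asm.ClassReader

object ClosureLookupExample {
  // Made-up helper: look up the bytecode of a closure's class by name, the way
  // a Spark/Flink-style ClosureCleaner does, and wrap it in an ASM ClassReader.
  def classReaderFor(cls: Class[_]): Option[ClassReader] = {
    val resource = cls.getName.replaceFirst("^.*\\.", "") + ".class"
    Option(cls.getResourceAsStream(resource)).map(in => new ClassReader(in))
  }

  def main(args: Array[String]): Unit = {
    val f: Int => Int = x => x + 1
    // Scala 2.11: `f` is an instance of a compiler-generated anonymous class
    // with a real .class file on the classpath, so this prints Some(...).
    // Scala 2.12: `f` is created at runtime via invokedynamic/LambdaMetafactory,
    // its class has no .class file to load, getResourceAsStream returns null,
    // and a cleaner that relies on this lookup fails -- hence the exception above.
    println(classReaderFor(f.getClass))
  }
}
```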
[ERROR] /Users/jens/Development/flink/flink-tests/src/test/java/org/apache/flink/test/javaApiOperators/GroupCombineITCase.java:54: error: not found: type TestExecutionMode
[INFO] public GroupCombineITCase(TestExecutionMode mode) {
Maybe related to scala/bug#10207?
java.lang.IllegalStateException: Failed to transform class with name scala.concurrent.duration.Duration. Reason: javassist.bytecode.InterfaceMethodrefInfo cannot be cast to javassist.bytecode.MethodrefInfo
See: http://stackoverflow.com/questions/31189086/powermock-and-java-8-issue-interfacemethodrefinfo-cannot-be-cast-to-methodrefin#37217871
|
One thing you can try and do is to run [...]. @twalthr may have some thoughts on that as well... |
|
We cannot add any other dependencies to the pom files; adding "akka" back will create a conflict with the "flakka" files. What we can do is either of the following two options: [...]
We also cannot add more Travis build profiles (builds take too long already). We need to keep that number as it is and simply have one of the existing profiles use Scala 2.12 rather than, for example, 2.10. |
|
Going in the direction of dropping flakka would be a big win for Flink users like us who deal every day with SBT/Maven hacking to be able to use vanilla Akka with Flink. Such a pain... |
|
@Joan - I actually agree with you. We needed to use "flakka" to be able to support Java 7 and bind to a wildcard address (across interfaces). Would be great to be able to do that differently and not have a custom akka build (at least for Java 8 / Scala 2.11 / 2.12) |
We are in 2017 |
|
About 10% of Flink users are on Java 7 (we did a poll recently). |
|
@joan38 I have some WIP for a flag that allows you to use vanilla Akka when running on Java 8 / Scala 2.11. Here is the branch: https://github.com/StephanEwen/incubator-flink/commits/vanilla_akka
You can try to build it via: [...] |
|
@StephanEwen Nice! That looks promising. |
|
Where can I watch the status page that tells us when we can get off of old 2.11? |
|
@frankbohman just watch the JIRA issue: https://issues.apache.org/jira/browse/FLINK-5005 |
|
Any news on this? |
|
@joan38 there has been a discussion on the mailing list about dropping Java 7 support (no one has objected) which will make it simpler to support Scala 2.12 in the upcoming release. |
|
@greghogan That's pretty good news. |
|
We are really looking forward to this 👍 |
|
Is there any news on this? |
|
So I guess this PR is abandoned? |
|
@DieBauer Do you still want to work on this? I also started trying to make Flink ready for 2.12 before I noticed this older branch. I'd be very happy to stop, though, if you're interested in bringing this to an end. It should be easier now that we dropped Java 7 support and also agreed to drop Scala 2.10 support. |
|
Hi, I'm sorry for the late reaction. I haven't found the time to work on this anymore (priorities have also shifted...), so this pull request is stale (it could still be used as a reference). I think the main challenge is in serialising the Java 8 style lambdas, and dropping support for Scala 2.10 and Java 7 certainly helps in taming the pom.xml profiles. I will close this pull request so as not to keep hopes up.
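For whoever picks this up, a rough sketch of what serialising those lambdas involves (illustrative only, not from this PR; serializedForm is a made-up helper, and it mirrors how an indylambda-aware cleaner can detect Java 8 style lambdas):

```scala
import java.lang.invoke.SerializedLambda

object LambdaSerializationProbe {
  // Made-up helper: a serializable Java 8 / Scala 2.12 lambda carries a
  // synthetic writeReplace method returning a SerializedLambda that describes
  // the capturing class and implementation method, instead of serialising a
  // plain anonymous-class instance as Scala 2.11 closures do.
  def serializedForm(closure: AnyRef): Option[SerializedLambda] =
    try {
      val writeReplace = closure.getClass.getDeclaredMethod("writeReplace")
      writeReplace.setAccessible(true)
      writeReplace.invoke(closure) match {
        case sl: SerializedLambda => Some(sl)
        case _                    => None
      }
    } catch {
      case _: NoSuchMethodException => None // not an indy-style lambda
    }

  def main(args: Array[String]): Unit = {
    val f: Int => Int = x => x + 1
    // Scala 2.12: prints Some(<implMethodName>); Scala 2.11: prints None,
    // because the closure is an ordinary (serializable) anonymous class.
    println(serializedForm(f).map(_.getImplMethodName))
  }
}
```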
mvn clean verify has been executed successfully locally or a Travis build has passed.

This is an initial approach to make Flink Scala 2.12 ready.
I've introduced profiles to switch between 2.12, 2.11 and 2.10. All three profiles now compile.
Build with mvn clean install -D$version, where $version is scala-2.12, scala-2.11 or scala-2.10.

To overcome the flakka artifacts (a custom Akka 2.3 build) not being available for Scala 2.12, I've replaced them with the latest typesafe-akka artifacts when using the 2.12 profile.

TravisCI profiles are added and I've changed the initial release script to accommodate 2.12, but this is by no means finished.
I encountered a lot of compilation errors because types could not be inferred, so I've added explicit types to the problematic expressions.
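I don't know which expressions needed annotations in this branch, but here is a hypothetical illustration of the general pattern (MapFunction and DataSetLike are simplified stand-ins, not Flink's real classes): when a higher-order method is overloaded for both a Scala function and a Java-style SAM interface, Scala 2.12's SAM conversion can make a bare lambda ambiguous or leave its parameter type uninferred, and the usual fix is to write the types out.

```scala
// Simplified stand-ins for Flink's Java MapFunction and its Scala DataSet API;
// these are NOT the real Flink classes, just enough to show the pattern.
trait MapFunction[T, O] { def map(value: T): O }

class DataSetLike[T](values: Seq[T]) {
  def map[O](fun: T => O): DataSetLike[O] = new DataSetLike(values.map(fun))
  def map[O](fun: MapFunction[T, O]): DataSetLike[O] =
    new DataSetLike(values.map(v => fun.map(v)))
}

object TypeInferenceExample {
  val ds = new DataSetLike(Seq(1, 2, 3))

  // ds.map(x => x + 1)
  // On 2.11 only the `T => O` overload accepts a lambda, so inference works;
  // on 2.12 SAM conversion also makes the MapFunction overload a candidate,
  // which (depending on the exact signatures) can end in "missing parameter
  // type" or an ambiguity error.

  // Writing the type out removes the ambiguity, since SAM conversion only
  // applies to function literals, not to values that already are functions;
  // annotating the lambda's parameter type is the other typical fix.
  val inc: Int => Int = x => x + 1
  val ok: DataSetLike[Int] = ds.map(inc)
}
```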
The Kafka 0.10 dependency is bumped to 0.10.1.1, since that's the first version released for Scala 2.12.
There is some trickery in the connector parent pom because only kafka-0.10 is released for 2.12; kafka-0.9 and kafka-0.8 aren't compiled for 2.12. I have to look into that a little more.
More updated dependencies:
javassist was bumped because of java.lang.IllegalStateException: Failed to transform class with name scala.concurrent.duration.Duration. Reason: javassist.bytecode.InterfaceMethodrefInfo cannot be cast to javassist.bytecode.MethodrefInfo, which led me to: http://stackoverflow.com/questions/31189086/powermock-and-java-8-issue-interfacemethodrefinfo-cannot-be-cast-to-methodrefin#37217871
twitter-chill was bumped to 0.7.7, which is cross-built for 2.12.
grizzled-slf4j was bumped to version 1.3.0 for Scala 2.12.
scalatest was bumped to version 3.0.1 for Scala 2.12.
Right now I'm trying to make the Travis build succeed.
I will squash all commits once this succeeds.
Any other suggestions are welcome!