[SPARK-19810][BUILD][CORE] Remove support for Scala 2.10 #17150
Conversation
R/pkg/DESCRIPTION
Outdated
(Will fix this -- probably need to make the build stop changing this)
This and the SequenceFileRDDFunctions change are the only questionable ones. This should be the equivalent declaration that context bounds desugar into, but the question is whether this actually induces an obscure API change.
These didn't need to be context bounds in the first place
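(For readers unfamiliar with the shorthand, a minimal sketch, not the Spark code under review, of the equivalence discussed above: a context bound is just sugar for an extra implicit evidence parameter.)

```scala
import scala.reflect.ClassTag

// Context-bound form: [T: ClassTag] is shorthand for an implicit ClassTag[T] parameter.
def singleton[T: ClassTag](x: T): Array[T] = Array(x)

// The declaration the context bound desugars into, with the evidence parameter
// written out. Both forms compile to the same signature, which is why writing
// the desugared form by hand should be equivalent; the open question raised
// above is whether any subtle API difference sneaks in.
def singletonExplicit[T](x: T)(implicit evidence: ClassTag[T]): Array[T] = Array(x)
```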
dev/change-version-to-2.11.sh
Outdated
These scripts were deprecated wrappers
Test build #73843 has finished for PR 17150 at commit
The context bound changes won't work on the two public methods: quite unfortunately, they generate signatures that take the same arguments but in differing orders. This is one of those tiny breaking changes that will have to wait for Spark 3.x. As I recall they still work in Scala 2.12, even though they are deprecated.
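(A rough sketch of the problem, with simplified types rather than Spark's actual signatures: the compiler turns the bounds into implicit evidence parameters in an order it chooses, and a hand-written rewrite that lists them in a different order is a different signature.)

```scala
import scala.reflect.ClassTag

// View bound plus context bound, similar in shape to the two public methods
// discussed here (types simplified, not Spark's).
def register[R <% Seq[Int] : ClassTag](initial: R): R = initial

// A hand-written "equivalent": same type parameter, same explicit argument,
// but the implicit evidence parameters are listed in whatever order the author
// chooses. If that order differs from the compiler-generated one, callers see
// a method taking the same arguments in a different order, i.e. the small
// breaking change that has to wait for Spark 3.x.
def registerRewritten[R](initial: R)
    (implicit ct: ClassTag[R], asSeq: R => Seq[Int]): R = initial
```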
Test build #73899 has finished for PR 17150 at commit
Test build #73930 has finished for PR 17150 at commit
bin/load-spark-env.cmd
Outdated
I commented bits like this out, but updated it as well for when 2.12 needs to be supported as an alternative
dev/create-release/release-build.sh
Outdated
This was redundant with the line 8 lines above
Test build #73994 has finished for PR 17150 at commit
Test build #3596 has finished for PR 17150 at commit
Test build #73999 has finished for PR 17150 at commit
Test build #3597 has finished for PR 17150 at commit
Test build #75707 has finished for PR 17150 at commit
Test build #76525 has finished for PR 17150 at commit
Test build #77124 has finished for PR 17150 at commit
Test build #77145 has finished for PR 17150 at commit
Test build #77251 has finished for PR 17150 at commit
Test build #77260 has finished for PR 17150 at commit
Test build #77265 has finished for PR 17150 at commit
Test build #3752 has started for PR 17150 at commit
Test build #3754 has finished for PR 17150 at commit
Test build #3756 has finished for PR 17150 at commit
gently ping @srowen
@jiangxb1987 this can't be merged until 2.2 is released. It is open on purpose.
I just noticed that Spark 2.2.0 was released yesterday. Is this the right time to move on with this PR?
dongjoon-hyun
left a comment
+1, LGTM, too.
I guess we can also remove thread safety tests at the bottom of ...
def accumulableCollection[R <% Growable[T] with TraversableOnce[T] with Serializable: ClassTag, T]
    (initialValue: R): Accumulable[R, T] = {
  // TODO the context bound (<%) above should be replaced with simple type bound and implicit
  // conversion but is a breaking change. This should be fixed in Spark 3.x.
so context bound is not recommended in scala?
It generates a warning in scala 2.11, yeah
val validUdfs = udfs.filter { udf =>
  // Check to make sure that the UDF can be evaluated with only the input of this child.
  udf.references.subsetOf(child.outputSet)
}.toArray // Turn it into an array since iterators cannot be serialized in Scala 2.10
can we remove the toArray?
Probably, but I stayed conservative here. At least the comment is obsolete
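(If that follow-up happens, a hypothetical sketch of the simplification, with made-up types rather than Spark's: once the Scala 2.10 iterator-serialization workaround is gone, the filter result could be used directly.)

```scala
// Illustrative only: Udf and its field are stand-ins, not Spark's actual classes.
// The point is that without the Scala 2.10 workaround the .toArray call may no
// longer be needed, and the filtered collection can be used as-is.
final case class Udf(references: Set[String])

def validUdfs(udfs: Seq[Udf], childOutput: Set[String]): Seq[Udf] =
  udfs.filter(udf => udf.references.subsetOf(childOutput))
```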
LGTM, merging to master! We can address remaining comments in a follow-up PR. Thanks for working on it!
OK, I think we could have just made the suggested changes here too, but I suspect we will find a thing or two we have to fix up along the way anyway.
BTW don't forget this comment: #17150 (comment). I went ahead and merged this because the Scala 2.10 build was broken :)
BTW @cloud-fan @JoshRosen can I delete the master 2.10 compile Jenkins jobs? They're going to fail from here on out. (Not sure I have permissions to do it though.)
@srowen, I'll disable those master branch 2.10 jobs and will update scripts to prevent them from being redeployed by accident.
@srowen @JoshRosen After thinking more about it, will this PR make it harder to backport bug fix PRs to the previous branch? I did a little search and it seems we broke Scala 2.10 fairly frequently before: https://github.com/apache/spark/pulls?utf8=%E2%9C%93&q=is%3Apr%20is%3Aclosed%202.10%20in%3Atitle
Do you mean, it'll be harder to know if a back-port broke the 2.10 build? Yes. I don't think we have a choice though. We would detect the breakage eventually, and backports become rare over time.
Maybe do it a bit later, when the backport rate drops? E.g. it's unlikely we'll still do a lot of backports once 2.3 is cut.
Do what later?
Do the removal (i.e. this PR).
This is already merged though. This doesn't seem like a reason to revert -- there's not even an actual problem here. It's the same potential problem we've always had, that 2.10 breakage might only be caught by the Jenkins job. 2.10 breaks were not really common, all the less so for back-ports. I pledge to watch the compile jobs, if that helps. But we really need to get on to 2.12 support now.
Are you working on 2.12?
I do have a local branch with, I think, most of the key changes. I do want to unblock that work. But that's a little separate. Really I'm just narrowly arguing here that this change doesn't make it meaningfully harder to keep 2.10 working in older branches. It's no harder than it was previously, and I'm happy to keep an eye on it if anyone's concerned.
…e Scala 2.10
## What changes were proposed in this pull request?
Follow up to a few comments on #17150 (comment) that couldn't be addressed before it was merged.
## How was this patch tested?
Existing tests.
Author: Sean Owen <[email protected]> Closes #18646 from srowen/SPARK-19810.2.
Hi all, I did not open a JIRA because I am not very sure whether it is an issue; it may be my environment problem. However, I believe it is worth leaving a note. The build with Maven is fine ( … ). Running ./build/sbt clean package and then ./bin/spark-shell, I could reproduce this with SBT on ...
I checked this is fine with Maven on ...
I identified this commit with SBT, testing before/after the commit, on ...
It looks like this one was introduced here. Could anyone double-check this and see if I am mistaken? I just wonder whether the same commands consistently fail in other environments (rather than about the exact failure cause).
Yeah, this change did change some dependencies, and it's possible commons-logging happened to be included before and now isn't, and we need an explicit dependency somewhere. I'll look into it, as it ought to be pretty easy to patch if there's a gap here.
<artifactId>jcl-over-slf4j</artifactId>
<version>${slf4j.version}</version>
<!-- <scope>runtime</scope> --> <!-- more correct, but scalac 2.10.3 doesn't like it -->
<scope>runtime</scope>
@HyukjinKwon it was this change. I think we need to retain it to keep commons-logging (well, the SLF4J shim for it) on the classpath. SBT isn't including it in the build somehow even though it's runtime scope, which is really correct. I think we've seen this with the Maven-SBT plugin before.
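(In SBT terms, the difference boils down to the sketch below. Spark's SBT build actually reads the Maven POMs rather than declaring dependencies like this, and slf4jVersion is a placeholder, so this only illustrates the scope distinction rather than the actual fix.)

```scala
// Hypothetical sbt declarations, for illustration only.
val slf4jVersion = "1.7.16" // placeholder; use the version from Spark's parent pom

// Runtime scope: arguably the "more correct" scope for a logging bridge, but the
// artifact then appears only on the runtime classpath.
libraryDependencies += "org.slf4j" % "jcl-over-slf4j" % slf4jVersion % "runtime"

// Compile scope (the default): the bridge is also on the compile classpath, so
// commons-logging dependents such as Hadoop can resolve org.apache.commons.logging
// classes when spark-shell is launched from an SBT-built tree.
libraryDependencies += "org.slf4j" % "jcl-over-slf4j" % slf4jVersion
```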
… compile scope for SBT build
## What changes were proposed in this pull request?
jcl-over-slf4j dependency needs to be compile scope for SBT build, to make it available for commons-logging dependents like Hadoop. apache#17150 (comment) https://github.com/apache/spark/pull/17150/files#r128728089
## How was this patch tested?
Manual tests
Author: Sean Owen <[email protected]> Closes apache#18703 from srowen/SPARK-19810.2.
What changes were proposed in this pull request?
How was this patch tested?
Existing tests