[SPARK-6246] [ec2] fixed support for more than 100 nodes #6267
Conversation
Can one of the admins verify this patch?

ok to test

Merged build triggered.

Merged build started.

Test build #33097 has started for PR 6267 at commit

Test build #33097 has finished for PR 6267 at commit

Merged build finished. Test PASSed.

Test PASSed.

LGTM. Merging this. Thanks @alyaxey for the change and thanks @thisisdhaas for helping me review.

@alyaxey I can assign you the JIRA for credit but I don't know your JIRA name.

@srowen Yes, assign me please. My JIRA username is the same: alyaxey (Alex Slusarenko).
This is a small fix, but it is important for Amazon EC2 users because, as the ticket states, "spark-ec2 can't handle clusters with > 100 nodes" at present.

Author: alyaxey <[email protected]>

Closes apache#6267 from alyaxey/ec2_100_nodes_fix and squashes the following commits:

1e0d747 [alyaxey] [SPARK-6246] fixed support for more than 100 nodes
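The diff itself is not reproduced on this page, but the failure mode described in SPARK-6246 stems from EC2 rejecting status queries that name too many instance IDs in a single call (the DescribeInstanceStatus API accepts at most 100 IDs per request). A minimal sketch of the batching pattern such a fix needs is below; `describe_instance_status` is a hypothetical stand-in for the real boto call (`conn.get_all_instance_status` in spark_ec2.py), and the helper name is ours, not the PR's:

```python
def batched_instance_status(describe_instance_status, instance_ids, max_batch=100):
    """Collect statuses for all instance IDs, at most max_batch IDs per API call.

    describe_instance_status: callable taking a list of IDs and returning
    a list of status records (stands in for boto's get_all_instance_status).
    """
    statuses = []
    # Slice the ID list into chunks of max_batch so no single API call
    # exceeds EC2's per-request limit on instance IDs.
    for i in range(0, len(instance_ids), max_batch):
        statuses.extend(describe_instance_status(instance_ids[i:i + max_batch]))
    return statuses
```

With 250 nodes this issues three calls (100, 100, 50 IDs) instead of one oversized request, which is why clusters beyond 100 nodes stop failing.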