
Conversation

@alyaxey commented May 19, 2015

This is a small fix, but it is important for Amazon users because, as the ticket states, spark-ec2 currently "can't handle clusters with > 100 nodes".
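
For context, the limit most likely stems from the EC2 DescribeInstanceStatus API, which accepts at most 100 instance IDs per request. Below is a minimal, hypothetical sketch of the batching approach such a fix takes, assuming boto's EC2Connection.get_all_instance_status; the helper name and variables are illustrative and not the exact code from commit 1e0d747.

    # Hypothetical sketch: query instance statuses in batches of at most 100,
    # since the EC2 DescribeInstanceStatus call rejects requests that list
    # more than 100 instance IDs. conn is a boto EC2Connection and
    # cluster_instances is a list of boto instance objects (assumed names).
    def get_cluster_statuses(conn, cluster_instances, max_batch=100):
        statuses = []
        for i in range(0, len(cluster_instances), max_batch):
            batch_ids = [inst.id for inst in cluster_instances[i:i + max_batch]]
            statuses.extend(conn.get_all_instance_status(instance_ids=batch_ids))
        return statuses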

@AmplabJenkins

Can one of the admins verify this patch?

@srowen (Member) commented May 19, 2015

ok to test

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@SparkQA commented May 19, 2015

Test build #33097 has started for PR 6267 at commit 1e0d747.

@shivaram (Contributor)

Thanks @alyaxey -- Functionality looks good to me.

cc @nchammas for python style

@SparkQA commented May 19, 2015

Test build #33097 has finished for PR 6267 at commit 1e0d747.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Merged build finished. Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/33097/

@shivaram (Contributor)

LGTM. Merging this. Thanks @alyaxey for the change, and thanks @thisisdhaas for helping me review.

@asfgit closed this in 2bc5e06 on May 19, 2015
@srowen (Member) commented May 21, 2015

@alyaxey I can assign you the JIRA for credit but I don't know your JIRA name.

@alyaxey (Author) commented May 22, 2015

@srowen Yes, please assign it to me. My JIRA username is the same: alyaxey (Alex Slusarenko).

jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request May 28, 2015
This is a small fix. But it is important for amazon users because as the ticket states, "spark-ec2 can't handle clusters with > 100 nodes" now.

Author: alyaxey <[email protected]>

Closes apache#6267 from alyaxey/ec2_100_nodes_fix and squashes the following commits:

1e0d747 [alyaxey] [SPARK-6246] fixed support for more than 100 nodes
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request Jun 12, 2015
nemccarthy pushed a commit to nemccarthy/spark that referenced this pull request Jun 19, 2015
