This repository was archived by the owner on Jan 9, 2020. It is now read-only.

Conversation

@lins05 commented May 18, 2017

Closes #276.

How is the patch tested?

Manually added logical errors to the SparkPi example and checked the output:

2017-05-18 15:33:11 INFO  Client:54 - Application spark-pi-1495121556913 failed with exit code 1. You may want to check the driver pod logs.

private def status: String = pod.map(_.getStatus().getContainerStatuses().toString())
  .getOrElse("unknown")

private var driverPodExitCode: Int = 0
Make this an Option[Int] and only set it when we get an exit code. Otherwise, if something goes wrong, clients of LoggingPodStatusWatcher might think the exit code is 0 when it isn't.
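The suggestion above could look like the following standalone sketch. Note that `Status` here is a hypothetical stand-in type for illustration, not the fabric8 Kubernetes client API, and `LoggingPodStatusWatcherSketch` is not the actual class from the PR:

```scala
// Minimal sketch of the Option[Int] suggestion above.
// Status is a stand-in case class, not the Kubernetes client API.
case class Status(terminated: Boolean, exitCode: Int)

class LoggingPodStatusWatcherSketch {
  // None until the container actually terminates, so callers cannot
  // mistake "no exit code yet" for a successful exit code of 0.
  private var driverPodExitCode: Option[Int] = None

  def onStatusUpdate(status: Status): Unit = {
    if (status.terminated) {
      driverPodExitCode = Some(status.exitCode)
    }
  }

  def exitCode: Option[Int] = driverPodExitCode
}
```

With this shape, a caller sees `None` while the pod is still running and `Some(1)` after a failed termination, rather than a misleading default of 0.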

}.mkString("")
}

def getDriverPodExitCode: Int = {

Make this private.

@mccheah commented May 18, 2017

We shouldn't be making changes to V1 submission. Any changes should be done to V2 submission instead.

Incidentally, this reminds me that V2 submission currently always runs in fire-and-forget mode; continuous monitoring mode needs to be redone there.

@ash211 commented May 18, 2017

@mccheah can we make changes to both? I don't think V2 is ready for people to move over to yet, though it is getting closer.

@mccheah commented May 18, 2017

I think the logging pod status is the last feature that isn't in V2 but is in V1. It would be good to add that to V2 and then remove the V1 code path entirely.

@ash211 commented May 18, 2017

@mccheah Before deleting V1, I'd ideally like confirmation from someone other than us that they've run V2 and it worked.

@mccheah commented May 18, 2017

Except that would require us, or someone building a custom version of Spark, to actually change the code path itself. SparkSubmit is currently coded to use the V1 main class, but we would need it to switch to the V2 main class. And we don't want to make the code path configurable.

@mccheah commented May 19, 2017

@lins05 @ash211 I essentially rebuilt this, keeping in mind that we need the logging watcher in V2 in general, and that when we add it, we'll want to include this exit-code logic in V2 as well. See #283.

@erikerlandson (Member) commented

@ash211 @mccheah What is involved in building a V2 version? Is it a matter of checking out the right branch, compiling, and spinning corresponding images?

@mccheah commented May 19, 2017

One has to change this line in SparkSubmit to point to the V2 submission client instead. Then, the driver image must correspond to what we have in this Dockerfile. Finally, all of the parameters for submission need to correspond to what's used in V2, which is currently largely undocumented. When we transition to V2 we will document everything that is required.

@mccheah commented May 19, 2017

I created #285 to discuss the actual transition. I'll work on a PR that switches the code paths and the documentation.

@ash211 commented May 22, 2017

Included in #283

@ash211 ash211 closed this May 22, 2017
ifilonenko pushed a commit to ifilonenko/spark that referenced this pull request Feb 26, 2019