[SPARK-20428][Core] The REST interface 'v1/submissions/kill/' currently only supports deleting a single driver; let Spark support deleting multiple drivers #17714
Conversation
…ucceeded|failed|unknown]
…remove redundant description.
…, currently only supports delete a single ‘driver’, now let spark support delete some ‘drivers’
@HyukjinKwon Could you help with the code review? Thank you.

I guess I am not used to this code path. Git blame says @tnachen changed this code most recently.

@tnachen Could you help with the code review? Thank you.
```diff
   response: HttpServletResponse): Unit = {
-    val submissionId = parseSubmissionId(request.getPathInfo)
+    val submissionIds = parseSubmissionId(request.getPathInfo)
     val responseMessage = submissionId.map(handleKill).getOrElse {
```
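For context, a minimal self-contained sketch of what the renamed `submissionIds` is meant to enable, assuming the handler splits the raw path segment on commas and reuses the existing one-driver `handleKill`; the `handleKill` stub below is a stand-in, not the real servlet code:

```scala
// Stand-in for the real one-driver kill handler.
def handleKill(submissionId: String): String =
  s"killed $submissionId"

// Split the path segment on commas and kill each driver in turn.
def handleBatchKill(rawPath: String): Seq[String] =
  rawPath.stripPrefix("/")
    .split(",")
    .map(_.trim)
    .filter(_.nonEmpty)
    .toSeq
    .map(handleKill)

// handleBatchKill("/driver-20170421111514-0000,driver-20170421111515-0001")
// -> Seq("killed driver-20170421111514-0000", "killed driver-20170421111515-0001")
```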
I don't think having submission IDs parsed from the request path is a good idea.
I would assume most use cases for batch delete arise when you have a large number of drivers to delete (otherwise you would be fine just deleting a few one by one).
But most URLs are length-limited.
You might be better off creating a new request for this that takes a body.
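To illustrate that suggestion, here is a hypothetical sketch (not code from this PR) of a servlet that reads the submission IDs from the POST body instead of the URL path; the class name, wiring, and ID separator are all assumptions:

```scala
import javax.servlet.http.{HttpServlet, HttpServletRequest, HttpServletResponse}
import scala.io.Source

// Hypothetical batch-kill servlet: the IDs travel in the request body,
// so the request is not subject to URL length limits.
class BatchKillServlet(handleKill: String => String) extends HttpServlet {
  override def doPost(req: HttpServletRequest, resp: HttpServletResponse): Unit = {
    val body = Source.fromInputStream(req.getInputStream, "UTF-8").mkString
    val ids = body.split("[,\\s]+").map(_.trim).filter(_.nonEmpty)
    val results = ids.map(handleKill)
    resp.setContentType("text/plain;charset=utf-8")
    resp.getWriter.write(results.mkString("\n"))
  }
}
```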
@tnachen Batch delete, e.g., http://zdh120:6066/v1/submissions/kill/driver-20170421111514-0000,driver-20170421111515-0001
Single delete, e.g., http://ip:6066/v1/submissions/kill/driver-20170421111514-0000
Hello, this PR has been open for some time now. Please help review the code, thanks.
Unfortunately I'm not a committer, so we need to loop in someone who is to help merge it. @srowen, do you know who's responsible for the general deploy package?
@srowen
@SparkQA please test it.
ok to test
One general question: what do we expect to gain from this improvement? It seems it's just looping over the drivers and sending kill requests?
@jiangxb1987
Let me be explicit: I don't think this improvement is really needed in Spark. As long as it's just looping over the drivers and sending blocking KillDriver messages, we gain nothing in performance this way. If you have a huge cluster with many drivers dying simultaneously (which, in my mind, should be a really extreme case), then it's fine to write a script that calls the REST API from outside of Spark.
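For illustration, the kind of external script meant here could be as small as the following Scala sketch, which loops over driver IDs and POSTs one kill request each; the host, port, and driver IDs are placeholders taken from this PR's description:

```scala
import java.net.{HttpURLConnection, URL}

// Loop over the drivers and send one kill request per driver,
// entirely outside of Spark.
val driverIds = Seq(
  "driver-20170421111514-0000",
  "driver-20170421111515-0001")
for (id <- driverIds) {
  val conn = new URL(s"http://ip:6066/v1/submissions/kill/$id")
    .openConnection().asInstanceOf[HttpURLConnection]
  conn.setRequestMethod("POST")
  println(s"$id -> HTTP ${conn.getResponseCode}")
  conn.disconnect()
}
```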
Really appreciate your contribution! Sorry, based on the comments above, we might need to close this PR, but please submit more PRs in the future. Thanks again!
What changes were proposed in this pull request?
Today, killing a driver means making a POST request to the REST interface:
http://ip:6066/v1/submissions/kill/driver-20170421111514-0000
This currently only supports deleting a single driver. But our large data management platform needs to delete several drivers in one request, because these drivers may be in an abnormal state.
This PR lets Spark delete multiple drivers. With a POST request such as:
http://zdh120:6066/v1/submissions/kill/**driver-20170421111514-0000,driver-20170421111515-0001,driver-20170421111517-0002,driver-20170421111517-0003**
where the submission IDs are separated by commas, all four drivers are deleted in a single request.
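As a hedged usage sketch (the host and driver IDs are the placeholders above, not real endpoints), the batch form can be exercised from Scala with nothing but the JDK:

```scala
import java.net.{HttpURLConnection, URL}
import scala.io.Source

// Join the submission IDs with commas: one POST kills all four drivers.
val ids = Seq(
  "driver-20170421111514-0000",
  "driver-20170421111515-0001",
  "driver-20170421111517-0002",
  "driver-20170421111517-0003").mkString(",")
val conn = new URL(s"http://zdh120:6066/v1/submissions/kill/$ids")
  .openConnection().asInstanceOf[HttpURLConnection]
conn.setRequestMethod("POST")
println(Source.fromInputStream(conn.getInputStream).mkString)
```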
How was this patch tested?
manual tests
Please review http://spark.apache.org/contributing.html before opening a pull request.