[SPARK-32675][MESOS] --py-files option is appended without passing value for it #29499
Conversation
ok to test

cc @tnachen

Test build #127717 has finished for PR 29499 at commit
dongjoon-hyun left a comment
+1, LGTM. Merged to master for Apache Spark 3.1.0 in December 2020.
Thank you for your first contribution, @farhan5900.
Welcome to the Apache Spark community. You have been added to the Apache Spark contributor group, and SPARK-32675 is assigned to you, @farhan5900.
[SPARK-32675][MESOS] --py-files option is appended without passing value for it
### What changes were proposed in this pull request?
This PR checks whether the `--py-files` value is empty and passes the option only when a non-empty value is present.
### Why are the changes needed?
There is a bug in the Mesos cluster mode REST Submission API: it appends the `--py-files` option even when the user has not set any value for the `spark.submit.pyFiles` conf.
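The shape of the fix can be sketched as follows. This is a minimal illustration rather than the actual Spark patch; `buildPyFilesOptions` and its signature are hypothetical, but the guard mirrors the behavior described above: emit `--py-files` only when `spark.submit.pyFiles` has a non-empty value.

```scala
// Hypothetical helper (not real Spark code) illustrating the guard:
// the dispatcher should skip --py-files entirely when the conf is
// absent or empty, instead of emitting a dangling option.
object PyFilesGuard {
  def buildPyFilesOptions(sparkProperties: Map[String, String]): Seq[String] = {
    sparkProperties.get("spark.submit.pyFiles") match {
      case Some(pyFiles) if pyFiles.nonEmpty => Seq("--py-files", pyFiles)
      case _                                 => Seq.empty // omit the option
    }
  }
}
```

With this guard, a submission that does not set `spark.submit.pyFiles` produces no `--py-files` token at all, which is the user-visible effect of the fix.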
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
* Submitting an application to a Mesos cluster:
  ```shell
  curl -X POST http://localhost:7077/v1/submissions/create \
    --header "Content-Type:application/json" \
    --data '{
      "action": "CreateSubmissionRequest",
      "appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
      "clientSparkVersion": "3.0.0",
      "appArgs": ["30"],
      "environmentVariables": {},
      "mainClass": "org.apache.spark.examples.SparkPi",
      "sparkProperties": {
        "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
        "spark.driver.supervise": "false",
        "spark.executor.memory": "512m",
        "spark.driver.memory": "512m",
        "spark.submit.deployMode": "cluster",
        "spark.app.name": "SparkPi",
        "spark.master": "mesos://localhost:5050"
      }}'
  ```
* Verified that the dispatcher picks the correct main class and the job runs successfully.
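To see why the dangling option broke submissions before this fix: in a typical "flag consumes the next token" command line, a bare `--py-files` silently swallows whatever argument follows it. The parser below is a hypothetical toy, not spark-submit's real argument handling, but it demonstrates the failure mode under that assumption.

```scala
// Toy option parser (hypothetical, for illustration only): every token
// starting with "--" consumes the next token as its value.
object DanglingOption {
  def parse(args: List[String]): Map[String, String] = args match {
    case flag :: value :: rest if flag.startsWith("--") =>
      parse(rest) + (flag -> value)
    case _ => Map.empty
  }
}
```

Under this model, `parse(List("--py-files", "--class", "org.apache.spark.examples.SparkPi"))` binds `--py-files` to the literal string `"--class"`, so the intended main class is lost, which matches the symptom of the submission failing to pick the correct class.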
Closes apache#29499 from farhan5900/SPARK-32675.
Authored-by: farhan5900 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit 8749f2e)
* [DCOS-54813] Base tech update from 2.4.0 to 2.4.3 (#62)
* [DCOS-52207][Spark] Make Mesos Agent Blacklisting behavior configurable and more tolerant of failures. (#63)
* [DCOS-58386] Node draining support for supervised drivers; Mesos Java bump to 1.9.0 (#65)
* [DCOS-58389] Role propagation and enforcement support for Mesos Dispatcher (#66)
* DCOS-57560: Suppress/revive support for Mesos (#67)
* D2IQ-64778: Unregister progress listeners before stopping executor threads.
* [SPARK-32675][MESOS] --py-files option is appended without passing value for it (cherry picked from commit 8749f2e)
* Applied all the remaining custom commits
* Removes duplicate org.scala-lang.modules dependency
* Removes duplicate gpus method
* Adds second parameter of deprecated annotation
* Fix typemismatch
* Removes extra shellEscape function calls
* Adds env variable for file based auth
* Remove generic shell escape and put it in specific places
* fix ambiguous import
* fix shell escaping for `--conf` options
* fix option type in test
* Fixes scalastyle and unit test issues

Co-authored-by: Alexander Lembiewski <[email protected]>
Co-authored-by: Farhan <[email protected]>
Co-authored-by: Anton Kirillov <[email protected]>
Co-authored-by: Anton Kirillov <[email protected]>
Co-authored-by: Roman Palaznik <[email protected]>
Co-authored-by: Roman Palaznik <[email protected]>