Conversation

@farhan5900
Contributor

@farhan5900 farhan5900 commented Aug 21, 2020

What changes were proposed in this pull request?

The PR checks whether the `--py-files` value is empty and appends the option only when the value is non-empty.

Why are the changes needed?

There is a bug in the Mesos cluster mode REST Submission API: it appends the `--py-files` option even when the user has not set any value for the conf `spark.submit.pyFiles`.

Does this PR introduce any user-facing change?

No

How was this patch tested?

  • Submitting an application to a Mesos cluster:
    curl -X POST http://localhost:7077/v1/submissions/create --header "Content-Type:application/json" --data '{
      "action": "CreateSubmissionRequest",
      "appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
      "clientSparkVersion": "3.0.0",
      "appArgs": ["30"],
      "environmentVariables": {},
      "mainClass": "org.apache.spark.examples.SparkPi",
      "sparkProperties": {
        "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
        "spark.driver.supervise": "false",
        "spark.executor.memory": "512m",
        "spark.driver.memory": "512m",
        "spark.submit.deployMode": "cluster",
        "spark.app.name": "SparkPi",
        "spark.master": "mesos://localhost:5050"
      }}'
  • The submission should pick up the correct main class and run the job successfully.
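The shape of the fix is a simple guard: only append `--py-files` when the user actually supplied a value. Here is a minimal sketch of that guard, written in Python for illustration (the real change is in Spark's Scala Mesos REST server; the function name here is hypothetical):

```python
def build_py_files_options(spark_properties):
    """Return the driver-command options contributed by spark.submit.pyFiles.

    Sketch of the guard this PR adds: before the fix, --py-files was
    appended unconditionally, producing a flag with an empty value when
    the conf was unset. After the fix, the option is emitted only when
    the configured value is non-empty.
    """
    py_files = spark_properties.get("spark.submit.pyFiles", "")
    return ["--py-files", py_files] if py_files else []
```

With a submission payload like the one above, which sets no `spark.submit.pyFiles`, no `--py-files` option is emitted, so `spark-submit` no longer receives the flag without a value and picks up the correct main class.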

@HyukjinKwon
Member

ok to test

@HyukjinKwon
Member

cc @tnachen

@SparkQA

SparkQA commented Aug 21, 2020

Test build #127717 has finished for PR 29499 at commit 33142bb.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dongjoon-hyun dongjoon-hyun left a comment
Member

+1, LGTM. Merged to master for Apache Spark 3.1.0, due in December 2020.
Thank you for your first contribution, @farhan5900 .

@dongjoon-hyun
Member

Welcome to the Apache Spark community. You are added to the Apache Spark contributor group and SPARK-32675 is assigned to you, @farhan5900 .

alexeygorobets pushed a commit to d2iq-archive/spark that referenced this pull request Oct 13, 2020
…lue for it

### What changes were proposed in this pull request?
The PR checks whether the `--py-files` value is empty and appends the option only when the value is non-empty.

### Why are the changes needed?
There is a bug in the Mesos cluster mode REST Submission API: it appends the `--py-files` option even when the user has not set any value for the conf `spark.submit.pyFiles`.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
* Submitting an application to a Mesos cluster:
`curl -X POST http://localhost:7077/v1/submissions/create --header "Content-Type:application/json" --data '{
"action": "CreateSubmissionRequest",
"appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
"clientSparkVersion": "3.0.0",
"appArgs": ["30"],
"environmentVariables": {},
"mainClass": "org.apache.spark.examples.SparkPi",
"sparkProperties": {
  "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
  "spark.driver.supervise": "false",
  "spark.executor.memory": "512m",
  "spark.driver.memory": "512m",
  "spark.submit.deployMode": "cluster",
  "spark.app.name": "SparkPi",
  "spark.master": "mesos://localhost:5050"
}}'`
* It should be able to pick the correct class and run the job successfully.

Closes apache#29499 from farhan5900/SPARK-32675.

Authored-by: farhan5900 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit 8749f2e)
farhan5900 added a commit to d2iq-archive/spark that referenced this pull request Oct 16, 2020
…lue for it

(Same commit message as the cherry-pick above.)

Authored-by: farhan5900 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
kaiwalyajoshi pushed a commit to d2iq-archive/spark that referenced this pull request Oct 23, 2020
* [DCOS-54813] Base tech update from 2.4.0 to 2.4.3 (#62)

* [DCOS-52207][Spark] Make Mesos Agent Blacklisting behavior configurable and more tolerant of failures. (#63)

* [DCOS-58386] Node draining support for supervised drivers; Mesos Java bump to 1.9.0 (#65)

* [DCOS-58389] Role propagation and enforcement support for Mesos Dispatcher (#66)

* DCOS-57560: Suppress/revive support for Mesos (#67)

* D2IQ-64778: Unregister progress listeners before stopping executor threads.

* [SPARK-32675][MESOS] --py-files option is appended without passing value for it

(Same commit message as the cherry-pick above; cherry picked from commit 8749f2e.)

* Applied all the remaining custom commits

* Removes duplicate org.scala-lang.modules dependency

* Removes duplicate gpus method

* Adds second parameter of deprecated annotation

* Fix typemismatch

* Removes extra shellEscape function calls

* Adds env variable for file based auth

* Remove generic shell escape and put it in specific places

* fix ambiguous import

* fix shell escaping for `--conf` options

* fix option type in test

* Fixes scalastyle and unit test issues

Co-authored-by: Alexander Lembiewski <[email protected]>
Co-authored-by: Farhan <[email protected]>
Co-authored-by: Anton Kirillov <[email protected]>
Co-authored-by: Anton Kirillov <[email protected]>
Co-authored-by: Roman Palaznik <[email protected]>
Co-authored-by: Roman Palaznik <[email protected]>
4 participants