@s-urbaniak

What changes were proposed in this pull request?

Previously, the Mesos framework webui URL was derived solely from the Spark UI address, leaving no way to configure it. This commit makes it configurable; if unset, it falls back to the previous behavior.

Motivation:
This change is needed to be able to install Spark on DCOS and to give it a custom service link. The configured webui_url points to a reverse proxy in the DCOS environment.

How was this patch tested?

Locally with unit tests, and on the DCOS testing and stable revisions.
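The fallback described above can be sketched as follows. This is a minimal illustration, not the actual Spark code: the property key is the one from this PR, but the object and helper names are hypothetical.

```scala
// Illustrative sketch: prefer an explicitly configured dispatcher webui URL,
// otherwise fall back to the Spark UI address (the previous behavior).
object WebUiUrlSketch {
  def frameworkWebUiUrl(conf: Map[String, String], sparkUiAddress: String): String =
    conf.getOrElse("spark.mesos.dispatcher.webui.url", sparkUiAddress)
}
```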


Why expose both --webui-url and spark.mesos.dispatcher.webui.url? Isn't one sufficient?

Author

We could take only spark.mesos.dispatcher.webui.url, sure. I thought having both was idiomatic in this context.

Contributor

The code here is a bit strange: it forces you to pass the master string (for example with the --master flag), even though you can also define the master URL in a file (as spark.master) with the --properties-file option.

Check here: main args are parsed first, and then any properties coming from a file: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/mesos/MesosClusterDispatcherArguments.scala#L33-L35

In the latter case we don't really need the --master flag, yet it will fail if you don't provide it. The restriction is not there for other parameters like --zk and spark.mesos.deploy.zookeeper.url.

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/mesos/MesosClusterDispatcher.scala#L98-L103
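The ordering issue above can be sketched like this (a hypothetical simplification, not the actual MesosClusterDispatcherArguments code): if --master is validated while CLI flags are parsed, a spark.master value supplied only in the properties file arrives too late, even though the file could in principle provide it.

```scala
// Hypothetical sketch: CLI flags are read first, then the properties file
// fills in anything still unset. Fail-fast validation of --master during
// the first step means spark.master from the file cannot save you.
object ParseOrderSketch {
  def resolveMaster(cliMaster: Option[String], fileProps: Map[String, String]): Option[String] =
    cliMaster.orElse(fileProps.get("spark.master"))
}
```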


There are a trillion ways to set configuration in Spark, half of them are deprecated, and some only work with certain entrypoint classes.

Some entrypoints, like SparkSubmit, support setting arbitrary properties on the command line with something like --conf. MesosClusterDispatcher does not. So I say we just rely on --properties-file, rather than adding a command argument just for this one property. It's already arbitrary which properties are exposed as special command parameters, so let's not add a new one.
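The --properties-file route could look like this. The file name and URL below are made up for illustration; only the property key comes from this PR.

```properties
# dispatcher.properties (hypothetical file name)
spark.mesos.dispatcher.webui.url  http://proxy.example.com/spark-dispatcher
```

The dispatcher would then be started with --properties-file dispatcher.properties instead of a dedicated command-line flag.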

Author

Agreed. I removed the config option and will go the --properties-file route.

@andrewor14
Contributor

ok to test

@SparkQA

SparkQA commented Feb 26, 2016

Test build #52011 has finished for PR 11369 at commit e6dedd5.

  • This patch fails Scala style tests.
  • This patch does not merge cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 26, 2016

Test build #52050 has finished for PR 11369 at commit 586d6de.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 26, 2016

Test build #52054 has finished for PR 11369 at commit e105b1c.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Contributor

Worth mentioning that properties coming from a driver submission request are propagated from here:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala#L87-L97

Also, there are properties coming from MesosClusterScheduler's conf (initialized from MesosClusterDispatcher's conf; see https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/mesos/MesosClusterDispatcher.scala#L62):

e.g. spark.mesos.uris at
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterScheduler.scala#L407

Both of these sets of properties seem to merge to form the driver's final set of Spark properties. IMHO these should be isolated, or at least clarified in the docs; the concept seems complicated.
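The merging described above might amount to a plain map union, sketched below. This is a hedged illustration only; the real precedence lives in MesosClusterScheduler and MesosRestServer and may differ.

```scala
// Illustrative sketch: scheduler-side conf merged with per-submission
// properties; with ++, entries from the right operand win on key collisions.
object MergeSketch {
  def mergeDriverProps(schedulerConf: Map[String, String],
                       submission: Map[String, String]): Map[String, String] =
    schedulerConf ++ submission
}
```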


Yes, it is complicated and opaque. A JIRA for this problem is welcome.

@tnachen
Contributor

tnachen commented Feb 26, 2016

@s-urbaniak You need to fix the Scala style error:
/home/jenkins/workspace/SparkPullRequestBuilder/core/src/test/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackendSuite.scala:229: File line length exceeds 100 characters

@SparkQA

SparkQA commented Feb 26, 2016

Test build #52067 has finished for PR 11369 at commit 3acdf74.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 29, 2016

Test build #52181 has finished for PR 11369 at commit 42f46dd.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 29, 2016

Test build #52184 has finished for PR 11369 at commit fa8570d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Contributor

What was the problem with the /?

Contributor

After discussing with @s-urbaniak directly, I see it's a bug where the link doesn't work when an application proxy is set.

@tnachen
Contributor

tnachen commented Mar 1, 2016

@s-urbaniak I think we should keep the existing -p since users might actually depend on it. Otherwise this patch LGTM. @dragos you want to take a look?

@dragos
Contributor

dragos commented Mar 3, 2016

LGTM.

@tnachen
Contributor

tnachen commented Mar 7, 2016

@andrewor14 this looks good to me and to @dragos; can you take a look?

Sergiusz Urbaniak added 5 commits March 9, 2016 09:33
@SparkQA

SparkQA commented Mar 9, 2016

Test build #52742 has finished for PR 11369 at commit 46ea0ab.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@andrewor14
Contributor

Merging into master.

@asfgit asfgit closed this in a4a0add Mar 10, 2016
sttts pushed a commit to d2iq-archive/spark that referenced this pull request Mar 15, 2016
sttts pushed a commit to d2iq-archive/spark that referenced this pull request Mar 15, 2016
roygao94 pushed a commit to roygao94/spark that referenced this pull request Mar 22, 2016
mgummelt pushed a commit to d2iq-archive/spark that referenced this pull request Aug 2, 2016
mgummelt pushed a commit to d2iq-archive/spark that referenced this pull request Aug 2, 2016
@mgummelt mgummelt deleted the sur-webui-url branch August 2, 2016 21:01
mgummelt pushed a commit to d2iq-archive/spark that referenced this pull request Mar 7, 2017