{% highlight scala %}
val conf = new SparkConf()
val sc = new SparkContext(conf)
{% endhighlight %}

## Dynamically Loading Spark Properties

In some cases, you may want to avoid hard-coding certain configurations in a `SparkConf`. For
instance, you might want to run the same application with different masters or with different
amounts of memory.
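
As a minimal sketch, the application code can leave those settings out entirely and supply
them at launch time (the app name below is illustrative):

{% highlight scala %}
// No master or memory setting is hard-coded here; both can be supplied
// later through spark-defaults.conf or spark-submit flags.
val conf = new SparkConf().setAppName("My app")
val sc = new SparkContext(conf)
{% endhighlight %}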

The Spark shell and [`spark-submit`](cluster-overview.html#launching-applications-with-spark-submit) tool support two ways to load configurations dynamically.
When a SparkConf is created, it will read configuration options from `conf/spark-defaults.conf`,
in which each line consists of a key and a value separated by whitespace. For example,
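a `spark-defaults.conf` file might contain entries like these (the property names are
standard Spark settings; the values are purely illustrative):

{% highlight text %}
spark.master            spark://5.6.7.8:7077
spark.executor.memory   512m
{% endhighlight %}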

In addition, when launching programs with the [`spark-submit`](cluster-overview.html#launching-applications-with-spark-submit) tool, certain options can be configured as flags. For instance, the
`--master` flag to `spark-submit` will automatically set the master. Run `./bin/spark-submit --help` to see the entire list of options.
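
A hedged sketch of such a launch command (`--class`, `--master`, and `--executor-memory` are
real `spark-submit` flags; `MyApp` and `my-app.jar` are placeholder names):

{% highlight bash %}
# Configure the master and executor memory via flags instead of in code.
./bin/spark-submit \
  --class MyApp \
  --master spark://5.6.7.8:7077 \
  --executor-memory 1g \
  my-app.jar
{% endhighlight %}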

Any values specified as flags or in the properties file will be passed on to the application
and merged with those specified through SparkConf. Properties set directly on the SparkConf
take highest precedence, then flags passed to `spark-submit` or `spark-shell`, then options
in the `spark-defaults.conf` file.
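
As an illustrative sketch of this precedence (all values below are hypothetical):

{% highlight scala %}
// Suppose spark-defaults.conf contains:  spark.executor.memory 512m
// and spark-submit was launched with:    --executor-memory 1g
// The value set directly on the SparkConf wins:
val conf = new SparkConf().set("spark.executor.memory", "2g")
val sc = new SparkContext(conf)  // executors run with 2g of memory
{% endhighlight %}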