This patch cements our deprecation of the SPARK_MEM environment variable
by replacing it with case-specific variables:
SPARK_DAEMON_MEMORY, SPARK_EXECUTOR_MEMORY, and SPARK_DRIVER_MEMORY
The creation of the latter two variables means that we can now safely
set driver/job memory without accidentally setting the executor memory.
Neither of the two new variables is public.
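For illustration, a minimal sketch of the kind of fallback spark-class can
now apply; the 512m default and the exact plumbing are assumptions, not
taken from this patch:

```bash
# Hedged sketch: spark-class sizes the JVM it launches from the new
# driver-specific variable rather than the deprecated SPARK_MEM.
SPARK_DRIVER_MEMORY=${SPARK_DRIVER_MEMORY:-512m}   # default is an assumption
JAVA_OPTS="$JAVA_OPTS -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY"
```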
SPARK_EXECUTOR_MEMORY is only used by the Mesos scheduler (and is set
within SparkContext). The proper way to configure executor memory
is through the "spark.executor.memory" property.
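As a hedged sketch of that property-based route (only the property name is
from this patch; passing it as a -D option through SPARK_JAVA_OPTS, and the
idea that such -D options survive spark-class, are assumptions about the
scripts of this era):

```bash
# Illustrative only: size executors via the public property, not an env var,
# by handing the system property to the JVM that creates the SparkContext.
SPARK_JAVA_OPTS="-Dspark.executor.memory=2g" ./bin/spark-shell
```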
SPARK_DRIVER_MEMORY is the new way to specify the amount of memory used
by the driver in jobs launched via spark-class, without any risk of
affecting executor memory.
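For example (the path and the main class are purely illustrative):

```bash
# Hedged sketch: give the driver 4g without touching executor sizing.
SPARK_DRIVER_MEMORY=4g ./bin/spark-class org.apache.spark.repl.Main
```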
Other memory considerations:
- The REPL's memory can be set through the "--drivermem" command-line option,
which really just sets SPARK_DRIVER_MEMORY (see the sketch after this list).
- run-example doesn't use spark-class, so the only way to modify examples'
memory is actually an unusual use of SPARK_JAVA_OPTS (which is normally
overridden in all cases by spark-class); again, see the sketch after this list.
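Hedged sketches of both cases above (paths and the example class are
illustrative):

```bash
# 1. The REPL: --drivermem just exports SPARK_DRIVER_MEMORY before
#    delegating to spark-class.
./bin/spark-shell --drivermem 2g

# 2. run-example bypasses spark-class, so heap flags can ride along in
#    SPARK_JAVA_OPTS (the unusual use noted above).
SPARK_JAVA_OPTS="-Xmx1g" ./bin/run-example org.apache.spark.examples.SparkPi local
```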
This patch also fixes a lurking bug where spark-shell misused spark-class:
the first argument is supposed to be the main class name, not Java options.