11 changes: 10 additions & 1 deletion core/src/main/scala/org/apache/spark/deploy/RRunner.scala
@@ -40,7 +40,16 @@ object RRunner {

// Time to wait for SparkR backend to initialize in seconds
val backendTimeout = sys.env.getOrElse("SPARKR_BACKEND_TIMEOUT", "120").toInt
-  val rCommand = "Rscript"
+  val rCommand = {
+    // "spark.sparkr.r.command" is deprecated and replaced by "spark.r.command",
+    // but kept here for backward compatibility.
+    var cmd = sys.props.getOrElse("spark.sparkr.r.command", "Rscript")
+    cmd = sys.props.getOrElse("spark.r.command", cmd)
+    if (sys.props.getOrElse("spark.submit.deployMode", "client") == "client") {
+      cmd = sys.props.getOrElse("spark.r.driver.command", cmd)
+    }
+    cmd
+  }

// Check if the file path exists.
// If not, change directory to current working directory for YARN cluster mode
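The hunk above resolves the R executable by precedence: the deprecated `spark.sparkr.r.command` is the weakest setting, `spark.r.command` overrides it, and `spark.r.driver.command` overrides both, but only in client deploy mode. A minimal standalone sketch of that precedence logic, factored into a pure function over an explicit property map instead of `sys.props` (the `RCommandResolution` object and `resolveRCommand` name are illustrative, not part of Spark's API):

```scala
// Illustrative sketch of the precedence implemented in RRunner.
// Later lookups win; spark.r.driver.command applies only in client mode.
object RCommandResolution {
  def resolveRCommand(props: Map[String, String]): String = {
    // "spark.sparkr.r.command" is deprecated in favor of "spark.r.command",
    // but still honored as the lowest-priority setting.
    var cmd = props.getOrElse("spark.sparkr.r.command", "Rscript")
    cmd = props.getOrElse("spark.r.command", cmd)
    if (props.getOrElse("spark.submit.deployMode", "client") == "client") {
      cmd = props.getOrElse("spark.r.driver.command", cmd)
    }
    cmd
  }

  def main(args: Array[String]): Unit = {
    // No properties set: falls back to the default "Rscript".
    println(resolveRCommand(Map.empty))
    // spark.r.command overrides the default everywhere.
    println(resolveRCommand(Map("spark.r.command" -> "/opt/R/bin/Rscript")))
    // In cluster mode the driver-only override is ignored.
    println(resolveRCommand(Map(
      "spark.r.command" -> "/opt/R/bin/Rscript",
      "spark.r.driver.command" -> "/usr/local/bin/Rscript",
      "spark.submit.deployMode" -> "cluster")))
  }
}
```

Keeping the driver override conditional on client mode means a cluster-mode driver, which runs on a worker host, uses the same `spark.r.command` path as the executors.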
18 changes: 18 additions & 0 deletions docs/configuration.md
@@ -1592,6 +1592,20 @@ Apart from these, the following properties are also available, and may be useful
Number of threads used by RBackend to handle RPC calls from SparkR package.
</td>
</tr>
<tr>
<td><code>spark.r.command</code></td>
<td>Rscript</td>
<td>
    Executable for executing R scripts in cluster mode, for both the driver and the workers.
</td>
</tr>
<tr>
<td><code>spark.r.driver.command</code></td>
<td>spark.r.command</td>
<td>
    Executable for executing R scripts for the driver in client mode. Ignored in cluster mode.
</td>
</tr>
</table>

#### Cluster Managers
@@ -1631,6 +1645,10 @@ The following variables can be set in `spark-env.sh`:
<td><code>PYSPARK_DRIVER_PYTHON</code></td>
<td>Python binary executable to use for PySpark in driver only (default is <code>PYSPARK_PYTHON</code>).</td>
</tr>
<tr>
<td><code>SPARKR_DRIVER_R</code></td>
<td>R binary executable to use for SparkR shell (default is <code>R</code>).</td>
</tr>
<tr>
<td><code>SPARK_LOCAL_IP</code></td>
<td>IP address of the machine to bind to.</td>