Commit e8837f9
Declare the ANSI SQL compliance options as experimental
1 parent d0f9614 commit e8837f9

File tree

1 file changed: +6 −4 lines


docs/sql-ref-ansi-compliance.md

Lines changed: 6 additions & 4 deletions
@@ -19,19 +19,21 @@ license: |
   limitations under the License.
 ---
 
-Spark SQL has two options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
+Since Spark 3.0, Spark SQL introduces two experimental options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
+
 When `spark.sql.ansi.enabled` is set to `true`, Spark SQL follows the standard in basic behaviours (e.g., arithmetic operations, type conversion, and SQL parsing).
 Moreover, Spark SQL has an independent option to control implicit casting behaviours when inserting rows in a table.
 The casting behaviours are defined as store assignment rules in the standard.
-When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules.
+
+When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules. This is a separate configuration because its default value is `ANSI`, while the configuration `spark.sql.ansi.enabled` is disabled by default.
 
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
   <td><code>spark.sql.ansi.enabled</code></td>
   <td>false</td>
   <td>
-    When true, Spark tries to conform to the ANSI SQL specification:
+    (Experimental) When true, Spark tries to conform to the ANSI SQL specification:
     1. Spark will throw a runtime exception if an overflow occurs in any operation on integral/decimal field.
     2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in the SQL parser.
   </td>
@@ -40,7 +42,7 @@ When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with
   <td><code>spark.sql.storeAssignmentPolicy</code></td>
   <td>ANSI</td>
   <td>
-    When inserting a value into a column with different data type, Spark will perform type coercion.
+    (Experimental) When inserting a value into a column with different data type, Spark will perform type coercion.
     Currently, we support 3 policies for the type coercion rules: ANSI, legacy and strict. With ANSI policy,
     Spark performs the type coercion as per ANSI SQL. In practice, the behavior is mostly the same as PostgreSQL.
     It disallows certain unreasonable type conversions such as converting string to int or double to boolean.
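As context for the change, the two options documented in this patch could be toggled in a Spark SQL session roughly like this (a sketch only; property names and defaults are taken from the table above, and the values shown are illustrative):

```sql
-- Experimental: enforce ANSI SQL behaviour for arithmetic operations,
-- type conversion, and SQL parsing (disabled by default).
SET spark.sql.ansi.enabled=true;

-- Experimental: store assignment policy used when inserting into a column
-- of a different type. ANSI is the default; the doc also lists legacy and
-- strict policies.
SET spark.sql.storeAssignmentPolicy=ANSI;
```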

0 commit comments