[SPARK-30703][SQL][DOCS][FollowUp] Declare the ANSI SQL compliance options as experimental #27590
Conversation
Test build #118455 has finished for PR 27590 at commit
dongjoon-hyun left a comment
Hi, @gengliangwang . Thank you for pinging me.
Do you think the new features/behaviors will revert the existing provided behavior?
The options are experimental. There can be new features/behaviors in future releases.
@dongjoon-hyun I think there can be new features for the
  <td>false</td>
  <td>
- When true, Spark tries to conform to the ANSI SQL specification:
+ (Experimental) When true, Spark tries to conform to the ANSI SQL specification:
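For readers unfamiliar with what this flag governs, the integral-overflow behavior can be sketched in plain Scala. This is an illustration of the semantics only, not Spark's implementation; `addInts` is a hypothetical helper:

```scala
// Illustration only: the kind of integral-overflow handling that
// spark.sql.ansi.enabled toggles. `addInts` is a hypothetical helper,
// not part of Spark.
def addInts(a: Int, b: Int, ansiEnabled: Boolean): Int =
  if (ansiEnabled) Math.addExact(a, b) // ANSI mode: throw ArithmeticException on overflow
  else a + b                           // default mode: silently wrap around
```

With the flag off, `addInts(Int.MaxValue, 1, ansiEnabled = false)` wraps around to `Int.MinValue`; with the flag on, the same call throws `ArithmeticException`.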
What does Experimental mean? Is it the same as the @Experimental annotation? Anyway, this statement is just copied from one in the SQL configuration document (#27459). So, instead of adding the prefix (Experimental) manually, I personally think it's better to add this prefix automatically via sql/gen-sql-config-docs.py. For example, how about adding an experimental method in ConfigBuilder:
val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")
.experimental()
.doc("When true, Spark tries to conform to the ANSI SQL specification: 1. Spark will " +
"throw a runtime exception if an overflow occurs in any operation on integral/decimal " +
"field. 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in " +
"the SQL parser.")
.booleanConf
.createWithDefault(false)
Then, the script would add (Experimental) at the head of its description.
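A minimal sketch of what such an `experimental()` builder flag could look like. All names here are hypothetical simplifications, not Spark's actual ConfigBuilder API:

```scala
// Hypothetical sketch, not Spark's actual ConfigBuilder: an `experimental()`
// flag that a doc generator (e.g. sql/gen-sql-config-docs.py) could use to
// prefix descriptions automatically.
final case class ConfEntry[T](key: String, doc: String, default: T, isExperimental: Boolean)

class ConfigBuilder(key: String) {
  private var isExperimental = false
  private var docText = ""

  def experimental(): ConfigBuilder = { isExperimental = true; this }
  def doc(text: String): ConfigBuilder = { docText = text; this }
  def booleanConf: ConfigBuilder = this // type tracking elided for brevity

  def createWithDefault(default: Boolean): ConfEntry[Boolean] =
    ConfEntry(key, docText, default, isExperimental)
}

// The doc-generation script would then prepend the prefix itself:
def renderDoc[T](entry: ConfEntry[T]): String =
  if (entry.isExperimental) s"(Experimental) ${entry.doc}" else entry.doc
```

This keeps the "(Experimental)" marker out of every individual doc string, so removing the flag later is a one-line change per config.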
@maropu I was following this one: https://spark.apache.org/docs/latest/configuration.html
I think it's a good idea to add a new method in ConfigBuilder, but I prefer to keep it this way in this PR, as the configuration table here is not generated from the code.
Yea, I think the fix itself in this PR looks fine. cc: @dongjoon-hyun @cloud-fan @HyukjinKwon
In general, these changes make sense to me. If we have more experimental public confs, we should consider doing it in a proper way.
Regarding the ANSI mode, we need to consider the roadmap of Spark 3.x: which features are still missing and what kind of behavior changes we plan to add.
…tions as experimental

### What changes were proposed in this pull request?
This is a follow-up of #27489. It declares the ANSI SQL compliance options as experimental in the documentation.

### Why are the changes needed?
The options are experimental. There can be new features/behaviors in future releases.

### Does this PR introduce any user-facing change?
No

### How was this patch tested?
Generating doc

Closes #27590 from gengliangwang/ExperimentalAnsi.

Authored-by: Gengliang Wang <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
(cherry picked from commit da2ca85)
Signed-off-by: HyukjinKwon <[email protected]>
  The casting behaviours are defined as store assignment rules in the standard.
- When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules.
+ When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules. This is a separate configuration because its default value is `ANSI`, while the configuration `spark.sql.ansi.enabled` is disabled by default.
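The behavioral difference between the two policies can be sketched in plain Scala. This is an illustration only, under the assumption that (as the Spark docs describe) legacy store assignment turns an invalid cast into null while ANSI store assignment fails it; `storeAsInt` and the policy objects are hypothetical, not Spark internals:

```scala
// Sketch of the behavioral difference only (hypothetical helper, not Spark
// internals): storing a string value into an Int column under the two
// store assignment policies.
sealed trait StorePolicy
case object Ansi extends StorePolicy
case object Legacy extends StorePolicy

def storeAsInt(value: String, policy: StorePolicy): Option[Int] = policy match {
  case Ansi =>
    // ANSI store assignment: invalid values fail the insert.
    Some(value.toInt) // throws NumberFormatException for non-numeric input
  case Legacy =>
    // Legacy behavior: invalid casts silently become null (None here).
    scala.util.Try(value.toInt).toOption
}
```

So `storeAsInt("123", Ansi)` yields `Some(123)`, `storeAsInt("abc", Legacy)` yields `None` (a null in the table), and `storeAsInt("abc", Ansi)` fails with an exception.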
separate -> separated?
Separate can be an adjective.
@gengliangwang @dongjoon-hyun

I think we can remove the experimental mark now. What do you think? @gengliangwang

@tanvn nice catch!