
Commit a3f1575

beliefer authored and chenzhx committed
[SPARK-39858][SQL] Remove unnecessary AliasHelper or PredicateHelper for some rules
### What changes were proposed in this pull request?

While using `AliasHelper`, I found that some rules inherit it without actually using it. This PR removes the unnecessary `AliasHelper` or `PredicateHelper` mix-in in the following cases:

- The rule inherits `AliasHelper` but never uses it. Here we can remove `AliasHelper` directly.
- The rule inherits `PredicateHelper` but never uses it. Here we can remove `PredicateHelper` directly.
- The rule inherits both `AliasHelper` and `PredicateHelper`. Since `PredicateHelper` already extends `AliasHelper`, we can remove `AliasHelper`.
- The rule inherits both `OperationHelper` and `PredicateHelper`. Since `OperationHelper` already extends `PredicateHelper`, we can remove `PredicateHelper`.
- The rule inherits both `PlanTest` and `PredicateHelper`. Since `PlanTest` already extends `PredicateHelper`, we can remove `PredicateHelper`.
- The rule inherits both `QueryTest` and `PredicateHelper`. Since `QueryTest` already extends `PredicateHelper`, we can remove `PredicateHelper`.

### Why are the changes needed?

Remove unnecessary `AliasHelper` or `PredicateHelper` for some rules.

### Does this PR introduce _any_ user-facing change?

No. This only improves the internal implementation.

### How was this patch tested?

N/A

Closes apache#37272 from beliefer/SPARK-39858.

Authored-by: Jiaan Geng <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
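The trait-inheritance fact behind the "X already extends Y" cases above can be sketched in plain Scala: mixing in a trait together with one of its own supertraits is redundant, because trait inheritance is transitive. The trait and object names below (`AliasHelperLike`, `RuleBefore`, etc.) are hypothetical stand-ins for illustration, not Spark's actual helpers:

```scala
// Hypothetical stand-ins for AliasHelper / PredicateHelper.
trait AliasHelperLike {
  def trimAliases(s: String): String = s.stripPrefix("alias:")
}

// Like Spark's PredicateHelper, this trait already extends the alias helper.
trait PredicateHelperLike extends AliasHelperLike {
  def splitConjunctivePredicates(s: String): Seq[String] = s.split(" AND ").toSeq
}

// Redundant mix-in: `with AliasHelperLike` adds nothing new,
// since PredicateHelperLike already extends it.
object RuleBefore extends PredicateHelperLike with AliasHelperLike

// Equivalent declaration, matching this commit's cleanup.
object RuleAfter extends PredicateHelperLike
```

Either declaration exposes exactly the same members: `RuleAfter.trimAliases` is available even though `AliasHelperLike` is never named, which is why dropping the extra mix-in changes nothing.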
1 parent 3e6ee7c commit a3f1575

File tree: 22 files changed (+32 −42 lines)

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala (2 additions, 2 deletions)

```diff
@@ -2369,7 +2369,7 @@ class Analyzer(override val catalogManager: CatalogManager)
    *
    * Note: CTEs are handled in CTESubstitution.
    */
-  object ResolveSubquery extends Rule[LogicalPlan] with PredicateHelper {
+  object ResolveSubquery extends Rule[LogicalPlan] {
     /**
      * Resolve the correlated expressions in a subquery, as if the expressions live in the outer
      * plan. All resolved outer references are wrapped in an [[OuterReference]]
@@ -2538,7 +2538,7 @@ class Analyzer(override val catalogManager: CatalogManager)
    * those in a HAVING clause or ORDER BY clause. These expressions are pushed down to the
    * underlying aggregate operator and then projected away after the original operator.
    */
-  object ResolveAggregateFunctions extends Rule[LogicalPlan] with AliasHelper {
+  object ResolveAggregateFunctions extends Rule[LogicalPlan] {
     def apply(plan: LogicalPlan): LogicalPlan = plan.resolveOperatorsUpWithPruning(
       _.containsPattern(AGGREGATE), ruleId) {
       // Resolve aggregate with having clause to Filter(..., Aggregate()). Note, to avoid wrongly
```

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/CostBasedJoinReorder.scala (1 addition, 1 deletion)

```diff
@@ -401,7 +401,7 @@ case class Cost(card: BigInt, size: BigInt) {
  *
  * Filters (2) and (3) are not implemented.
  */
-object JoinReorderDPFilters extends PredicateHelper {
+object JoinReorderDPFilters {
   /**
    * Builds join graph information to be used by the filtering strategies.
    * Currently, it builds the sets of star/non-star joins.
```

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala (2 additions, 2 deletions)

```diff
@@ -728,7 +728,7 @@ object LimitPushDown extends Rule[LogicalPlan] {
  * safe to pushdown Filters and Projections through it. Filter pushdown is handled by another
  * rule PushDownPredicates. Once we add UNION DISTINCT, we will not be able to pushdown Projections.
  */
-object PushProjectionThroughUnion extends Rule[LogicalPlan] with PredicateHelper {
+object PushProjectionThroughUnion extends Rule[LogicalPlan] {

   /**
    * Maps Attributes from the left side to the corresponding Attribute on the right side.
@@ -1450,7 +1450,7 @@ object PruneFilters extends Rule[LogicalPlan] with PredicateHelper {
  * This rule improves performance of predicate pushdown for cascading joins such as:
  * Filter-Join-Join-Join. Most predicates can be pushed down in a single pass.
  */
-object PushDownPredicates extends Rule[LogicalPlan] with PredicateHelper {
+object PushDownPredicates extends Rule[LogicalPlan] {
   def apply(plan: LogicalPlan): LogicalPlan = plan.transformWithPruning(
     _.containsAnyPattern(FILTER, JOIN)) {
     CombineFilters.applyLocally
```

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/expressions.scala (3 additions, 3 deletions)

```diff
@@ -84,7 +84,7 @@ object ConstantFolding extends Rule[LogicalPlan] {
  * - Using this mapping, replace occurrence of the attributes with the corresponding constant values
  *   in the AND node.
  */
-object ConstantPropagation extends Rule[LogicalPlan] with PredicateHelper {
+object ConstantPropagation extends Rule[LogicalPlan] {
   def apply(plan: LogicalPlan): LogicalPlan = plan.transformUpWithPruning(
     _.containsAllPatterns(LITERAL, FILTER), ruleId) {
     case f: Filter =>
@@ -496,7 +496,7 @@ object SimplifyBinaryComparison
 /**
  * Simplifies conditional expressions (if / case).
  */
-object SimplifyConditionals extends Rule[LogicalPlan] with PredicateHelper {
+object SimplifyConditionals extends Rule[LogicalPlan] {
   private def falseOrNullLiteral(e: Expression): Boolean = e match {
     case FalseLiteral => true
     case Literal(null, _) => true
@@ -575,7 +575,7 @@ object SimplifyConditionals extends Rule[LogicalPlan] with PredicateHelper {
 /**
  * Push the foldable expression into (if / case) branches.
  */
-object PushFoldableIntoBranches extends Rule[LogicalPlan] with PredicateHelper {
+object PushFoldableIntoBranches extends Rule[LogicalPlan] {

   // To be conservative here: it's only a guaranteed win if all but at most only one branch
   // end up being not foldable.
```

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/planning/patterns.scala (3 additions, 3 deletions)

```diff
@@ -26,7 +26,7 @@ import org.apache.spark.sql.catalyst.plans.logical._
 import org.apache.spark.sql.errors.QueryCompilationErrors
 import org.apache.spark.sql.internal.SQLConf

-trait OperationHelper extends AliasHelper with PredicateHelper {
+trait OperationHelper extends PredicateHelper {
   import org.apache.spark.sql.catalyst.optimizer.CollapseProject.canCollapseExpressions

   type ReturnType =
@@ -116,7 +116,7 @@ trait OperationHelper extends AliasHelper with PredicateHelper {
  * [[org.apache.spark.sql.catalyst.expressions.Alias Aliases]] are in-lined/substituted if
  * necessary.
  */
-object PhysicalOperation extends OperationHelper with PredicateHelper {
+object PhysicalOperation extends OperationHelper {
   override protected def legacyMode: Boolean = true
 }

@@ -125,7 +125,7 @@ object PhysicalOperation extends OperationHelper with PredicateHelper {
  * operations even if they are non-deterministic, as long as they satisfy the
  * requirement of CollapseProject and CombineFilters.
  */
-object ScanOperation extends OperationHelper with PredicateHelper {
+object ScanOperation extends OperationHelper {
   override protected def legacyMode: Boolean = false
 }
```

sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ExtractPredicatesWithinOutputSetSuite.scala (1 addition, 4 deletions)

```diff
@@ -22,10 +22,7 @@ import org.apache.spark.sql.catalyst.dsl.expressions._
 import org.apache.spark.sql.catalyst.plans.PlanTest
 import org.apache.spark.sql.types.BooleanType

-class ExtractPredicatesWithinOutputSetSuite
-  extends SparkFunSuite
-  with PredicateHelper
-  with PlanTest {
+class ExtractPredicatesWithinOutputSetSuite extends SparkFunSuite with PlanTest {
   private val a = AttributeReference("A", BooleanType)(exprId = ExprId(1))
   private val b = AttributeReference("B", BooleanType)(exprId = ExprId(2))
   private val c = AttributeReference("C", BooleanType)(exprId = ExprId(3))
```

sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/BinaryComparisonSimplificationSuite.scala (1 addition, 1 deletion)

```diff
@@ -28,7 +28,7 @@ import org.apache.spark.sql.catalyst.rules._
 import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

-class BinaryComparisonSimplificationSuite extends PlanTest with PredicateHelper {
+class BinaryComparisonSimplificationSuite extends PlanTest {

   object Optimize extends RuleExecutor[LogicalPlan] {
     val batches =
```

sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/BooleanSimplificationSuite.scala (1 addition, 1 deletion)

```diff
@@ -28,7 +28,7 @@ import org.apache.spark.sql.catalyst.plans.logical._
 import org.apache.spark.sql.catalyst.rules._
 import org.apache.spark.sql.types.BooleanType

-class BooleanSimplificationSuite extends PlanTest with ExpressionEvalHelper with PredicateHelper {
+class BooleanSimplificationSuite extends PlanTest with ExpressionEvalHelper {

   object Optimize extends RuleExecutor[LogicalPlan] {
     val batches =
```

sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/EliminateSubqueryAliasesSuite.scala (1 addition, 1 deletion)

```diff
@@ -28,7 +28,7 @@ import org.apache.spark.sql.catalyst.plans.logical._
 import org.apache.spark.sql.catalyst.rules._


-class EliminateSubqueryAliasesSuite extends PlanTest with PredicateHelper {
+class EliminateSubqueryAliasesSuite extends PlanTest {

   object Optimize extends RuleExecutor[LogicalPlan] {
     val batches = Batch("EliminateSubqueryAliases", Once, EliminateSubqueryAliases) :: Nil
```

sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/PushFoldableIntoBranchesSuite.scala (1 addition, 2 deletions)

```diff
@@ -32,8 +32,7 @@ import org.apache.spark.sql.types.{BooleanType, IntegerType, StringType, Timesta
 import org.apache.spark.unsafe.types.CalendarInterval


-class PushFoldableIntoBranchesSuite
-  extends PlanTest with ExpressionEvalHelper with PredicateHelper {
+class PushFoldableIntoBranchesSuite extends PlanTest with ExpressionEvalHelper {

   object Optimize extends RuleExecutor[LogicalPlan] {
     val batches = Batch("PushFoldableIntoBranches", FixedPoint(50),
```
