# [SPARK-43838][SQL][FOLLOWUP] Add missing aggregate in renewDuplicatedRelations
### What changes were proposed in this pull request?
This is a follow-up to #41347; it adds the missing aggregate case to `renewDuplicatedRelations`.
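As a rough illustration, here is a toy model of the idea (invented names throughout, not Spark's actual `DeduplicateRelations` code): a rule that renews duplicated relations must recurse through every plan node type, including `Aggregate`, or a relation duplicated under an aggregate keeps its stale expression IDs.

```scala
import scala.collection.mutable

// Toy plan nodes, standing in for Spark's LogicalPlan hierarchy.
sealed trait Plan
case class Relation(name: String, exprIds: Seq[Long]) extends Plan
case class Filter(child: Plan) extends Plan
case class Aggregate(child: Plan) extends Plan

object RenewToy {
  private var next = 200L
  private def freshId(): Long = { next += 1; next }

  // Recursively renew expression IDs for relations seen more than once.
  def renew(plan: Plan, seen: mutable.Set[String]): Plan = plan match {
    case r @ Relation(name, ids) =>
      if (seen.add(name)) r                        // first occurrence keeps its IDs
      else Relation(name, ids.map(_ => freshId())) // duplicate gets fresh IDs
    case Filter(child)    => Filter(renew(child, seen))
    case Aggregate(child) => Aggregate(renew(child, seen)) // the case the follow-up adds
  }

  def main(args: Array[String]): Unit = {
    val seen = mutable.Set.empty[String]
    // date_dim appears twice: once directly, once under an Aggregate.
    val first  = Filter(Relation("date_dim", Seq(24L, 200L)))
    val second = Aggregate(Relation("date_dim", Seq(24L, 200L)))
    println(renew(first, seen))  // keeps its original IDs
    println(renew(second, seen)) // duplicate under Aggregate gets fresh IDs
  }
}
```

In Spark the rewrite is more involved (the renewed IDs must also be propagated to the expressions that reference them); the toy only shows the traversal shape and why a missing node case leaves stale IDs behind.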
### Why are the changes needed?
To cover the aggregate case that #41347 missed: without it, relations duplicated under an `Aggregate` were not renewed with fresh expression IDs.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Existing tests.
Closes #42160 from Hisoka-X/SPARK-43838_subquery_aggregate_follow_up.
Authored-by: Jia Fan <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
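For context, a minimal sketch of the kind of query that exercises this path (hypothetical, not the PR's regression test): the same view is referenced both by the outer aggregate and by a scalar subquery, so the analyzer must deduplicate the relation and renew the attribute IDs of one occurrence.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical repro sketch, not the PR's actual test case.
object SubqueryAggregateDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("SPARK-43838 demo")
      .getOrCreate()

    spark.range(10).createOrReplaceTempView("t")

    // `t` appears twice: under the Aggregate and inside the scalar subquery,
    // so the analyzer must give one occurrence fresh expression IDs.
    spark.sql(
      """SELECT id % 2 AS bucket, max(id) AS max_id
        |FROM t
        |WHERE id > (SELECT min(id) FROM t)
        |GROUP BY id % 2""".stripMargin).show()

    spark.stop()
  }
}
```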
Excerpt from the first regenerated golden plan file, where the duplicated `date_dim` scan now gets a renewed attribute ID (`d_date#200` becomes `d_date#212`):

```diff
 Results [5]: [channel#37, id#38, cast(sum(sales#39)#139 as decimal(37,2)) AS sales#142, cast(sum(returns#40)#140 as decimal(37,2)) AS returns#143, cast(sum(profit#41)#141 as decimal(38,2)) AS profit#144]
 Results [5]: [channel#145, null AS id#168, sum(sales#150)#165 AS sum(sales)#169, sum(returns#151)#166 AS sum(returns)#170, sum(profit#152)#167 AS sum(profit)#171]
 Results [5]: [channel#145, null AS id#174, sum(sales#156)#171 AS sum(sales)#175, sum(returns#157)#172 AS sum(returns)#176, sum(profit#158)#173 AS sum(profit)#177]
 Results [5]: [null AS channel#195, null AS id#196, sum(sales#177)#192 AS sum(sales)#197, sum(returns#178)#193 AS sum(returns)#198, sum(profit#179)#194 AS sum(profit)#199]
 Results [5]: [null AS channel#207, null AS id#208, sum(sales#189)#204 AS sum(sales)#209, sum(returns#190)#205 AS sum(returns)#210, sum(profit#191)#206 AS sum(profit)#211]

 (86) Union

@@ -534,22 +534,22 @@ BroadcastExchange (95)
 (91) Scan parquet spark_catalog.default.date_dim
-Output [2]: [d_date_sk#24, d_date#200]
+Output [2]: [d_date_sk#24, d_date#212]
 Batched: true
 Location [not included in comparison]/{warehouse_dir}/date_dim]
```
The second golden plan file gets the same renewal:

```diff
 Results [5]: [channel#37, id#38, cast(sum(sales#39)#139 as decimal(37,2)) AS sales#142, cast(sum(returns#40)#140 as decimal(37,2)) AS returns#143, cast(sum(profit#41)#141 as decimal(38,2)) AS profit#144]
 Results [5]: [channel#145, null AS id#168, sum(sales#150)#165 AS sum(sales)#169, sum(returns#151)#166 AS sum(returns)#170, sum(profit#152)#167 AS sum(profit)#171]
 Results [5]: [channel#145, null AS id#174, sum(sales#156)#171 AS sum(sales)#175, sum(returns#157)#172 AS sum(returns)#176, sum(profit#158)#173 AS sum(profit)#177]
 Results [5]: [null AS channel#195, null AS id#196, sum(sales#177)#192 AS sum(sales)#197, sum(returns#178)#193 AS sum(returns)#198, sum(profit#179)#194 AS sum(profit)#199]
 Results [5]: [null AS channel#207, null AS id#208, sum(sales#189)#204 AS sum(sales)#209, sum(returns#190)#205 AS sum(returns)#210, sum(profit#191)#206 AS sum(profit)#211]

 (83) Union

@@ -519,22 +519,22 @@ BroadcastExchange (92)
 (88) Scan parquet spark_catalog.default.date_dim
-Output [2]: [d_date_sk#22, d_date#200]
+Output [2]: [d_date_sk#22, d_date#212]
 Batched: true
 Location [not included in comparison]/{warehouse_dir}/date_dim]
```