Conversation

@viirya (Member) commented Dec 16, 2015

JIRA: https://issues.apache.org/jira/browse/SPARK-12289

This change is needed for #10283. Without it, JavaDataFrameSuite will fail once we support UnsafeRow in LocalTableScan.

Support in Limit is added first; TakeOrderedAndProject will be added later.

@SparkQA commented Dec 16, 2015

Test build #47820 has finished for PR 10330 at commit acb3a58.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@viirya viirya changed the title [SPARK-12289][WIP][SQL] Support UnsafeRow in TakeOrderedAndProject/Limit [SPARK-12289][SQL] Support UnsafeRow in TakeOrderedAndProject/Limit Dec 17, 2015
Member:

InterpretedProjection can be replaced by UnsafeProjection?

Member Author:

I think it is ok.
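For readers following along, a minimal sketch of the suggested swap, assuming the Spark 1.6-era Catalyst API (`UnsafeProjection.create` on `SparkPlan` operators); the operator body here is illustrative, not the exact patch:

```scala
// Illustrative only: inside an operator's doExecute, build a code-generated
// unsafe projection once per partition instead of an InterpretedProjection.
protected override def doExecute(): RDD[InternalRow] = {
  child.execute().mapPartitions { iter =>
    // UnsafeProjection.create compiles the expressions and emits UnsafeRows
    // directly, so no separate safe->unsafe conversion step is needed.
    val projection = UnsafeProjection.create(projectList, child.output)
    iter.map(projection)
  }
}
```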

Member:

Sorry, I still have a dumb question. When calling eval on each of the specified expressions, how can we know they can process unsafe rows? Why does the planner insert an unsafe->safe conversion in the original design of TakeOrderedAndProject?

Member Author:

I think it is just because not all expressions supported unsafe rows before.

Member:

So now all expressions can support unsafe rows?

Member Author:

I think so. If there are expressions that still do not support unsafe rows, we should make them support it.

Member:

Got it. Thank you!

@SparkQA commented Dec 17, 2015

Test build #47892 has finished for PR 10330 at commit ecf7ec8.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Dec 17, 2015

Test build #47890 has finished for PR 10330 at commit 304c94a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Dec 17, 2015

Test build #47901 has finished for PR 10330 at commit ba06795.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Dec 17, 2015

Test build #47916 has finished for PR 10330 at commit c343447.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@viirya (Member Author) commented Dec 22, 2015

cc @davies @cloud-fan @marmbrus

Contributor:

what if the projectList is None?

Member Author:

I added a few lines to check whether we need to do an extra unsafe projection for it.

Contributor:

What you have done when projectList is None is exactly the same as ConvertToUnsafe, right? How about we change this to `if (projectList.isDefined) true else child.outputsUnsafeRows`? Then our framework can insert ConvertToUnsafe when necessary.
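A sketch of the suggested shape, assuming the 1.6-era `SparkPlan` row-format flags (`outputsUnsafeRows`, `canProcessUnsafeRows`); the class signature is taken from this PR's discussion, the field list is an assumption:

```scala
// Report unsafe output only when a projection is applied; for the pass-through
// case, defer to the child so the planner's conversion rule can insert
// ConvertToUnsafe when it is actually needed.
case class TakeOrderedAndProject(
    limit: Int,
    sortOrder: Seq[SortOrder],
    projectList: Option[Seq[NamedExpression]],
    child: SparkPlan) extends UnaryNode {

  override def outputsUnsafeRows: Boolean =
    if (projectList.isDefined) true else child.outputsUnsafeRows

  override def canProcessUnsafeRows: Boolean = true

  // ... rest of the operator unchanged
}
```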

Member Author:

OK. I've updated it.

@SparkQA commented Dec 24, 2015

Test build #48296 has finished for PR 10330 at commit fdc0097.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@viirya (Member Author) commented Dec 25, 2015

@cloud-fan any other comments?

@SparkQA commented Dec 25, 2015

Test build #48323 has finished for PR 10330 at commit 04eb37e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Contributor:

Do we need a copy here? We have already copied the rows when getting the data.

Member Author:

OK. I thought the returned row needed to be copied because it is the same object, but after checking GenerateUnsafeProjection, it looks like it creates a new row every time. I've updated it.

Member Author:

It seems there is a problem without this copy(); HiveCompatibilitySuite fails:

[info]   key    value
[info]   !== HIVE - 5 row(s) ==   == CATALYST - 5 row(s) ==
[info]   !0 val_0                 4 val_4
[info]   !0 val_0                 4 val_4
[info]   !0 val_0                 4 val_4
[info]   !2 val_2                 4 val_4
[info]    4 val_4                 4 val_4

Member Author:

I re-checked GenerateUnsafeProjection: it returns the same unsafe row instance each time, so we should keep the copy() here.

Contributor:

Oh sorry, I misread the code; the copy is needed even though we already copied before takeOrdered.
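This is the classic row-buffer aliasing pitfall, and it explains the HiveCompatibilitySuite diff above (every slot ends up showing the last-written row). An illustrative sketch, assuming the standard Catalyst behavior that a generated UnsafeProjection reuses one row buffer across calls; `rows` is a hypothetical input iterator:

```scala
// A generated UnsafeProjection reuses a single UnsafeRow buffer across calls.
val projection = UnsafeProjection.create(projectList, child.output)

// Wrong: every element of the array aliases the same mutable UnsafeRow,
// so all entries reflect whatever row was projected last.
val wrong = rows.map(projection).toArray

// Right: copy() snapshots the buffer before the next call overwrites it.
val right = rows.map(r => projection(r).copy()).toArray
```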

Contributor:

cc @marmbrus @yhuai should we remove this test?

Contributor:

Are there any other cases where we will have the conversion? Alternatively, we could create a dummy operator that only accepts safe rows, so we can still test the logic of adding conversions.

Member Author:

Added a dummy node for it. Thanks.
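The dummy node's signature appears in the build output further down (`case class DummySafeNode(limit: Int, child: SparkPlan) extends UnaryNode`); a sketch of what such a test-only operator plausibly looks like, with the flag overrides assumed from the 1.6-era row-format API:

```scala
// Test-only operator that refuses unsafe rows, forcing the planner to insert
// a ConvertToSafe so the conversion-insertion rule remains testable.
case class DummySafeNode(limit: Int, child: SparkPlan) extends UnaryNode {
  override def output: Seq[Attribute] = child.output

  // Declares that this node neither produces nor consumes UnsafeRows.
  override def outputsUnsafeRows: Boolean = false
  override def canProcessUnsafeRows: Boolean = false
  override def canProcessSafeRows: Boolean = true

  protected override def doExecute(): RDD[InternalRow] = child.execute()
}
```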

@cloud-fan (Contributor):

overall LGTM

@SparkQA commented Dec 25, 2015

Test build #48324 has finished for PR 10330 at commit 6a519d8.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Dec 26, 2015

Test build #48331 has finished for PR 10330 at commit f3054d6.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@viirya (Member Author) commented Dec 28, 2015

ping @marmbrus @yhuai Please take a look and see whether this patch is OK with you. Thanks.

@SparkQA commented Dec 29, 2015

Test build #48409 has finished for PR 10330 at commit c44c93a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • case class DummySafeNode(limit: Int, child: SparkPlan) extends UnaryNode

@viirya (Member Author) commented Dec 29, 2015

@yhuai @marmbrus Please see whether this patch is now good to merge. Thanks.

Contributor:

Would it be easier to just process UnsafeRow and output UnsafeRow?

Member Author:

So I need to close this again.....

@viirya viirya closed this Dec 30, 2015
ghost pushed a commit to dbtsai/spark that referenced this pull request Jan 1, 2016
…ut UnsafeRow

It's confusing that some operators output UnsafeRow while others do not; it makes it easy to make mistakes.

This PR changes all operators (SparkPlan) to output only UnsafeRow, and removes the rule that inserts Unsafe/Safe conversions. For those that can't output UnsafeRow directly, an UnsafeProjection is added to them.

Closes apache#10330

cc JoshRosen rxin

Author: Davies Liu <[email protected]>

Closes apache#10511 from davies/unsafe_row.
@viirya viirya deleted the limit-outputunsafe branch December 27, 2023 18:32