
Conversation

maryannxue
Contributor

What changes were proposed in this pull request?

Change the insert input schema type from "insertRelationType" to "insertRelationType.asNullable", so that the nullability of the input data is not overridden.
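To illustrate the idea, here is a minimal sketch (not the actual patch; the schema and names below are invented for illustration): using the relation's declared type directly as the insert input schema can force nullable = false onto the incoming data, while the nullable variant of that type preserves it.

import org.apache.spark.sql.types._

// Hypothetical relation schema: the table declares column "i" as non-nullable.
val insertRelationType = StructType(Seq(StructField("i", LongType, nullable = false)))

// Using insertRelationType directly as the insert input schema would force
// nullable = false onto the incoming data. Passing the nullable variant of the
// type avoids that; for this flat schema it amounts to:
val nullableVariant = StructType(insertRelationType.map(_.copy(nullable = true)))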

How was this patch tested?

Added one test in InsertSuite.

@SparkQA

SparkQA commented Jun 18, 2018

Test build #92046 has finished for PR 21585 at commit f099a67.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member

retest this please

@SparkQA

SparkQA commented Jun 19, 2018

Test build #92047 has finished for PR 21585 at commit f099a67.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@maryannxue
Contributor Author

@gatorsmile @cloud-fan Could you please review this PR?


class SimpleInsertSource extends SchemaRelationProvider {
  override def createRelation(
    sqlContext: SQLContext,
Contributor

4 spaces indentation
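For reference, a sketch of what a 4-space continuation indent would look like for this declaration (the parameter list is the standard SchemaRelationProvider signature; the body is a placeholder, not the test's actual implementation):

import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.sources.{BaseRelation, SchemaRelationProvider}
import org.apache.spark.sql.types.StructType

class SimpleInsertSource extends SchemaRelationProvider {
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String],
      schema: StructType): BaseRelation = {
    // Placeholder body; the real relation is defined in InsertSuite.
    ???
  }
}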

// Apply the schema of the existing table to the new data.
val df = sparkSession.internalCreateDataFrame(data.queryExecution.toRdd, logicalRelation.schema)
relation.insert(df, overwrite)
// Data should have been casted to the schema of the insert relation.
Contributor

it's better to mention which rule did it
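A sketch of how the comment on that last line could be reworded to name the rule (assuming the cast is applied by the analyzer rule PreprocessTableInsertion; variable names follow the snippet above):

// Apply the schema of the existing table to the new data.
val df = sparkSession.internalCreateDataFrame(data.queryExecution.toRdd, logicalRelation.schema)
relation.insert(df, overwrite)
// Data should have been cast to the schema of the insert relation by the
// analyzer rule PreprocessTableInsertion.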

compressed = false,
properties = Map.empty),
schema = schema,
provider = Some("org.apache.spark.sql.sources.SimpleInsertSource"))
Contributor

use classOf[SimpleInsertSource].getName instead of hardcoding
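The suggested change would look roughly like this (only the provider argument from the snippet above changes):

  schema = schema,
  provider = Some(classOf[SimpleInsertSource].getName))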

@cloud-fan
Contributor

thanks, LGTM

@maryannxue
Contributor Author

Done with the changes. Thanks a lot, @cloud-fan!

@SparkQA

SparkQA commented Jun 19, 2018

Test build #92093 has finished for PR 21585 at commit 049844e.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

retest this please

@SparkQA

SparkQA commented Jun 19, 2018

Test build #92092 has finished for PR 21585 at commit 03c3c90.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jun 19, 2018

Test build #92091 has finished for PR 21585 at commit bb9fa03.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jun 19, 2018

Test build #92095 has finished for PR 21585 at commit 049844e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member

Thanks! Merged to master/2.3

asfgit pushed a commit that referenced this pull request Jun 19, 2018

Author: Maryann Xue <[email protected]>

Closes #21585 from maryannxue/spark-24583.

(cherry picked from commit bc0498d)
Signed-off-by: Xiao Li <[email protected]>
asfgit closed this in bc0498d Jun 19, 2018