Conversation

@lresende
Member

Add Docker-based integration tests for DB2 JDBC dialect support.
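
For orientation, a minimal sketch of what such a suite can look like, assuming the DockerJDBCIntegrationSuite/DatabaseOnDocker scaffolding in Spark's docker-integration-tests module; the image name, environment variables, and port come from the container log later in this thread, while the database name and table fixture are illustrative:

```scala
import java.sql.Connection
import java.util.Properties

// Sketch only: assumes the DockerJDBCIntegrationSuite / DatabaseOnDocker
// scaffolding from Spark's docker-integration-tests module.
class Db2IntegrationSuite extends DockerJDBCIntegrationSuite {
  override val db = new DatabaseOnDocker {
    // Image, env, and port as seen in the DefaultDockerClient log below.
    override val imageName = "lresende/db2express-c:10.5.0.5-3.10.0"
    override val env = Map("DB2INST1_PASSWORD" -> "rootpass", "LICENSE" -> "accept")
    override val jdbcPort: Int = 50000
    // The database name "foo" is illustrative; DB2 needs it created up front.
    override def getJdbcUrl(ip: String, port: Int): String =
      s"jdbc:db2://$ip:$port/foo:user=db2inst1;password=rootpass;"
  }

  // Illustrative fixture matching the 'Basic test' seen in the logs.
  override def dataPreparation(conn: Connection): Unit = {
    conn.prepareStatement("CREATE TABLE tbl (x INTEGER, y VARCHAR(8))").executeUpdate()
    conn.prepareStatement("INSERT INTO tbl VALUES (42, 'fred')").executeUpdate()
  }

  test("Basic test") {
    val df = sqlContext.read.jdbc(jdbcUrl, "tbl", new Properties)
    assert(df.collect().length == 1)
  }
}
```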

@JoshRosen
Contributor

Jenkins, this is ok to test.

@SparkQA

SparkQA commented Nov 22, 2015

Test build #46496 has finished for PR 9893 at commit dd0fe5f.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class Db2IntegrationSuite extends DockerJDBCIntegrationSuite
    • assert(types(0).equals("class java.lang.Integer"))
    • assert(types(1).equals("class java.lang.String"))
    • assert(types(0).equals("class java.lang.Boolean"))
    • assert(types(1).equals("class java.lang.Long"))
    • assert(types(2).equals("class java.lang.Integer"))
    • assert(types(3).equals("class java.lang.Integer"))
    • assert(types(4).equals("class java.lang.Integer"))
    • assert(types(5).equals("class java.lang.Long"))
    • assert(types(6).equals("class java.math.BigDecimal"))
    • assert(types(7).equals("class java.lang.Double"))
    • assert(types(8).equals("class java.lang.Double"))
    • assert(types(0).equals("class java.sql.Date"))
    • assert(types(1).equals("class java.sql.Timestamp"))
    • assert(types(2).equals("class java.sql.Timestamp"))
    • assert(types(3).equals("class java.sql.Timestamp"))
    • assert(types(4).equals("class java.sql.Date"))
    • assert(types(0).equals("class java.lang.String"))
    • assert(types(1).equals("class java.lang.String"))
    • assert(types(2).equals("class java.lang.String"))
    • assert(types(3).equals("class java.lang.String"))
    • assert(types(4).equals("class java.lang.String"))
    • assert(types(5).equals("class java.lang.String"))
    • assert(types(6).equals("class [B"))
    • assert(types(7).equals("class [B"))
    • assert(types(8).equals("class [B"))
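
The assertions the bot lists follow the usual pattern in these docker JDBC suites: read a table through the dialect, collect the rows, and check which Java class Spark mapped each column to. Roughly, as a sketch (the table name "numbers" is illustrative):

```scala
import java.util.Properties

// Read a DB2 table through the JDBC data source and inspect the Java
// classes of the values in the first row.
val df = sqlContext.read.jdbc(jdbcUrl, "numbers", new Properties)
val types = df.collect()(0).toSeq.map(_.getClass.toString)
assert(types(0).equals("class java.lang.Integer"))
```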

@lresende
Member Author

Jenkins, this is ok to test.

@SparkQA

SparkQA commented Dec 16, 2015

Test build #47866 has finished for PR 9893 at commit 3772ea3.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@lresende
Member Author

@JoshRosen, this is currently failing due to a missing dependency; could you help add the DB2 JARs to the Jenkins slaves? The directions are available in the pom.xml in the following commit: lresende@3772ea3
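
The install directions themselves live in the pom; a manual install into the local Maven cache typically uses Maven's install-file goal. A sketch follows; the file name and coordinates here are illustrative, not the exact ones from the pom:

```sh
mvn install:install-file \
  -Dfile=db2jcc4.jar \
  -DgroupId=com.ibm.db2 \
  -DartifactId=db2jcc4 \
  -Dversion=10.5 \
  -Dpackaging=jar
```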

@JoshRosen
Contributor

What's the licensing for the DB2 JARs? Could we publish them to Maven under the Spark project namespace? Just wondering if there's a way to avoid having to add this manual install.

@SparkQA

SparkQA commented Jan 15, 2016

Test build #49495 has finished for PR 9893 at commit 679e701.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jan 18, 2016

Test build #49563 has finished for PR 9893 at commit 48b7a5f.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

You might want to check: PR #10796 is reverting this exact line.

Perhaps if you rebase onto the latest master, this test will be fixed.

@lresende lresende changed the title [SPARK-10521][SQL][WIP] Utilize Docker for test DB2 JDBS Dialect support [SPARK-10521][SQL] Utilize Docker for test DB2 JDBS Dialect support Jan 22, 2016
@lresende lresende changed the title [SPARK-10521][SQL] Utilize Docker for test DB2 JDBS Dialect support [SPARK-10521][SQL] Utilize Docker for test DB2 JDBC Dialect suppor Jan 22, 2016
@lresende lresende changed the title [SPARK-10521][SQL] Utilize Docker for test DB2 JDBC Dialect suppor [SPARK-10521][SQL] Utilize Docker for test DB2 JDBC Dialect support Jan 22, 2016
@lresende
Member Author

@JoshRosen For this PR to work, the JDBC driver needs to be deployed manually, as described in the pom. This is similar to the requirement that, in order to run these tests, people must have Docker installed.

As for publishing the driver to Maven, I am trying to get that approved, but for now people can download the driver manually via the link described in the pom.

As for running these, I have run them on our Jenkins environment; see the results of the DB2 integration tests below.

Could you please review this, help with the Jenkins slave configuration, and merge it so we can move forward with the rest of the DB2 dialect PRs waiting on this?

Db2IntegrationSuite:
16/01/22 12:26:56 INFO Slf4jLogger: Slf4jLogger started
16/01/22 12:26:56 INFO Remoting: Starting remoting
16/01/22 12:26:56 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@localhost:33460]
16/01/22 12:26:56 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 33460.
16/01/22 12:26:56 INFO SparkEnv: Registering MapOutputTracker
16/01/22 12:26:56 INFO SparkEnv: Registering BlockManagerMaster
16/01/22 12:26:56 INFO DiskBlockManager: Created local directory at /a/workspace/Spark-Build_and_Test_with_docker/docker-integration-tests/target/tmp/blockmgr-73d7995e-9327-46d8-9a4e-3a9b991e2782
16/01/22 12:26:56 INFO MemoryStore: MemoryStore started with capacity 2.0 GB
16/01/22 12:26:56 INFO SparkEnv: Registering OutputCommitCoordinator
16/01/22 12:26:56 INFO Executor: Starting executor ID driver on host localhost
16/01/22 12:26:56 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36866.
16/01/22 12:26:56 INFO NettyBlockTransferService: Server created on 36866
16/01/22 12:26:56 INFO BlockManagerMaster: Trying to register BlockManager
16/01/22 12:26:56 INFO BlockManagerMasterEndpoint: Registering block manager localhost:36866 with 2.0 GB RAM, BlockManagerId(driver, localhost, 36866)
16/01/22 12:26:56 INFO BlockManagerMaster: Registered BlockManager
16/01/22 12:26:56 INFO DefaultDockerClient: Creating container with ContainerConfig: ContainerConfig{hostname=null, domainname=null, username=null, attachStdin=null, attachStdout=null, attachStderr=null, portSpecs=null, exposedPorts=[50000/tcp], tty=null, openStdin=null, stdinOnce=null, env=[DB2INST1_PASSWORD=rootpass, LICENSE=accept], cmd=[db2start], image=lresende/db2express-c:10.5.0.5-3.10.0, volumes=null, workingDir=null, entrypoint=null, networkDisabled=false, onBuild=null, labels=null, macAddress=null, hostConfig=HostConfig{binds=null, containerIDFile=null, lxcConf=null, privileged=null, portBindings={50000/tcp=[PortBinding{hostIp=9.30.122.152, hostPort=51408}]}, links=null, publishAllPorts=null, dns=null, dnsSearch=null, extraHosts=null, volumesFrom=null, networkMode=bridge, securityOpt=null, memory=null, memorySwap=null, cpuShares=null, cpusetCpus=null, cpuQuota=null, cgroupParent=null}}

16/01/22 12:26:58 INFO DefaultDockerClient: Starting container with Id: bc6620ff6e3c41521e48752164ded0290e4d97c30ce1c30d68436fa45ea6863c

16/01/22 12:27:05 INFO Db2IntegrationSuite:

===== TEST OUTPUT FOR o.a.s.sql.jdbc.Db2IntegrationSuite: 'Basic test' =====

16/01/22 12:27:05 INFO SparkContext: Starting job: apply at Transformer.scala:22
16/01/22 12:27:05 INFO DAGScheduler: Got job 0 (apply at Transformer.scala:22) with 1 output partitions
16/01/22 12:27:05 INFO DAGScheduler: Final stage: ResultStage 0 (apply at Transformer.scala:22)
16/01/22 12:27:05 INFO DAGScheduler: Parents of final stage: List()
16/01/22 12:27:05 INFO DAGScheduler: Missing parents: List()
16/01/22 12:27:05 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at apply at Transformer.scala:22), which has no missing parents
16/01/22 12:27:05 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 6.6 KB, free 6.6 KB)
16/01/22 12:27:05 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 3.3 KB, free 9.9 KB)
16/01/22 12:27:05 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:36866 (size: 3.3 KB, free: 2.0 GB)
16/01/22 12:27:05 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1010
16/01/22 12:27:05 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at apply at Transformer.scala:22)
16/01/22 12:27:05 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/01/22 12:27:05 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 1972 bytes)
16/01/22 12:27:05 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/01/22 12:27:05 INFO JDBCRDD: closed connection
16/01/22 12:27:05 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1476 bytes result sent to driver
16/01/22 12:27:05 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 289 ms on localhost (1/1)
16/01/22 12:27:05 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/01/22 12:27:05 INFO DAGScheduler: ResultStage 0 (apply at Transformer.scala:22) finished in 0.290 s
16/01/22 12:27:05 INFO DAGScheduler: Job 0 finished: apply at Transformer.scala:22, took 0.299543 s
16/01/22 12:27:05 INFO Db2IntegrationSuite:

===== FINISHED o.a.s.sql.jdbc.Db2IntegrationSuite: 'Basic test' =====

16/01/22 12:27:05 INFO Db2IntegrationSuite:

===== TEST OUTPUT FOR o.a.s.sql.jdbc.Db2IntegrationSuite: 'Numeric types' =====

  • Basic test
    16/01/22 12:27:06 INFO SparkContext: Starting job: apply at Transformer.scala:22
    16/01/22 12:27:06 INFO DAGScheduler: Got job 1 (apply at Transformer.scala:22) with 1 output partitions
    16/01/22 12:27:06 INFO DAGScheduler: Final stage: ResultStage 1 (apply at Transformer.scala:22)
    16/01/22 12:27:06 INFO DAGScheduler: Parents of final stage: List()
    16/01/22 12:27:06 INFO DAGScheduler: Missing parents: List()
    16/01/22 12:27:06 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at apply at Transformer.scala:22), which has no missing parents
    16/01/22 12:27:06 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 7.4 KB, free 17.3 KB)
    16/01/22 12:27:06 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.7 KB, free 21.0 KB)
    16/01/22 12:27:06 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:36866 (size: 3.7 KB, free: 2.0 GB)
    16/01/22 12:27:06 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1010
    16/01/22 12:27:06 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at apply at Transformer.scala:22)
    16/01/22 12:27:06 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
    16/01/22 12:27:06 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, partition 0,PROCESS_LOCAL, 1972 bytes)
    16/01/22 12:27:06 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)

16/01/22 12:27:06 INFO CodeGenerator: Code generated in 12.625054 ms
16/01/22 12:27:06 INFO JDBCRDD: closed connection
16/01/22 12:27:06 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1467 bytes result sent to driver
16/01/22 12:27:06 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 392 ms on localhost (1/1)
16/01/22 12:27:06 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
16/01/22 12:27:06 INFO DAGScheduler: ResultStage 1 (apply at Transformer.scala:22) finished in 0.393 s
16/01/22 12:27:06 INFO DAGScheduler: Job 1 finished: apply at Transformer.scala:22, took 0.400140 s
16/01/22 12:27:06 INFO Db2IntegrationSuite:

===== FINISHED o.a.s.sql.jdbc.Db2IntegrationSuite: 'Numeric types' =====

  • Numeric types
    16/01/22 12:27:06 INFO Db2IntegrationSuite:

===== TEST OUTPUT FOR o.a.s.sql.jdbc.Db2IntegrationSuite: 'Date types' =====

16/01/22 12:27:06 INFO SparkContext: Starting job: apply at Transformer.scala:22
16/01/22 12:27:06 INFO DAGScheduler: Got job 2 (apply at Transformer.scala:22) with 1 output partitions
16/01/22 12:27:06 INFO DAGScheduler: Final stage: ResultStage 2 (apply at Transformer.scala:22)
16/01/22 12:27:06 INFO DAGScheduler: Parents of final stage: List()
16/01/22 12:27:06 INFO DAGScheduler: Missing parents: List()
16/01/22 12:27:06 INFO DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[8] at apply at Transformer.scala:22), which has no missing parents
16/01/22 12:27:06 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 6.7 KB, free 27.6 KB)
16/01/22 12:27:06 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 3.4 KB, free 31.0 KB)
16/01/22 12:27:06 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:36866 (size: 3.4 KB, free: 2.0 GB)
16/01/22 12:27:06 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1010
16/01/22 12:27:06 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (MapPartitionsRDD[8] at apply at Transformer.scala:22)
16/01/22 12:27:06 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
16/01/22 12:27:06 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, partition 0,PROCESS_LOCAL, 1972 bytes)
16/01/22 12:27:06 INFO Executor: Running task 0.0 in stage 2.0 (TID 2)

16/01/22 12:27:07 INFO CodeGenerator: Code generated in 11.399165 ms
16/01/22 12:27:07 INFO JDBCRDD: closed connection
16/01/22 12:27:07 INFO Executor: Finished task 0.0 in stage 2.0 (TID 2). 1427 bytes result sent to driver
16/01/22 12:27:07 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 315 ms on localhost (1/1)
16/01/22 12:27:07 INFO DAGScheduler: ResultStage 2 (apply at Transformer.scala:22) finished in 0.315 s
16/01/22 12:27:07 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
16/01/22 12:27:07 INFO DAGScheduler: Job 2 finished: apply at Transformer.scala:22, took 0.321779 s
16/01/22 12:27:07 INFO Db2IntegrationSuite:

===== FINISHED o.a.s.sql.jdbc.Db2IntegrationSuite: 'Date types' =====

  • Date types
    16/01/22 12:27:07 INFO Db2IntegrationSuite:

===== TEST OUTPUT FOR o.a.s.sql.jdbc.Db2IntegrationSuite: 'String types' =====

16/01/22 12:27:07 INFO SparkContext: Starting job: apply at Transformer.scala:22
16/01/22 12:27:07 INFO DAGScheduler: Got job 3 (apply at Transformer.scala:22) with 1 output partitions
16/01/22 12:27:07 INFO DAGScheduler: Final stage: ResultStage 3 (apply at Transformer.scala:22)
16/01/22 12:27:07 INFO DAGScheduler: Parents of final stage: List()
16/01/22 12:27:07 INFO DAGScheduler: Missing parents: List()
16/01/22 12:27:07 INFO DAGScheduler: Submitting ResultStage 3 (MapPartitionsRDD[11] at apply at Transformer.scala:22), which has no missing parents
16/01/22 12:27:07 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 6.7 KB, free 37.7 KB)
16/01/22 12:27:07 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 3.4 KB, free 41.1 KB)
16/01/22 12:27:07 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:36866 (size: 3.4 KB, free: 2.0 GB)
16/01/22 12:27:07 INFO SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:1010
16/01/22 12:27:07 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 3 (MapPartitionsRDD[11] at apply at Transformer.scala:22)
16/01/22 12:27:07 INFO TaskSchedulerImpl: Adding task set 3.0 with 1 tasks
16/01/22 12:27:07 INFO TaskSetManager: Starting task 0.0 in stage 3.0 (TID 3, localhost, partition 0,PROCESS_LOCAL, 1972 bytes)
16/01/22 12:27:07 INFO Executor: Running task 0.0 in stage 3.0 (TID 3)
16/01/22 12:27:07 INFO CodeGenerator: Code generated in 8.298278 ms
16/01/22 12:27:07 INFO JDBCRDD: closed connection
16/01/22 12:27:07 INFO Executor: Finished task 0.0 in stage 3.0 (TID 3). 1475 bytes result sent to driver
16/01/22 12:27:07 INFO TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 360 ms on localhost (1/1)
16/01/22 12:27:07 INFO TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool
16/01/22 12:27:07 INFO DAGScheduler: ResultStage 3 (apply at Transformer.scala:22) finished in 0.361 s
16/01/22 12:27:07 INFO DAGScheduler: Job 3 finished: apply at Transformer.scala:22, took 0.368710 s
16/01/22 12:27:07 INFO Db2IntegrationSuite:

===== FINISHED o.a.s.sql.jdbc.Db2IntegrationSuite: 'String types' =====

  • String types
    16/01/22 12:27:07 INFO Db2IntegrationSuite:

===== TEST OUTPUT FOR o.a.s.sql.jdbc.Db2IntegrationSuite: 'Basic write test' =====

16/01/22 12:27:09 INFO SparkContext: Starting job: apply at Transformer.scala:22
16/01/22 12:27:09 INFO DAGScheduler: Got job 4 (apply at Transformer.scala:22) with 1 output partitions
16/01/22 12:27:09 INFO DAGScheduler: Final stage: ResultStage 4 (apply at Transformer.scala:22)
16/01/22 12:27:09 INFO DAGScheduler: Parents of final stage: List()
16/01/22 12:27:09 INFO DAGScheduler: Missing parents: List()
16/01/22 12:27:09 INFO DAGScheduler: Submitting ResultStage 4 (MapPartitionsRDD[14] at apply at Transformer.scala:22), which has no missing parents
16/01/22 12:27:09 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 7.1 KB, free 48.2 KB)
16/01/22 12:27:09 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 3.6 KB, free 51.9 KB)
16/01/22 12:27:09 INFO BlockManagerInfo: Added broadcast_4_piece0 in memory on localhost:36866 (size: 3.6 KB, free: 2.0 GB)
16/01/22 12:27:09 INFO SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:1010
16/01/22 12:27:09 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 4 (MapPartitionsRDD[14] at apply at Transformer.scala:22)
16/01/22 12:27:09 INFO TaskSchedulerImpl: Adding task set 4.0 with 1 tasks
16/01/22 12:27:09 INFO TaskSetManager: Starting task 0.0 in stage 4.0 (TID 4, localhost, partition 0,PROCESS_LOCAL, 1972 bytes)
16/01/22 12:27:09 INFO Executor: Running task 0.0 in stage 4.0 (TID 4)
16/01/22 12:27:09 INFO JDBCRDD: closed connection
16/01/22 12:27:09 INFO Executor: Finished task 0.0 in stage 4.0 (TID 4). 1165 bytes result sent to driver
16/01/22 12:27:09 INFO TaskSetManager: Finished task 0.0 in stage 4.0 (TID 4) in 387 ms on localhost (1/1)
16/01/22 12:27:09 INFO TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool
16/01/22 12:27:09 INFO DAGScheduler: ResultStage 4 (apply at Transformer.scala:22) finished in 0.388 s
16/01/22 12:27:09 INFO DAGScheduler: Job 4 finished: apply at Transformer.scala:22, took 0.394228 s

16/01/22 12:27:10 INFO SparkContext: Starting job: apply at Transformer.scala:22
16/01/22 12:27:10 INFO DAGScheduler: Got job 5 (apply at Transformer.scala:22) with 1 output partitions
16/01/22 12:27:10 INFO DAGScheduler: Final stage: ResultStage 5 (apply at Transformer.scala:22)
16/01/22 12:27:10 INFO DAGScheduler: Parents of final stage: List()
16/01/22 12:27:10 INFO DAGScheduler: Missing parents: List()
16/01/22 12:27:10 INFO DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[17] at apply at Transformer.scala:22), which has no missing parents
16/01/22 12:27:10 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 7.2 KB, free 59.1 KB)
16/01/22 12:27:10 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 3.7 KB, free 62.8 KB)
16/01/22 12:27:10 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on localhost:36866 (size: 3.7 KB, free: 2.0 GB)
16/01/22 12:27:10 INFO SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:1010
16/01/22 12:27:10 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 5 (MapPartitionsRDD[17] at apply at Transformer.scala:22)
16/01/22 12:27:10 INFO TaskSchedulerImpl: Adding task set 5.0 with 1 tasks
16/01/22 12:27:10 INFO TaskSetManager: Starting task 0.0 in stage 5.0 (TID 5, localhost, partition 0,PROCESS_LOCAL, 1972 bytes)
16/01/22 12:27:10 INFO Executor: Running task 0.0 in stage 5.0 (TID 5)
16/01/22 12:27:10 INFO JDBCRDD: closed connection
16/01/22 12:27:10 INFO Executor: Finished task 0.0 in stage 5.0 (TID 5). 1165 bytes result sent to driver
16/01/22 12:27:10 INFO TaskSetManager: Finished task 0.0 in stage 5.0 (TID 5) in 433 ms on localhost (1/1)
16/01/22 12:27:10 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool
16/01/22 12:27:10 INFO DAGScheduler: ResultStage 5 (apply at Transformer.scala:22) finished in 0.435 s
16/01/22 12:27:10 INFO DAGScheduler: Job 5 finished: apply at Transformer.scala:22, took 0.441109 s
16/01/22 12:27:10 INFO Db2IntegrationSuite:

===== FINISHED o.a.s.sql.jdbc.Db2IntegrationSuite: 'Basic write test' =====

  • Basic write test

16/01/22 12:27:16 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/01/22 12:27:16 INFO MemoryStore: MemoryStore cleared
16/01/22 12:27:16 INFO BlockManager: BlockManager stopped
16/01/22 12:27:16 INFO BlockManagerMaster: BlockManagerMaster stopped
16/01/22 12:27:16 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/01/22 12:27:16 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/01/22 12:27:16 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/01/22 12:27:16 INFO SparkContext: Successfully stopped SparkContext
16/01/22 12:27:16 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
Run completed in 58 seconds, 288 milliseconds.
Total number of tests run: 12
Suites: completed 4, aborted 0
Tests: succeeded 12, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
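
For reference, the ContainerConfig logged at the start of this run maps to roughly the following Spotify docker-client calls (the client behind the DefaultDockerClient log lines). A sketch only: the image, env, command, and ports are copied from the log above, and the host IP/port will differ per run:

```scala
import com.spotify.docker.client.DefaultDockerClient
import com.spotify.docker.client.messages.{ContainerConfig, HostConfig, PortBinding}

import scala.collection.JavaConverters._

// Connect using DOCKER_HOST / DOCKER_CERT_PATH from the environment.
val docker = DefaultDockerClient.fromEnv().build()

// Bind the container's DB2 port (50000/tcp) to a host port, as in the log.
val hostConfig = HostConfig.builder()
  .networkMode("bridge")
  .portBindings(
    Map("50000/tcp" -> List(PortBinding.of("0.0.0.0", 51408)).asJava).asJava)
  .build()

val config = ContainerConfig.builder()
  .image("lresende/db2express-c:10.5.0.5-3.10.0")
  .env("DB2INST1_PASSWORD=rootpass", "LICENSE=accept")
  .exposedPorts("50000/tcp")
  .cmd("db2start")
  .hostConfig(hostConfig)
  .build()

// Create and start the container, then wait for DB2 to come up.
val containerId = docker.createContainer(config).id()
docker.startContainer(containerId)
```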

@lresende
Member Author

Jenkins, this is ok to test.

@JoshRosen
Contributor

Getting this installed cluster-wide is going to be non-trivial: our CI process has separate Ivy caches for every build workspace, so the installation of the JDBC driver JAR isn't a one-time process but needs to be automated. Right now, we have the ability to just wipe out a build workspace in case its caches get corrupted, since everything will be reconstituted by fetching from Maven Central.

As a result, I'd like to hold off on merging this until we get that driver JAR published to a public Maven repository or have some other means to allow SBT and Maven to obtain it automatically. One alternative approach would be to check the DB2 JDBC JAR itself into the Spark repository; this isn't ideal, but AFAIK it's technically okay for us to have binary artifacts in our repo as long as they're only for testing. I'm not sure whether IBM's licensing terms would permit this, though.

@lresende
Member Author

Jenkins, this is ok to test.

@SparkQA

SparkQA commented Jan 27, 2016

Test build #50214 has finished for PR 9893 at commit 7dbf310.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • assert(types(0).equals("class java.lang.Integer"))
    • assert(types(1).equals("class java.lang.Integer"))
    • assert(types(2).equals("class java.lang.Long"))
    • assert(types(3).equals("class java.math.BigDecimal"))
    • assert(types(4).equals("class java.lang.Double"))
    • assert(types(5).equals("class java.lang.Double"))
    • assert(types(3).equals("class [B"))

Contributor

Is this repository going to stay around indefinitely? I don't like depending on things that aren't in Maven Central, since we've been bitten in the past by repositories changing artifacts, disappearing, or having uptime problems.

Member Author

This should be good for now. In parallel, I am working on getting an official IBM Maven repository with these drivers, but it's a bureaucratic process.

@SparkQA

SparkQA commented Jan 27, 2016

Test build #50222 has finished for PR 9893 at commit 2374ee4.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • assert(types(0).equals("class java.lang.Integer"))
    • assert(types(1).equals("class java.lang.Integer"))
    • assert(types(2).equals("class java.lang.Long"))
    • assert(types(3).equals("class java.math.BigDecimal"))
    • assert(types(4).equals("class java.lang.Double"))
    • assert(types(5).equals("class java.lang.Double"))
    • assert(types(3).equals("class [B"))

@SparkQA

SparkQA commented Jan 28, 2016

Test build #50247 has finished for PR 9893 at commit ab6d601.

  • This patch fails build dependency tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jan 28, 2016

Test build #50249 has finished for PR 9893 at commit 75cb0f6.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 9, 2016

Test build #55408 has finished for PR 9893 at commit abe4a33.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

lresende added 13 commits April 8, 2016 18:05
The pom.xml was also updated with instructions on
how to install the drivers into a local repository
or directly into your local Maven cache.
Follow the same password pattern as other DB dialects to avoid
a warning about the password containing parts of the username
DB2 requires a database to be created via DB2 tools before
a JDBC connection can be established
Upgrade docker-client to the latest version, which supports passing
the ipcMode configuration to Docker and fixes DB2 shared-memory
issues (see the sketch after this list).
…machine

The AMPLab Jenkins slaves are not running the recommended Linux
kernel level, and this causes issues when trying to use shared
memory, which prevents the DB2 server from starting.
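
The shared-memory fix mentioned in the docker-client upgrade above corresponds to the IPC mode setting on the client's HostConfig; a minimal sketch, assuming the upgraded Spotify docker-client API:

```scala
import com.spotify.docker.client.messages.HostConfig

// Run the container in the host's IPC namespace so DB2 can allocate
// the shared-memory segments it needs at startup.
val hostConfig = HostConfig.builder()
  .ipcMode("host")
  .build()
```
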
@SparkQA

SparkQA commented Apr 9, 2016

Test build #55409 has finished for PR 9893 at commit 9c87286.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@lresende
Member Author

lresende commented Apr 9, 2016

Jenkins, retest this please.

@lresende
Member Author

lresende commented Apr 9, 2016

@JoshRosen, thanks for reviewing; I agree with your proposal regarding the repository.

@SparkQA

SparkQA commented Apr 9, 2016

Test build #55410 has finished for PR 9893 at commit 9c87286.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 9, 2016

Test build #55411 has finished for PR 9893 at commit 31e1d8b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@lresende
Member Author

lresende commented Apr 9, 2016

@JoshRosen All good, style regression fixed, all tests passing.

@JoshRosen
Contributor

Jenkins, retest this please.

@JoshRosen
Contributor

Looks like this compiled and fetched dependencies properly, so I'm going to merge to master. Let's remember to re-enable these tests once we upgrade Jenkins' kernel.

@asfgit asfgit closed this in 94de630 Apr 11, 2016
@shaneknapp
Contributor

Sounds good. I'm meeting with the Mesosphere guys Thursday, and this is a
topic I'm planning on bringing up.


@lresende lresende deleted the SPARK-10521 branch April 12, 2016 00:10
@SparkQA

SparkQA commented Apr 12, 2016

Test build #55542 has finished for PR 9893 at commit 31e1d8b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
