[SPARK-6136] [SQL] Removed JDBC integration tests which depends on docker-client #4872
Conversation
Test build #28229 has started for PR 4872 at commit.

Test build #28229 has finished for PR 4872 at commit.

Test FAILed.

Test build #28230 has started for PR 4872 at commit.
Isn't it a little extreme to remove the tests? What about just excluding the Guava dep so that 14.0 is used? It may just work.
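For reference, a minimal sketch of what that suggested exclusion could look like in sbt terms, assuming the coordinates mentioned in this PR (`com.spotify:docker-client:2.7.5` as a test dependency); the real change would live in the module's build file, which is not shown in this thread.

```scala
// Hypothetical sbt sketch of the alternative discussed above: keep the
// docker-client test dependency but strip its transitive Guava, so that
// Spark's own Guava 14.0 stays on the test classpath.
libraryDependencies += ("com.spotify" % "docker-client" % "2.7.5" % "test")
  .exclude("com.google.guava", "guava")
```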
Test build #28230 has finished for PR 4872 at commit.

Test PASSed.
@srowen These two test suites do a bunch of time-consuming jobs like setting up Docker containers, pulling in MySQL/PostgreSQL, etc., and are thus ignored from the very beginning. (You may see the …
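For readers unfamiliar with how such suites stay dormant, here is a minimal ScalaTest sketch; the names are placeholders, not the actual `MySQLIntegration` code.

```scala
import org.scalatest.FunSuite

// Illustrative only: `ignore` registers the test without ever running it,
// so the suite still compiles on every build but never pulls a MySQL
// container or talks to Docker.
class MySQLIntegrationSketch extends FunSuite {
  ignore("query a MySQL table through the JDBC data source") {
    // start a container, run JDBC queries against it, tear the container down
  }
}
```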
/cc @marmbrus Just to double-check, is it OK to remove them?

@liancheng Ah, I looked right past the …

+1 to moving these to integration tests, especially if they are causing build problems. Can you make sure there is a JIRA somewhere (where?) so that we don't forget, though. These tests actually did find all kinds of quirks with the various drivers.

@marmbrus @srowen Filed SPARK-6147 to track this. Thanks for the comments!
Integration test suites in the JDBC data source (`MySQLIntegration` and `PostgresIntegration`) depend on docker-client 2.7.5, which transitively depends on Guava 17.0. Unfortunately, Guava 17.0 causes test-runtime binary compatibility issues when Spark is compiled against Hive 0.12.0 or Hadoop 2.4.

Considering `MySQLIntegration` and `PostgresIntegration` are ignored right now, I'd suggest moving them from the Spark project to the [Spark integration tests][1] project. This PR removes both the JDBC data source integration tests and the docker-client test dependency.

[1]: https://github.com/databricks/spark-integration-tests

Author: Cheng Lian <[email protected]>

Closes #4872 from liancheng/remove-docker-client and squashes the following commits:

1f4169e [Cheng Lian] Removes DockerHacks
159b24a [Cheng Lian] Removed JDBC integration tests which depends on docker-client

(cherry picked from commit 76b472f)
Signed-off-by: Cheng Lian <[email protected]>
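When two Guava versions collide like this, it can help to confirm which jar actually wins on the test classpath. A small diagnostic sketch, not part of this PR:

```scala
// Prints the location of the jar that provided Guava at runtime
// (e.g. .../guava-17.0.jar), which is where binary-compatibility
// mismatches against Hive 0.12.0 / Hadoop 2.4 would originate.
object GuavaOnClasspath {
  def main(args: Array[String]): Unit = {
    val guavaClass = classOf[com.google.common.collect.ImmutableList[_]]
    val source = Option(guavaClass.getProtectionDomain.getCodeSource)
    println(s"Guava loaded from: ${source.map(_.getLocation).getOrElse("bootstrap classpath")}")
  }
}
```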
This patch re-enables tests for the Docker JDBC data source. These tests were reverted in #4872 due to transitive dependency conflicts introduced by the `docker-client` library. This patch should avoid those problems by using a version of `docker-client` which shades its transitive dependencies and by performing some build magic to work around problems with that shaded JAR.

In addition, I significantly refactored the tests to simplify the setup and teardown code and to fix several Docker networking issues which caused problems when running in `boot2docker`.

Closes #8101.

Author: Josh Rosen <[email protected]>
Author: Yijie Shen <[email protected]>

Closes #9503 from JoshRosen/docker-jdbc-tests.

(cherry picked from commit 1dde39d)
Signed-off-by: Reynold Xin <[email protected]>
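A sketch of what the shaded-dependency part of that fix could look like, again in sbt terms; the version and the `shaded` classifier shown here are assumptions based on the description above, not the exact coordinates used by the patch.

```scala
// Hypothetical sbt sketch: depend on docker-client's shaded artifact so its
// transitive Guava is relocated inside the jar instead of leaking onto
// Spark's test classpath.
libraryDependencies +=
  ("com.spotify" % "docker-client" % "3.6.6" % "test").classifier("shaded")
```

Compared with the exclusion approach discussed earlier in the thread, shading keeps docker-client running against the Guava it was built with, rather than hoping it still works against Spark's older Guava.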