Conversation

@steveloughran
Contributor

How was this patch tested?

Testing in progress; still trying to get the ITests working.

JUnit5 update complicates things here, as it highlights that minicluster tests aren't working.

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
  • Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE, LICENSE-binary, NOTICE-binary files?

@pan3793
Member

pan3793 commented Aug 19, 2025

JUnit5 update complicates things here, as it highlights that minicluster tests aren't working.

I found hadoop-client-runtime and hadoop-client-minicluster broken during integration with Spark; HADOOP-19652 plus YARN-11824 recover that. Is it the same issue?

@steveloughran
Contributor Author

@pan3793 maybe.

What is unrelated: out of the box, the SDK doesn't do bulk delete with third-party stores which support it (Dell ECS).

org.apache.hadoop.fs.s3a.AWSBadRequestException: bulkDelete on job-00-fork-0001/test/org.apache.hadoop.fs.contract.s3a.ITestS3AContractBulkDelete: software.amazon.awssdk.services.s3.model.InvalidRequestException: Missing required header for this request: Content-MD5 (Service: S3, Status Code: 400, Request ID: 0c07c87d:196d43d824a:d5329:91d, Extended Request ID: 85e1d41b57b608d4e58222b552dea52902e93b05a12f63f54730ae77769df8d1) (SDK Attempt Count: 1):InvalidRequest: Missing required header for this request: Content-MD5 (Service: S3, Status Code: 400, Request ID: 0c07c87d:196d43d824a:d5329:91d, Extended Request ID: 85e1d41b57b608d4e58222b552dea52902e93b05a12f63f54730ae77769df8d1) (SDK Attempt Count: 1)

@steveloughran
Contributor Author

@pan3793 no, it's lifecycle related. The test needs to set up that minicluster before the test cases, and that's somehow not happening.
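
For context, a minimal sketch of the lifecycle pattern this relies on; illustrative only, not the actual Hadoop test code. A classic JUnit 5 migration failure is keeping the JUnit 4 @BeforeClass annotation, which JUnit 5 silently ignores, so the minicluster is never started:

import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;

public class MiniClusterLifecycleSketch {

  // hypothetical stand-in for a minicluster handle
  private static AutoCloseable cluster;

  @BeforeAll  // must be the JUnit 5 annotation; JUnit 4's @BeforeClass would not run
  public static void startCluster() throws Exception {
    cluster = startMiniCluster();  // hypothetical helper
  }

  @AfterAll
  public static void stopCluster() throws Exception {
    if (cluster != null) {
      cluster.close();
    }
  }

  private static AutoCloseable startMiniCluster() {
    return () -> { };  // no-op stand-in so the sketch compiles
  }
}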

@steveloughran steveloughran force-pushed the s3/HADOOP-19654-aws-sdk-2.32 branch from 5b9a7e3 to efd34a0 Compare August 25, 2025 21:36
@steveloughran steveloughran marked this pull request as draft August 25, 2025 21:37
@steveloughran
Contributor Author

regressions

everywhere

No logging. Instead we get

SLF4J: Failed to load class "org.slf4j.impl.StaticMDCBinder".
SLF4J: Defaulting to no-operation MDCAdapter implementation.
SLF4J: See http://www.slf4j.org/codes.html#no_static_mdc_binder for further details.

ITestS3AContractAnalyticsStreamVectoredRead failures: stream closed.

More on this once I've looked at it. If it is an SDK issue, it's a major regression, though it may be something needing changes in the AAL library.

s3 express

[ERROR]   ITestTreewalkProblems.testDistCp:319->lambda$testDistCp$3:320 [Exit code of distcp -useiterator -update -delete -direct s3a://stevel--usw2-az1--x-s3/job-00-fork-0005/test/testDistCp/src s3a://stevel--usw2-az1--x-s3/job-00-fork-0005/test/testDistCp/dest]    

assumption: now that the store has lifecycle rules, you don't get prefix listings when there's an in-progress upload.

Fix: change the test, but also add a path capability warning of the inconsistency. This is good.

Operation costs/auditing count an extra HTTP request, so cost tests fail. I suspect it is always calling CreateSession, but without logging I can't be sure.

@steveloughran steveloughran force-pushed the s3/HADOOP-19654-aws-sdk-2.32 branch from efd34a0 to 6a7e6d9 Compare August 26, 2025 20:52
@steveloughran steveloughran force-pushed the s3/HADOOP-19654-aws-sdk-2.32 branch from 6a7e6d9 to cc31e5b Compare September 15, 2025 17:47
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 20s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 xmllint 0m 1s xmllint was not available.
+0 🆗 markdownlint 0m 1s markdownlint was not available.
+0 🆗 shelldocs 0m 1s Shelldocs was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 10 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 10m 31s Maven dependency ordering for branch
+1 💚 mvninstall 24m 17s trunk passed
+1 💚 compile 9m 23s trunk passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 compile 7m 50s trunk passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+1 💚 checkstyle 2m 0s trunk passed
+1 💚 mvnsite 19m 57s trunk passed
+1 💚 javadoc 5m 17s trunk passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 4m 37s trunk passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+0 🆗 spotbugs 0m 11s branch/hadoop-project no spotbugs output file (spotbugsXml.xml)
+1 💚 shadedclient 40m 37s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 40s Maven dependency ordering for patch
+1 💚 mvninstall 23m 52s the patch passed
+1 💚 compile 8m 3s the patch passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javac 8m 3s the patch passed
+1 💚 compile 7m 24s the patch passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+1 💚 javac 7m 24s the patch passed
-1 ❌ blanks 0m 0s /blanks-eol.txt The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 1m 54s /results-checkstyle-root.txt root: The patch generated 9 new + 42 unchanged - 5 fixed = 51 total (was 47)
+1 💚 mvnsite 11m 32s the patch passed
+1 💚 shellcheck 0m 0s No new issues.
+1 💚 javadoc 5m 26s the patch passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 5m 7s the patch passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+0 🆗 spotbugs 0m 15s hadoop-project has no data from spotbugs
+1 💚 shadedclient 39m 23s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 678m 20s /patch-unit-root.txt root in the patch passed.
+1 💚 asflicense 1m 8s The patch does not generate ASF License warnings.
913m 40s
Reason Tests
Failed junit tests hadoop.yarn.server.router.subcluster.fair.TestYarnFederationWithFairScheduler
hadoop.yarn.server.router.webapp.TestFederationWebApp
hadoop.yarn.server.router.webapp.TestRouterWebServicesREST
hadoop.mapreduce.v2.TestUberAM
hadoop.yarn.sls.appmaster.TestAMSimulator
Subsystem Report/Notes
Docker ClientAPI=1.51 ServerAPI=1.51 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/6/artifact/out/Dockerfile
GITHUB PR #7882
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint shellcheck shelldocs
uname Linux 113d355d9ed2 5.15.0-143-generic #153-Ubuntu SMP Fri Jun 13 19:10:45 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / cc31e5b
Default Java Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/6/testReport/
Max. process+thread count 4200 (vs. ulimit of 5500)
modules C: hadoop-project hadoop-tools/hadoop-aws . U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/6/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2 shellcheck=0.7.0
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@ahmarsuhail
Contributor

Thanks @steveloughran, PR looks good overall.

Are the failures in ITestS3AContractAnalyticsStreamVectoredRead intermittent? I've not been able to reproduce them; I'm running the test on this SDK upgrade branch.

// disable create session so there's no need to
// add a role policy for it.
disableCreateSession(conf);
//disableCreateSession(conf);
Contributor

nit: can just cut this instead of commenting it out, since we're skipping these tests if S3 Express is enabled

// close the stream, should throw RemoteFileChangedException
RemoteFileChangedException exception = intercept(RemoteFileChangedException.class, stream::close);
assertS3ExceptionStatusCode(SC_412_PRECONDITION_FAILED, exception);
verifyS3ExceptionStatusCode(SC_412_PRECONDITION_FAILED, exception);
Contributor

do you know what the difference is with the other tests here?

As in, why with S3 Express is it OK to assert that we'll get a 412, whereas the other tests will throw a 200?

Contributor Author

Hey, it's your server code. Go see.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 20s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+0 🆗 shelldocs 0m 0s Shelldocs was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 10 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 10m 32s Maven dependency ordering for branch
+1 💚 mvninstall 23m 50s trunk passed
+1 💚 compile 8m 32s trunk passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 compile 7m 30s trunk passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+1 💚 checkstyle 1m 58s trunk passed
+1 💚 mvnsite 14m 30s trunk passed
+1 💚 javadoc 5m 33s trunk passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 5m 5s trunk passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+0 🆗 spotbugs 0m 15s branch/hadoop-project no spotbugs output file (spotbugsXml.xml)
+1 💚 shadedclient 38m 32s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 36s Maven dependency ordering for patch
+1 💚 mvninstall 23m 26s the patch passed
+1 💚 compile 8m 17s the patch passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javac 8m 17s the patch passed
+1 💚 compile 7m 15s the patch passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+1 💚 javac 7m 15s the patch passed
-1 ❌ blanks 0m 0s /blanks-eol.txt The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 1m 58s /results-checkstyle-root.txt root: The patch generated 9 new + 42 unchanged - 5 fixed = 51 total (was 47)
+1 💚 mvnsite 12m 9s the patch passed
+1 💚 shellcheck 0m 0s No new issues.
+1 💚 javadoc 5m 27s the patch passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 5m 4s the patch passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+0 🆗 spotbugs 0m 15s hadoop-project has no data from spotbugs
+1 💚 shadedclient 38m 17s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 678m 56s /patch-unit-root.txt root in the patch passed.
+1 💚 asflicense 1m 11s The patch does not generate ASF License warnings.
905m 0s
Reason Tests
Failed junit tests hadoop.yarn.server.router.subcluster.fair.TestYarnFederationWithFairScheduler
hadoop.yarn.server.router.webapp.TestFederationWebApp
hadoop.yarn.server.router.webapp.TestRouterWebServicesREST
hadoop.mapreduce.v2.TestUberAM
hadoop.yarn.sls.appmaster.TestAMSimulator
Subsystem Report/Notes
Docker ClientAPI=1.51 ServerAPI=1.51 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/7/artifact/out/Dockerfile
GITHUB PR #7882
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint shellcheck shelldocs
uname Linux 3b890eb50412 5.15.0-143-generic #153-Ubuntu SMP Fri Jun 13 19:10:45 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 3351e41
Default Java Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/7/testReport/
Max. process+thread count 4379 (vs. ulimit of 5500)
modules C: hadoop-project hadoop-tools/hadoop-aws . U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/7/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2 shellcheck=0.7.0
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@apache apache deleted a comment from hadoop-yetus Sep 17, 2025
@apache apache deleted a comment from hadoop-yetus Sep 17, 2025
@steveloughran
Contributor Author

I've attached a log of a test run against an S3 Express bucket where the test ITestAWSStatisticCollection.testSDKMetricsCostOfGetFileStatusOnFile() is failing because the AWS SDK stats report two HTTP requests for the probe. I'd thought it was create-session related but it isn't: it looks like somehow the stream is broken. This happens reliably on every test run.

The relevant stuff is at line 564, where a HEAD request fails because the stream is broken: "end of stream".

2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "HEAD /test/testSDKMetricsCostOfGetFileStatusOnFile HTTP/1.1[\r][\n]"
2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "Host: stevel--usw2-az1--x-s3.s3express-usw2-az1.us-west-2.amazonaws.com[\r][\n]"
2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "amz-sdk-invocation-id: 1804bbcd-04de-cba8-8055-6a09917ca20d[\r][\n]"
2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "amz-sdk-request: attempt=1; max=3[\r][\n]"
2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "Authorization: AWS4-HMAC-SHA256 Credential=AKIA/20250917/us-west-2/s3express/aws4_request, SignedHeaders=amz-sdk-invocation-id;amz-sdk-request;host;referer;x-amz-content-sha256;x-amz-date, Signature=228a46bb1d008468d38afd0da0ed7b4c354ab12631a63bf4283cb23dc02527a3[\r][\n]"
2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "Referer: https://audit.example.org/hadoop/1/op_get_file_status/cf739331-1f2e-42dd-a5d9-f564d6023a23-00000008/?op=op_get_file_status&p1=test/testSDKMetricsCostOfGetFileStatusOnFile&pr=stevel&ps=282e3c5d-c1bd-4859-94b9-82e77ff225d1&id=cf739331-1f2e-42dd-a5d9-f564d6023a23-00000008&t0=1&fs=cf739331-1f2e-42dd-a5d9-f564d6023a23&t1=1&ts=1758131029311[\r][\n]"
2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "User-Agent: Hadoop 3.5.0-SNAPSHOT aws-sdk-java/2.33.8 md/io#sync md/http#Apache ua/2.1 api/S3#2.33.x os/Mac_OS_X#15.6.1 lang/java#17.0.8 md/OpenJDK_64-Bit_Server_VM#17.0.8+7-LTS md/vendor#Amazon.com_Inc. md/en_GB m/F,G hll/cross-region[\r][\n]"
2025-09-17 18:43:49,313 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "x-amz-content-sha256: UNSIGNED-PAYLOAD[\r][\n]"
2025-09-17 18:43:49,314 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "X-Amz-Date: 20250917T174349Z[\r][\n]"
2025-09-17 18:43:49,314 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "Connection: Keep-Alive[\r][\n]"
2025-09-17 18:43:49,314 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-1 >> "[\r][\n]"
2025-09-17 18:43:49,314 [setup] DEBUG http.wire (Wire.java:wire(87)) - http-outgoing-1 << "end of stream"
2025-09-17 18:43:49,314 [setup] DEBUG awssdk.request (LoggerAdapter.java:debug(125)) - Retryable error detected. Will retry in 51ms. Request attempt number 1
software.amazon.awssdk.core.exception.SdkClientException: Unable to execute HTTP request: The target server failed to respond
	at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:130)
	at software.amazon.awssdk.core.exception.SdkClientException.create(SdkClientException.java:47)

The second request always works.

2025-09-17 18:43:49,672 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "HEAD /test/testSDKMetricsCostOfGetFileStatusOnFile HTTP/1.1[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "Host: stevel--usw2-az1--x-s3.s3express-usw2-az1.us-west-2.amazonaws.com[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "amz-sdk-invocation-id: 1804bbcd-04de-cba8-8055-6a09917ca20d[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "amz-sdk-request: attempt=2; max=3[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "Authorization: AWS4-HMAC-SHA256 Credential=AKIA/20250917/us-west-2/s3express/aws4_request, SignedHeaders=amz-sdk-invocation-id;amz-sdk-request;host;referer;x-amz-content-sha256;x-amz-date, Signature=920d981fad319228c969f5df7f5c1a3c7e4d3c0e2f45ff53bba73e6cf47c5871[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "Referer: https://audit.example.org/hadoop/1/op_get_file_status/cf739331-1f2e-42dd-a5d9-f564d6023a23-00000008/?op=op_get_file_status&p1=test/testSDKMetricsCostOfGetFileStatusOnFile&pr=stevel&ps=282e3c5d-c1bd-4859-94b9-82e77ff225d1&id=cf739331-1f2e-42dd-a5d9-f564d6023a23-00000008&t0=1&fs=cf739331-1f2e-42dd-a5d9-f564d6023a23&t1=1&ts=1758131029311[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "User-Agent: Hadoop 3.5.0-SNAPSHOT aws-sdk-java/2.33.8 md/io#sync md/http#Apache ua/2.1 api/S3#2.33.x os/Mac_OS_X#15.6.1 lang/java#17.0.8 md/OpenJDK_64-Bit_Server_VM#17.0.8+7-LTS md/vendor#Amazon.com_Inc. md/en_GB m/F,G hll/cross-region[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "x-amz-content-sha256: UNSIGNED-PAYLOAD[\r][\n]"
2025-09-17 18:43:49,673 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "X-Amz-Date: 20250917T174349Z[\r][\n]"
2025-09-17 18:43:49,674 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "Connection: Keep-Alive[\r][\n]"
2025-09-17 18:43:49,674 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 >> "[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "HTTP/1.1 200 OK[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "server: AmazonS3[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "x-amz-request-id: 01869434dd00019958c6871b05090b3f875a3c90[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "x-amz-id-2: 9GqfbNyMyUs6[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "etag: "6036aaaf62444466bf0a21cc7518f738"[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "accept-ranges: bytes[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "last-modified: Wed, 17 Sep 2025 17:43:49 GMT[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "x-amz-storage-class: EXPRESS_ONEZONE[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "content-type: application/octet-stream[\r][\n]"
2025-09-17 18:43:49,859 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "x-amz-server-side-encryption: AES256[\r][\n]"
2025-09-17 18:43:49,860 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "content-length: 0[\r][\n]"
2025-09-17 18:43:49,860 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "x-amz-expiration: NotImplemented[\r][\n]"
2025-09-17 18:43:49,860 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "date: Wed, 17 Sep 2025 17:43:48 GMT[\r][\n]"
2025-09-17 18:43:49,860 [setup] DEBUG http.wire (Wire.java:wire(73)) - http-outgoing-2 << "[\r][\n]"
2025-09-17 18:43:49,860 [setup] DEBUG awssdk.request (LoggerAdapter.java:debug(105)) - Received successful response: 200, Request ID: 

Either the request is being rejected (why?) or the connection has gone stale. But why should it happen at exactly the same place on every single test run?

org.apache.hadoop.fs.s3a.statistics.ITestAWSStatisticCollection-output.txt

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 34s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+0 🆗 shelldocs 0m 1s Shelldocs was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 11 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 12m 12s Maven dependency ordering for branch
+1 💚 mvninstall 40m 29s trunk passed
+1 💚 compile 15m 48s trunk passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 compile 14m 0s trunk passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+1 💚 checkstyle 4m 18s trunk passed
+1 💚 mvnsite 21m 27s trunk passed
+1 💚 javadoc 9m 42s trunk passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 7m 58s trunk passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+0 🆗 spotbugs 0m 21s branch/hadoop-project no spotbugs output file (spotbugsXml.xml)
+1 💚 shadedclient 66m 25s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 1m 3s Maven dependency ordering for patch
+1 💚 mvninstall 40m 59s the patch passed
+1 💚 compile 15m 18s the patch passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javac 15m 18s the patch passed
+1 💚 compile 13m 50s the patch passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+1 💚 javac 13m 50s the patch passed
-1 ❌ blanks 0m 1s /blanks-eol.txt The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 4m 10s /results-checkstyle-root.txt root: The patch generated 7 new + 42 unchanged - 5 fixed = 49 total (was 47)
+1 💚 mvnsite 19m 25s the patch passed
+1 💚 shellcheck 0m 0s No new issues.
+1 💚 javadoc 9m 38s the patch passed with JDK Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 7m 50s the patch passed with JDK Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
+0 🆗 spotbugs 0m 21s hadoop-project has no data from spotbugs
+1 💚 shadedclient 66m 26s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 450m 14s /patch-unit-root.txt root in the patch failed.
+1 💚 asflicense 1m 21s The patch does not generate ASF License warnings.
832m 28s
Subsystem Report/Notes
Docker ClientAPI=1.51 ServerAPI=1.51 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/8/artifact/out/Dockerfile
GITHUB PR #7882
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint shellcheck shelldocs
uname Linux 40fa101aa5ab 5.15.0-143-generic #153-Ubuntu SMP Fri Jun 13 19:10:45 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 661dc6e
Default Java Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.27+6-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_452-8u452-gaus1-0ubuntu120.04-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/8/testReport/
Max. process+thread count 3559 (vs. ulimit of 5500)
modules C: hadoop-project hadoop-tools/hadoop-aws . U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/8/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2 shellcheck=0.7.0
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@ahmarsuhail
Contributor

@steveloughran I discovered this completely by accident, but it's something to do with the checksumming code.

If you comment out these lines:

   //  builder.addPlugin(LegacyMd5Plugin.create());

    // do not do request checksums as this causes third-party store problems.
  //  builder.requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED);

    // response checksum validation. Slow, even with CRC32 checksums.
//    if (parameters.isChecksumValidationEnabled()) {
//      builder.responseChecksumValidation(ResponseChecksumValidation.WHEN_SUPPORTED);
//    }

the test will pass. Could be something to do with S3 Express not supporting MD5; will look into it.

@ahmarsuhail
Contributor

Specifically, it's this line: builder.requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED); that causes this.

Comment that out, or change it to builder.requestChecksumCalculation(RequestChecksumCalculation.WHEN_SUPPORTED), and it passes.

My guess is it's something to do with S3 Express not supporting MD5: for operations where RequestChecksumCalculation.WHEN_REQUIRED applies, the SDK calculates the MD5 and then S3 Express rejects it.

Have asked the SDK team.

@steveloughran
Contributor Author

OK, so maybe if for S3 Express stores we don't do the legacy MD5 plugin stuff, all is good?

  1. Does imply the far end is breaking the connection when it is unhappy; at least our unit tests found this stuff before the cost of every HEAD doubled.
  2. Maybe we should make the choice of checksums an enum with MD5 the default, so it is something that can be turned off/changed in future; a rough sketch follows this list.
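
A rough sketch of what such an enum might look like; names here are illustrative, not the eventual S3A API:

public enum RequestChecksumPolicy {
  MD5,      // legacy Content-MD5 header: the default, preserving current behaviour
  CRC32,    // a CRC-based request checksum
  NONE;     // no request checksum at all

  // hypothetical config key and parser; not an existing S3A option
  public static final String KEY = "fs.s3a.request.checksum.policy";

  public static RequestChecksumPolicy fromConfig(String value) {
    return value == null || value.isEmpty()
        ? MD5
        : valueOf(value.trim().toUpperCase(java.util.Locale.ROOT));
  }
}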

While on the topic of S3 Express: is it now the case that, because there are lifecycle rules for cleanup, LIST calls don't return prefixes of paths with incomplete uploads? If so I will need to change production code and the test, with a separate JIRA for that for completeness.

@ahmarsuhail
Contributor

for S3 Express stores we don't do the legacy MD5 plugin stuff, all is good

@steveloughran confirming with the SDK team; since the MD5 plugin is supposed to restore previous behaviour, the server rejecting the first request seems wrong. Let's see what they have to say.

LIST calls don't return prefixes of paths with incomplete uploads

Will check with the S3 Express team on this.

@steveloughran steveloughran force-pushed the s3/HADOOP-19654-aws-sdk-2.32 branch from 661dc6e to aa8e814 Compare September 19, 2025 13:23
@steveloughran
Contributor Author

Thanks. I don't see it in tests against S3 with the 2.29.52 release, so something is changing in the requests made with the new SDK + MD5 stuff.

@ahmarsuhail
Contributor

@steveloughran not able to narrow this error down just yet, it looks like it's a combination of S3A's configuration of the S3 client + these new Md5 changes.

  @Test
  public void testHead() throws Throwable {
   // S3Client s3Client = getFileSystem().getS3AInternals().getAmazonS3Client("test instance");

    S3Client s3Client = S3Client.builder().region(Region.US_EAST_1)
            .addPlugin(LegacyMd5Plugin.create())
            .requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED)
            .responseChecksumValidation(ResponseChecksumValidation.WHEN_SUPPORTED)
            .overrideConfiguration(o -> o.retryStrategy(b -> b.maxAttempts(1)))
            .build();

    s3Client.headObject(HeadObjectRequest.builder().bucket("<>")
            .key("<>").build());
  }

I see the failure with the S3A-created client, and don't see it when I use a newly created client. So it's not just because of requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED).

Looking into it some more.

The S3 Express team said there have been no changes in LIST behaviour.

@ahmarsuhail
Contributor

Able to reproduce the issue outside of S3A. Basically, I did what happens when you run a test in S3A:

  • a probe for the test/ directory, then create the test/ directory, then do the headObject() call.

The head fails, but if you comment out requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED) it works again.

No idea what's going on, but I have shared this local reproduction with the SDK team. It also rules out that it's something in the S3A code.

public class TestClass {

    S3Client s3Client;

    public TestClass() {
        this.s3Client = S3Client.builder().region(Region.US_EAST_1)
                .addPlugin(LegacyMd5Plugin.create())
                .requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED)
                .responseChecksumValidation(ResponseChecksumValidation.WHEN_SUPPORTED)
                .overrideConfiguration(o -> o.retryStrategy(b -> b.maxAttempts(1)))
                .build();
    }


    public void testS3Express(String bucket, String key) {
        s3Client.listObjectsV2(ListObjectsV2Request.builder()
                .bucket("<>")
                .maxKeys(2)
                .prefix("test/")
                .build());


        try {
            s3Client.headObject(HeadObjectRequest.builder().bucket("<>")
                    .key("test")
                    .build());
        } catch (Exception e) {
            System.out.println("Exception thrown: " + e.getMessage());
        }

        s3Client.putObject(PutObjectRequest
                .builder()
                .bucket("<>")
                .key("test/").build(), RequestBody.empty());

        s3Client.headObject(HeadObjectRequest.builder().bucket("<>")
                .key("<>")
                .build());
    }
}

@steveloughran
Contributor Author

Good explanation. Though I would have expected a bit broader test coverage of your own stores; something to look for on the next library update.

Can I also get improvements in error translation too? We need the error string including the request IDs. Relying on the stack entry below to print it isn't enough, as deep exception nesting (Hive, Spark) can lose that.

@steveloughran
Contributor Author

One more thing here: make sure you can handle null as an etag in the cache.
Not all stores have it, which is why it can be turned off for classic input stream version checking.

You won't be able to detect overwrites, but we can just document having a short TTL here.
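
A minimal sketch of the kind of guard I mean, assuming a hypothetical metadata cache keyed on path; this is not the AAL API:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public final class EtagGuardedCache {
  private final Map<String, String> etagsByPath = new ConcurrentHashMap<>();

  // skip caching entirely when the store returns no etag: with no etag
  // there is no way to detect overwrites, so fall through to a direct read
  public void maybeCache(String path, String etag) {
    if (etag == null || etag.isEmpty()) {
      return;
    }
    etagsByPath.put(path, etag);
  }

  // a cached entry may only be reused if the etags still match
  public boolean isCurrent(String path, String etag) {
    return etag != null && etag.equals(etagsByPath.get(path));
  }
}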

@ahmarsuhail
Contributor

@steveloughran updated exception handling: awslabs/analytics-accelerator-s3#361; the next release will include the request IDs in the message, e.g.:

java.io.IOException: Server error accessing s3://xxx, request failed with: At least one of the pre-conditions you specified did not hold (Service: S3, Status Code: 412, Request ID: xxxx, Extended Request ID: xxxx)

Handling null as an eTag will require more work; the only way to do that reliably is to disable the caching fully and provide a pass-through stream. Do you know which stores don't support eTags?

This is just the library declaration change.
* create legacy MD5 headers
* downgrade request checksum calculation to "when required"
* set the (existing) checksum validation flag the same way so the
  builder library is happy
* add "none" as a valid checksum algorithm
* document that and the existing "unknown_to_sdk_version" option, with troubleshooting
* ITestS3AChecksum modified to handle these checksum changes.
… of multiparts is 200 + error

* Recognise and wrap with RemoteFileChangedException
* Fix up assertions in test.
Add relevant statements for s3 access and skip all tests which
expect partial access to paths on a bucket.
…3 express

Revert test to original assert (no special treatment of s3 express),
and log "Initiating GET request" for ease of log scanning.

Add to the log4j.properties file the settings needed to turn on wire logging,
commented out by default.
…MD5 header

fs.s3a.checksum.calculation.enabled (default: true)
fs.s3a.md5.header.enabled (default: false)
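
For illustration, a sketch of flipping these options from their documented defaults via the Hadoop Configuration API; the option names and defaults come from the commit message above:

import org.apache.hadoop.conf.Configuration;

public class ChecksumOptionsSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // disable request checksum calculation (default: true)
    conf.setBoolean("fs.s3a.checksum.calculation.enabled", false);
    // enable the legacy Content-MD5 header (default: false)
    conf.setBoolean("fs.s3a.md5.header.enabled", true);
  }
}
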
The problem here is not the signing; it's that the test couldn't cope
with path-style access, as it used the hostname of the HTTP request
as the bucket name.

Fixed by adding a static field to note when path access is in use
and to map the name to "bucket" in such a situation.
Third-party stores may remove an ongoing MPU after it is committed,
so a retry will fail.

On AWS S3 it is simply ignored.

New option fs.s3a.mpu.commit.consumes.upload.id
can be set to true to change assertions in tests which would otherwise fail.
remove mention of CRC32 as checksum algorithm; don't assert that it is
null if the requested version is unknown.
New command "etag".
Retrieves and prints etags.

Updated release version

HADOOP-19654. Update third party documentation, especially on null etags

Input stream MUST be set to classic unless/until support there changes.
@steveloughran steveloughran force-pushed the s3/HADOOP-19654-aws-sdk-2.32 branch from 138bbee to 2e47690 Compare October 21, 2025 13:57
This includes one production-side change: FS automatically declares
that magic committers are disabled if MPU is disabled.
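
For illustration, a sketch of probing that declaration through the path capabilities API; the capability name fs.s3a.capability.magic.committer is the existing S3A constant, while the bucket name and the rest of the caller are hypothetical:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MagicCommitterProbe {
  public static void main(String[] args) throws Exception {
    // with MPU disabled, the filesystem should now report false here
    Path path = new Path("s3a://example-bucket/");
    FileSystem fs = path.getFileSystem(new Configuration());
    boolean magic = fs.hasPathCapability(path, "fs.s3a.capability.magic.committer");
    System.out.println("magic committer available: " + magic);
  }
}
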
Final(?) wrap-up of tests which fail on a store without multipart upload
or bulk delete.
@steveloughran
Contributor Author

The latest iteration works with third-party stores without MPU (so no magic committer or use of memory for upload buffering) or bulk delete.

Tested against Google GCS; the only failures were underfull buffers, which can be ignored.

[ERROR] Failures: 
[ERROR]   ITestS3AContractUnbuffer>AbstractContractUnbufferTest.testMultipleUnbuffers:108->AbstractContractUnbufferTest.validateFullFileContents:141->AbstractContractUnbufferTest.validateFileContents:148 failed to read expected number of bytes from stream. This may be transient ==> expected: <1024> but was: <533>                                                          
[ERROR]   ITestS3AContractUnbuffer>AbstractContractUnbufferTest.testUnbufferAfterRead:61->AbstractContractUnbufferTest.validateFullFileContents:141->AbstractContractUnbufferTest.validateFileContents:148 failed to read expected number of bytes from stream. This may be transient ==> expected: <1024> but was: <533>                                                           
[ERROR]   ITestS3AContractUnbuffer>AbstractContractUnbufferTest.testUnbufferBeforeRead:71->AbstractContractUnbufferTest.validateFullFileContents:141->AbstractContractUnbufferTest.validateFileContents:148 failed to read expected number of bytes from stream. This may be transient ==> expected: <1024> but was: <539>                                                          
[ERROR]   ITestS3AContractUnbuffer>AbstractContractUnbufferTest.testUnbufferOnClosedFile:91->AbstractContractUnbufferTest.validateFullFileContents:141->AbstractContractUnbufferTest.validateFileContents:148 failed to read expected number of bytes from stream. This may be transient ==> expected: <1024> but was: <539>                                                        
[INFO] 
[ERROR] Tests run: 1253, Failures: 4, Errors: 0, Skipped: 450
[INFO] 

@steveloughran
Contributor Author

@ahmarsuhail I think Apache Ozone is the one.

I just added an etag command to cloudstore to print this stuff out and experimented with various stores: https://github.com/steveloughran/cloudstore/blob/main/src/main/site/etag.md

Dell ECS and Google both supply etags. We don't retrieve them for directory markers anyway, which isn't an issue.

  • I've updated the third-party docs to cover etags in more detail, and say "switch to classic and disable version checking"
  • I do think the cache needs to handle null/empty string tags, somehow. Certainly by not caching metadata.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 15m 2s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+0 🆗 shelldocs 0m 0s Shelldocs was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 38 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 10m 43s Maven dependency ordering for branch
+1 💚 mvninstall 29m 3s trunk passed
+1 💚 compile 15m 53s trunk passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 compile 15m 41s trunk passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+1 💚 checkstyle 3m 15s trunk passed
-1 ❌ mvnsite 10m 51s /branch-mvnsite-root.txt root in trunk failed.
+1 💚 javadoc 9m 36s trunk passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 javadoc 8m 31s trunk passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+0 🆗 spotbugs 0m 28s branch/hadoop-project no spotbugs output file (spotbugsXml.xml)
-1 ❌ spotbugs 1m 18s /branch-spotbugs-hadoop-tools_hadoop-aws-warnings.html hadoop-tools/hadoop-aws in trunk has 188 extant spotbugs warnings.
-1 ❌ spotbugs 36m 27s /branch-spotbugs-root-warnings.html root in trunk has 9241 extant spotbugs warnings.
+1 💚 shadedclient 61m 43s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 48s Maven dependency ordering for patch
+1 💚 mvninstall 27m 26s the patch passed
+1 💚 compile 14m 53s the patch passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 javac 14m 53s the patch passed
+1 💚 compile 15m 24s the patch passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+1 💚 javac 15m 24s the patch passed
-1 ❌ blanks 0m 0s /blanks-eol.txt The patch has 24 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 3m 15s /results-checkstyle-root.txt root: The patch generated 14 new + 79 unchanged - 6 fixed = 93 total (was 85)
-1 ❌ mvnsite 7m 2s /patch-mvnsite-root.txt root in the patch failed.
+1 💚 shellcheck 0m 0s No new issues.
-1 ❌ javadoc 9m 37s /results-javadoc-javadoc-root-jdkUbuntu-21.0.7+6-Ubuntu-0ubuntu120.04.txt root-jdkUbuntu-21.0.7+6-Ubuntu-0ubuntu120.04 with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04 generated 4 new + 46184 unchanged - 0 fixed = 46188 total (was 46184)
-1 ❌ javadoc 8m 41s /results-javadoc-javadoc-root-jdkUbuntu-17.0.15+6-Ubuntu-0ubuntu120.04.txt root-jdkUbuntu-17.0.15+6-Ubuntu-0ubuntu120.04 with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04 generated 4 new + 43015 unchanged - 0 fixed = 43019 total (was 43015)
+0 🆗 spotbugs 0m 22s hadoop-project has no data from spotbugs
+1 💚 shadedclient 62m 29s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 741m 49s /patch-unit-root.txt root in the patch passed.
+1 💚 asflicense 1m 44s The patch does not generate ASF License warnings.
1085m 5s
Reason Tests
Failed junit tests hadoop.yarn.sls.appmaster.TestAMSimulator
hadoop.security.ssl.TestDelegatingSSLSocketFactory
hadoop.yarn.service.TestYarnNativeServices
hadoop.yarn.server.router.webapp.TestFederationWebApp
hadoop.yarn.server.router.subcluster.fair.TestYarnFederationWithFairScheduler
hadoop.yarn.server.router.webapp.TestRouterWebServicesREST
hadoop.yarn.server.nodemanager.containermanager.logaggregation.TestLogAggregationService
Subsystem Report/Notes
Docker ClientAPI=1.51 ServerAPI=1.51 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/14/artifact/out/Dockerfile
GITHUB PR #7882
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint shellcheck shelldocs
uname Linux 8fbc6faf5962 5.15.0-156-generic #166-Ubuntu SMP Sat Aug 9 00:02:46 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 1cd8a28
Default Java Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
Multi-JDK versions /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/14/testReport/
Max. process+thread count 3717 (vs. ulimit of 5500)
modules C: hadoop-project hadoop-tools/hadoop-aws . U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/14/console
versions git=2.25.1 maven=3.9.11 spotbugs=4.9.7 shellcheck=0.7.0
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@steveloughran steveloughran marked this pull request as ready for review October 22, 2025 13:29
Address style, trailing space and javadoc.

Not addressed: the new spotbugs errors. These are not from this patch.
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 8m 27s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+0 🆗 shelldocs 0m 0s Shelldocs was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 39 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 7m 18s Maven dependency ordering for branch
+1 💚 mvninstall 15m 15s trunk passed
+1 💚 compile 8m 18s trunk passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 compile 8m 18s trunk passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+1 💚 checkstyle 1m 25s trunk passed
-1 ❌ mvnsite 6m 10s /branch-mvnsite-root.txt root in trunk failed.
+1 💚 javadoc 5m 31s trunk passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 javadoc 4m 43s trunk passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+0 🆗 spotbugs 0m 15s branch/hadoop-project no spotbugs output file (spotbugsXml.xml)
-1 ❌ spotbugs 0m 41s /branch-spotbugs-hadoop-tools_hadoop-aws-warnings.html hadoop-tools/hadoop-aws in trunk has 188 extant spotbugs warnings.
-1 ❌ spotbugs 18m 42s /branch-spotbugs-root-warnings.html root in trunk has 9241 extant spotbugs warnings.
+1 💚 shadedclient 32m 22s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 33s Maven dependency ordering for patch
+1 💚 mvninstall 15m 11s the patch passed
+1 💚 compile 7m 52s the patch passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 javac 7m 52s the patch passed
+1 💚 compile 8m 17s the patch passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+1 💚 javac 8m 17s the patch passed
-1 ❌ blanks 0m 0s /blanks-eol.txt The patch has 24 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 1m 33s /results-checkstyle-root.txt root: The patch generated 14 new + 79 unchanged - 6 fixed = 93 total (was 85)
-1 ❌ mvnsite 3m 43s /patch-mvnsite-root.txt root in the patch failed.
+1 💚 shellcheck 0m 0s No new issues.
-1 ❌ javadoc 5m 23s /results-javadoc-javadoc-root-jdkUbuntu-21.0.7+6-Ubuntu-0ubuntu120.04.txt root-jdkUbuntu-21.0.7+6-Ubuntu-0ubuntu120.04 with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04 generated 4 new + 46184 unchanged - 0 fixed = 46188 total (was 46184)
-1 ❌ javadoc 4m 38s /results-javadoc-javadoc-root-jdkUbuntu-17.0.15+6-Ubuntu-0ubuntu120.04.txt root-jdkUbuntu-17.0.15+6-Ubuntu-0ubuntu120.04 with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04 generated 4 new + 43015 unchanged - 0 fixed = 43019 total (was 43015)
+0 🆗 spotbugs 0m 12s hadoop-project has no data from spotbugs
+1 💚 shadedclient 32m 15s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 594m 47s /patch-unit-root.txt root in the patch passed.
+1 💚 asflicense 0m 50s The patch does not generate ASF License warnings.
779m 27s
Reason Tests
Failed junit tests hadoop.yarn.server.router.subcluster.fair.TestYarnFederationWithFairScheduler
hadoop.yarn.server.router.webapp.TestFederationWebApp
hadoop.yarn.server.router.webapp.TestRouterWebServicesREST
hadoop.yarn.server.resourcemanager.reservation.TestCapacityOverTimePolicy
hadoop.hdfs.server.balancer.TestBalancerWithHANameNodes
hadoop.hdfs.server.federation.router.async.TestRouterAsyncRpcClient
hadoop.security.ssl.TestDelegatingSSLSocketFactory
hadoop.yarn.sls.appmaster.TestAMSimulator
Subsystem Report/Notes
Docker ClientAPI=1.51 ServerAPI=1.51 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/15/artifact/out/Dockerfile
GITHUB PR #7882
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint shellcheck shelldocs
uname Linux c90f744f2220 5.15.0-156-generic #166-Ubuntu SMP Sat Aug 9 00:02:46 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 96c3f38
Default Java Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
Multi-JDK versions /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/15/testReport/
Max. process+thread count 4624 (vs. ulimit of 5500)
modules C: hadoop-project hadoop-tools/hadoop-aws . U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/15/console
versions git=2.25.1 maven=3.9.11 spotbugs=4.9.7 shellcheck=0.7.0
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@steveloughran
Contributor Author

Fun test run today, against S3 London. Most of the multipart upload/commit tests were failing with "missing part", whether run from the CLI or the IDE. Testing with S3 Express was happy. (-Dparallel-tests -DtestsThreadCount=8 -Panalytics -Dscale)

[ERROR]   ITestS3AHugeMagicCommits.test_030_postCreationAssertions:192 » AWSBadRequest Completing multipart upload on job-00/test/tests3ascale/ITestS3AHugeMagicCommits/commit/commit.bin: software.amazon.awssdk.services.s3.model.S3Exception: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: JAEYPCZ4P3JYGMTD, Extended Request ID: O/135mw9Xd2aEuFUh0ICWYc8DLXSpBUWaVGkEgEFGf0xO8o+XlZXY0hI+mvennOGt+C/UI7mNrQ=) (SDK Attempt Count: 1):InvalidPart: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: JAEYPCZ4P3JYGMTD, Extended Request ID: O/135mw9Xd2aEuFUh0ICWYc8DLXSpBUWaVGkEgEFGf0xO8o+XlZXY0hI+mvennOGt+C/UI7mNrQ=) (SDK Attempt Count: 1)                                                                                                                                                                                   
[ERROR]   ITestS3AHugeMagicCommits>AbstractSTestS3AHugeFiles.test_045_vectoredIOHugeFile:538->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/ITestS3AHugeMagicCommits/commit/commit.bin in s3a://stevel-london/job-00/test/tests3ascale/ITestS3AHugeMagicCommits/commit                                                                                                                                                      
[ERROR]   ITestS3AHugeFilesArrayBlocks>AbstractSTestS3AHugeFiles.test_010_CreateHugeFile:276 » AWSBadRequest Completing multipart upload on job-00/test/tests3ascale/array/src/hugefile: software.amazon.awssdk.services.s3.model.S3Exception: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: 1NNBCSX4NCDN7G9X, Extended Request ID: 8vMmeyt1GfjGrf3UL9AN8vlwWSn9860f1gdeIBC3drmcjeQwC6wOPinMD8MSO6ggGw9ywwdcXroGTdVSFLYq0S0VdM/5bYfanDXJ43Eb4QU=) (SDK Attempt Count: 1):InvalidPart: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: 1NNBCSX4NCDN7G9X, Extended Request ID: 8vMmeyt1GfjGrf3UL9AN8vlwWSn9860f1gdeIBC3drmcjeQwC6wOPinMD8MSO6ggGw9ywwdcXroGTdVSFLYq0S0VdM/5bYfanDXJ43Eb4QU=) (SDK Attempt Count: 1)                                                                                                                     
[ERROR]   ITestS3AHugeFilesArrayBlocks>AbstractSTestS3AHugeFiles.test_030_postCreationAssertions:433 » FileNotFound Huge file: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src                                                                                                                  
[ERROR]   ITestS3AHugeFilesArrayBlocks>AbstractSTestS3AHugeFiles.test_040_PositionedReadHugeFile:478->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src  
[ERROR]   ITestS3AHugeFilesArrayBlocks>AbstractSTestS3AHugeFiles.test_045_vectoredIOHugeFile:538->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src      
[ERROR]   ITestS3AHugeFilesArrayBlocks>AbstractSTestS3AHugeFiles.test_050_readHugeFile:624->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src            
[ERROR]   ITestS3AHugeFilesArrayBlocks>AbstractSTestS3AHugeFiles.test_100_renameHugeFile:679->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src          
[ERROR]   ITestS3AHugeFilesByteBufferBlocks>AbstractSTestS3AHugeFiles.test_010_CreateHugeFile:276 » AWSBadRequest Completing multipart upload on job-00/test/tests3ascale/bytebuffer/src/hugefile: software.amazon.awssdk.services.s3.model.S3Exception: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: K0K75V8AH7SVBHS3, Extended Request ID: kDosbp+Z2PLZn9tVtRF9QfOqh1MgLbIKYaYFn2JeIptXlBV4v1a/wFukoXnaF7fCp6zx3vR8feE0fScUJEw+WhNW9lzu9dBxssOA62UA2kg=) (SDK Attempt Count: 1):InvalidPart: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: K0K75V8AH7SVBHS3, Extended Request ID: kDosbp+Z2PLZn9tVtRF9QfOqh1MgLbIKYaYFn2JeIptXlBV4v1a/wFukoXnaF7fCp6zx3vR8feE0fScUJEw+WhNW9lzu9dBxssOA62UA2kg=) (SDK Attempt Count: 1)                                                                                                           
[ERROR]   ITestS3AHugeFilesByteBufferBlocks>AbstractSTestS3AHugeFiles.test_030_postCreationAssertions:433 » FileNotFound Huge file: not found s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src                                                                                                   
[ERROR]   ITestS3AHugeFilesByteBufferBlocks>AbstractSTestS3AHugeFiles.test_040_PositionedReadHugeFile:478->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src                                                                                                                                                                             
[ERROR]   ITestS3AHugeFilesByteBufferBlocks>AbstractSTestS3AHugeFiles.test_045_vectoredIOHugeFile:538->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src                                                                                                                                                                                 
[ERROR]   ITestS3AHugeFilesByteBufferBlocks>AbstractSTestS3AHugeFiles.test_050_readHugeFile:624->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src                                                                                                                                                                                       
[ERROR]   ITestS3AHugeFilesByteBufferBlocks>AbstractSTestS3AHugeFiles.test_100_renameHugeFile:679->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/bytebuffer/src                                                                                                                                                                                     
[ERROR]   ITestS3AHugeFilesDiskBlocks>AbstractSTestS3AHugeFiles.test_010_CreateHugeFile:276 » AWSBadRequest Completing multipart upload on job-00/test/tests3ascale/disk/src/hugefile: software.amazon.awssdk.services.s3.model.S3Exception: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: 73T4YAYRWE63WAW5, Extended Request ID: 6ucEY2heh2NsxE8dBrlZp9AE4Tb+hbvnyxea1/yp5H85BEvkQdYsfNlRH5XZM1g4hHPDSoGMVtM=) (SDK Attempt Count: 1):InvalidPart: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: 73T4YAYRWE63WAW5, Extended Request ID: 6ucEY2heh2NsxE8dBrlZp9AE4Tb+hbvnyxea1/yp5H85BEvkQdYsfNlRH5XZM1g4hHPDSoGMVtM=) (SDK Attempt Count: 1)                                                                                                                                                                                       
[ERROR]   ITestS3AHugeFilesDiskBlocks>AbstractSTestS3AHugeFiles.test_030_postCreationAssertions:433 » FileNotFound Huge file: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src                                                                                                                     
[ERROR]   ITestS3AHugeFilesDiskBlocks>AbstractSTestS3AHugeFiles.test_040_PositionedReadHugeFile:478->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src     
[ERROR]   ITestS3AHugeFilesDiskBlocks>AbstractSTestS3AHugeFiles.test_045_vectoredIOHugeFile:538->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src         
[ERROR]   ITestS3AHugeFilesDiskBlocks>AbstractSTestS3AHugeFiles.test_050_readHugeFile:624->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src               
[ERROR]   ITestS3AHugeFilesDiskBlocks>AbstractSTestS3AHugeFiles.test_100_renameHugeFile:679->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src             
[ERROR]   ITestS3AHugeFilesSSECDiskBlocks>AbstractSTestS3AHugeFiles.test_010_CreateHugeFile:276 » AWSBadRequest Completing multipart upload on job-00/test/tests3ascale/disk/src/hugefile: software.amazon.awssdk.services.s3.model.S3Exception: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: ZSY181YB49GQFR83, Extended Request ID: FrPEfsXO3Gbhxi3m4ZmyYSiyfscQ1QSm/1lKjRPLHEbLWH5vtGked+fHvZl281Dm6u013/5VP6pj42h4XISftk7p9uEIDGw31E7Ymcoviq4=) (SDK Attempt Count: 1):InvalidPart: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: ZSY181YB49GQFR83, Extended Request ID: FrPEfsXO3Gbhxi3m4ZmyYSiyfscQ1QSm/1lKjRPLHEbLWH5vtGked+fHvZl281Dm6u013/5VP6pj42h4XISftk7p9uEIDGw31E7Ymcoviq4=) (SDK Attempt Count: 1)                                                                                                                   
[ERROR]   ITestS3AHugeFilesSSECDiskBlocks>AbstractSTestS3AHugeFiles.test_030_postCreationAssertions:433 » FileNotFound Huge file: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src                                                                                                                 
[ERROR]   ITestS3AHugeFilesSSECDiskBlocks>AbstractSTestS3AHugeFiles.test_040_PositionedReadHugeFile:478->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src 
[ERROR]   ITestS3AHugeFilesSSECDiskBlocks>AbstractSTestS3AHugeFiles.test_045_vectoredIOHugeFile:538->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src     
[ERROR]   ITestS3AHugeFilesSSECDiskBlocks>AbstractSTestS3AHugeFiles.test_050_readHugeFile:624->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src           
[ERROR]   ITestS3AHugeFilesSSECDiskBlocks>AbstractSTestS3AHugeFiles.test_100_renameHugeFile:679->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/disk/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/disk/src         
[ERROR]   ITestS3AHugeFilesStorageClass.test_010_CreateHugeFile:74->AbstractSTestS3AHugeFiles.test_010_CreateHugeFile:276 » AWSBadRequest Completing multipart upload on job-00/test/tests3ascale/array/src/hugefile: software.amazon.awssdk.services.s3.model.S3Exception: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: APYCQNP1GY02DGDE, Extended Request ID: lE0hQJ67sSwCYSMmO7tDEAvEIOCcpwIbLdfqqrNTpWT0bHIaacaIEzZusajj79rnFQlWudxsMHBIUXdS9ELiKR0T923lcULZy4Essx1LoTs=) (SDK Attempt Count: 1):InvalidPart: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: APYCQNP1GY02DGDE, Extended Request ID: lE0hQJ67sSwCYSMmO7tDEAvEIOCcpwIbLdfqqrNTpWT0bHIaacaIEzZusajj79rnFQlWudxsMHBIUXdS9ELiKR0T923lcULZy4Essx1LoTs=) (SDK Attempt Count: 1)                                                                                        
[ERROR]   ITestS3AHugeFilesStorageClass.test_030_postCreationAssertions:81->AbstractSTestS3AHugeFiles.test_030_postCreationAssertions:433 » FileNotFound Huge file: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src                                                                             
[ERROR]   ITestS3AHugeFilesStorageClass>AbstractSTestS3AHugeFiles.test_045_vectoredIOHugeFile:538->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src     
[ERROR]   ITestS3AHugeFilesStorageClass.test_100_renameHugeFile:108->AbstractSTestS3AHugeFiles.assumeHugeFileExists:404->AbstractSTestS3AHugeFiles.assumeFileExists:414 » FileNotFound huge file not created: not found s3a://stevel-london/job-00/test/tests3ascale/array/src/hugefile in s3a://stevel-london/job-00/test/tests3ascale/array/src                                   
[INFO] 
[ERROR] Tests run: 124, Failures: 1, Errors: 30, Skipped: 13
[INFO] 

This has to be some transient issue with my S3 London bucket, as if in-progress upload parts were not being retained. I've never seen this before; the lifecycle expiry for incomplete uploads on that bucket is set to 24h.
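For anyone checking their own bucket, the abort-incomplete-upload expiry can be read back through the SDK. A minimal sketch, assuming a bare AWS SDK v2 S3Client with default credentials; the bucket name is illustrative, not a recommendation:

```java
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetBucketLifecycleConfigurationRequest;

public final class LifecycleCheck {
  public static void main(String[] args) {
    String bucket = "stevel-london";   // illustrative bucket name
    try (S3Client s3 = S3Client.create()) {
      s3.getBucketLifecycleConfiguration(
              GetBucketLifecycleConfigurationRequest.builder().bucket(bucket).build())
          .rules().stream()
          // only report rules which abort incomplete multipart uploads
          .filter(r -> r.abortIncompleteMultipartUpload() != null)
          .forEach(r -> System.out.printf(
              "rule %s aborts incomplete uploads after %d day(s)%n",
              r.id(),
              r.abortIncompleteMultipartUpload().daysAfterInitiation()));
    }
  }
}
```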

When these uploads fail, we do leave incomplete uploads in progress:

Listing uploads under path ""
job-00-fork-0005/test/testCommitOperations 141OKG11JHhWF1GOnunHUd9ZzBJ8cUG9z0LsW_4wUGgCXCvDMQM3kRi5IOCUV8FdCHtg_w8SlipfubRtzCQoT5yEpOLv.cWOiOwjEaBzUjnuJORppfXuKy1piHpLnu98
job-00-fork-0005/test/testIfMatchTwoMultipartUploadsRaceConditionOneClosesFirst yBJpm3zh4DjNQIDtyWgEmWVCk5sehVz5Vzn3QGr_tQT2iOonRp5ErXsQy24yIvnzRxBCZqVapy5VepLeu2udZBT5EXLnKRA3bchvzjtKDlipywSzYlL2N_xLUDCT359I
job-00-fork-0005/test/testIfNoneMatchTwoConcurrentMultipartUploads AnspJPHUoPJqg61t28OvLfAogi6G9ocyx1Dm6XY2C.a_H_onklM0Nr0LIXaPiYlQjZIiH0fTsQ1e2KhEjS9pGxvSKOXq_4YibiGZmFC6rBolmfACMqIRpoeaqYDgzYW4
job-00-fork-0005/test/testMagicWriteRecovery/file.txt KpvoTuVh85Wzm9XuU1EuxbATjb6D.Zv8vEj3z2S6AvJBHCBssy4iphxNhTkLDs7ceEwak4IPtdXED1vRf3geXT7MRMJn8d6feafvHVEgzbD31odpzTLmOaPrU_mFQXGV
job-00-fork-0005/test/testMagicWriteRecovery/file.txt CnrbWU3pzgEGvjRuDuaP43Xcv1eBF5aLknqYaZA1vwO3b1QUIu9QJSiZjuLMYKT9GKw1QXwqoKo4iuxTY1a18bARx4XMEiL98kZBv0TPMaAfXE.70Olh8Q2kTyDlUCSh
job-00-fork-0005/test/testMagicWriteRecovery/file.txt dEVGPBRsuOAzL5pGA02ve9qJhAlNK8lb8khF6laKjo9U0j_aG1xLkHEfPLrmcrcsLxC3R755Yv_uKbzY_Vnoc.nXCprvutM1TZmLLN_7LHrQ0tY0IjYSS6hVzDVlHbvC
job-00-fork-0006/test/restricted/testCommitEmptyFile/empty-commit.txt NOCjVJqycZhkalrvU26F5oIaJP51q055et2N6b74.2JVjiKL8KwrhOhdrtumOrZ2tZWNqaK4iKZ_iosqgehJOiPbWJwxvrfvA5V.dAUTLNqjtEf5tfWh0UXu.vahDy_S5SSgNLFXK.VB82i5MZtOcw--
job-00/test/tests3ascale/ITestS3AHugeMagicCommits/commit/commit.bin lsYNpdn_oiWLwEVvvM621hCvIwDVaL4y_bbwVpQouW1OBThA.P9cR8fZtxvBjGdMY41UH0dTjxGHtF3BXEY8WXqmcnO9QHs_Jy.os781pE3MGzqgzFyxmd0yN6LFcTbq
test/restricted/testCommitEmptyFile/empty-commit.txt T3W9V56Bv_FMhKpgcBgJ1H2wOBkPKk23T0JomesBzZyqiIAu3NiROibAgoZUhWSdoTKSJoOgcn3UWYGOvGBbsHteS_N_c1QoTEp0GE7PNlzDfs1GheJ5SOpUgaEY6MaYdNe0mn0gY48FDXpVB2nqiA--
test/restricted/testCommitEmptyFile/empty-commit.txt .cr4b3xkfze4N24Bj3PAm_ACIyIVuTU4DueDktU1abNu2LJWXH2HKnUu1oOjfnnQwnUXp4VmXBVbZ5aq8E8gVCxN.Oyb7hmGVtESmRjpqIXSW80JrB_0_dqXe.uAT.JH7kEWywAlb4NIqJ5Xz99tvA--
Total 10 uploads found.
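A minimal sketch of producing an equivalent listing directly with the SDK v2 client, bypassing the S3A connector entirely; bucket and prefix are illustrative placeholders:

```java
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListMultipartUploadsRequest;

public final class ListPendingUploads {
  public static void main(String[] args) {
    String bucket = "stevel-london";   // illustrative
    String prefix = "";                // "" lists the whole bucket
    try (S3Client s3 = S3Client.create()) {
      long[] count = {0};
      s3.listMultipartUploadsPaginator(
              ListMultipartUploadsRequest.builder()
                  .bucket(bucket).prefix(prefix).build())
          .uploads()                    // flattens results across pages
          .forEach(u -> {
            System.out.println(u.key() + " " + u.uploadId());
            count[0]++;
          });
      System.out.println("Total " + count[0] + " uploads found.");
    }
  }
}
```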

Most interesting here is testIfNoneMatchTwoConcurrentMultipartUploads, because it initiates and then completes an MPU so as to create a zero-byte file; it doesn't upload any parts.

The attempt to complete failed.

[ERROR]   ITestS3APutIfMatchAndIfNoneMatch.testIfNoneMatchTwoConcurrentMultipartUploads:380->createFileWithFlags:190 » AWSBadRequest Completing multipart upload on job-00-fork-0005/test/testIfNoneMatchTwoConcurrentMultipartUploads: software.amazon.awssdk.services.s3.model.S3Exception: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: 9JCJ6M5QRDGJNYYS, Extended Request ID: Z7Q7+LA0o/5B4xoIGhgo+tVppawZ0UBj7X4RNb+0m9RbOAOwD/Apv1o+KmnW0aypjwmfFlarxjo=) (SDK Attempt Count: 1):InvalidPart: One or more of the specified parts could not be found.  The part may not have been uploaded, or the specified entity tag may not match the part's entity tag. (Service: S3, Status Code: 400, Request ID: 9JCJ6M5QRDGJNYYS, Extended Request ID: Z7Q7+LA0o/5B4xoIGhgo+tVppawZ0UBj7X4RNb+0m9RbOAOwD/Apv1o+KmnW0aypjwmfFlarxjo=) (SDK Attempt Count: 1)          

Yet the uploads listing afterwards finds it:

job-00-fork-0005/test/testIfNoneMatchTwoConcurrentMultipartUploads AnspJPHUoPJqg61t28OvLfAogi6G9ocyx1Dm6XY2C.a_H_onklM0Nr0LIXaPiYlQjZIiH0fTsQ1e2KhEjS9pGxvSKOXq_4YibiGZmFC6rBolmfACMqIRpoeaqYDgzYW4

I have to conclude that the list of pending uploads was briefly offline/inconsistent.

This is presumably so rare that there's almost no point retrying here. Then again, with no retries every active write/job would have failed, even though the system had recovered within a minute.

Maybe we should retry here? I remember that a long, long time ago the v1 SDK didn't retry on failures of the final POST to commit an upload, and how that sporadically caused problems. Retrying on MPU completion failures would allow recovery from a transient failure here, at the cost that deleting all pending uploads would take longer to fail all the active uploads.
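As a sketch of the shape such a retry could take, here is a version against a bare SDK v2 client rather than anything wired into the S3A Invoker or its retry policies; the attempt count, linear backoff, and the choice to treat any 400 as retriable are all assumptions of the sketch:

```java
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CompleteMultipartUploadRequest;
import software.amazon.awssdk.services.s3.model.CompleteMultipartUploadResponse;
import software.amazon.awssdk.services.s3.model.S3Exception;

public final class MpuCompletionRetry {

  /** Retry the final CompleteMultipartUpload a few times before giving up. */
  public static CompleteMultipartUploadResponse completeWithRetries(
      S3Client s3,
      CompleteMultipartUploadRequest request,
      int attempts,
      long baseSleepMillis) throws InterruptedException {
    S3Exception last = null;
    for (int attempt = 1; attempt <= attempts; attempt++) {
      try {
        return s3.completeMultipartUpload(request);
      } catch (S3Exception e) {
        if (e.statusCode() != 400) {
          throw e;                       // not the InvalidPart case: fail fast
        }
        // 400/InvalidPart may just be a transiently inconsistent part
        // listing, as seen above; back off and try again.
        last = e;
        Thread.sleep(baseSleepMillis * attempt);   // simple linear backoff
      }
    }
    throw last;
  }
}
```

In the connector itself this would naturally go through the existing retry policy rather than a hand-rolled loop.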

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 19s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+0 🆗 shelldocs 0m 0s Shelldocs was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 39 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 10m 42s Maven dependency ordering for branch
+1 💚 mvninstall 15m 48s trunk passed
+1 💚 compile 8m 10s trunk passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 compile 8m 15s trunk passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+1 💚 checkstyle 1m 36s trunk passed
-1 ❌ mvnsite 6m 44s /branch-mvnsite-root.txt root in trunk failed.
+1 💚 javadoc 5m 35s trunk passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 javadoc 4m 51s trunk passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+0 🆗 spotbugs 0m 15s branch/hadoop-project no spotbugs output file (spotbugsXml.xml)
-1 ❌ spotbugs 0m 41s /branch-spotbugs-hadoop-tools_hadoop-aws-warnings.html hadoop-tools/hadoop-aws in trunk has 188 extant spotbugs warnings.
-1 ❌ spotbugs 19m 4s /branch-spotbugs-root-warnings.html root in trunk has 9241 extant spotbugs warnings.
+1 💚 shadedclient 34m 44s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 34s Maven dependency ordering for patch
+1 💚 mvninstall 16m 48s the patch passed
+1 💚 compile 8m 49s the patch passed with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04
+1 💚 javac 8m 49s the patch passed
+1 💚 compile 9m 26s the patch passed with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
+1 💚 javac 9m 26s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 41s root: The patch generated 0 new + 79 unchanged - 6 fixed = 79 total (was 85)
-1 ❌ mvnsite 4m 11s /patch-mvnsite-root.txt root in the patch failed.
+1 💚 shellcheck 0m 0s No new issues.
+1 💚 javadoc 5m 17s root-jdkUbuntu-21.0.7+6-Ubuntu-0ubuntu120.04 with JDK Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04 generated 0 new + 46182 unchanged - 2 fixed = 46182 total (was 46184)
+1 💚 javadoc 4m 44s root-jdkUbuntu-17.0.15+6-Ubuntu-0ubuntu120.04 with JDK Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04 generated 0 new + 43013 unchanged - 2 fixed = 43013 total (was 43015)
+0 🆗 spotbugs 0m 11s hadoop-project has no data from spotbugs
+1 💚 shadedclient 34m 53s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 591m 32s /patch-unit-root.txt root in the patch passed.
+1 💚 asflicense 0m 51s The patch does not generate ASF License warnings.
781m 38s
Reason Tests
Failed junit tests hadoop.yarn.server.router.subcluster.fair.TestYarnFederationWithFairScheduler
hadoop.yarn.server.router.webapp.TestFederationWebApp
hadoop.yarn.server.router.webapp.TestRouterWebServicesREST
hadoop.hdfs.tools.TestDFSAdmin
hadoop.security.ssl.TestDelegatingSSLSocketFactory
hadoop.yarn.sls.appmaster.TestAMSimulator
Subsystem Report/Notes
Docker ClientAPI=1.51 ServerAPI=1.51 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/17/artifact/out/Dockerfile
GITHUB PR #7882
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint shellcheck shelldocs
uname Linux 54d25015775c 5.15.0-156-generic #166-Ubuntu SMP Sat Aug 9 00:02:46 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / fa906dc
Default Java Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
Multi-JDK versions /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.7+6-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.15+6-Ubuntu-0ubuntu120.04
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/17/testReport/
Max. process+thread count 4675 (vs. ulimit of 5500)
modules C: hadoop-project hadoop-tools/hadoop-aws . U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7882/17/console
versions git=2.25.1 maven=3.9.11 spotbugs=4.9.7 shellcheck=0.7.0
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.
