noobaa / noobaa-core

High-performance S3 application gateway to any backend - file / s3-compatible / multi-clouds / caching / replication ...
https://www.noobaa.io
Apache License 2.0

NSFS: For clients like s3A, iceberg.. seeing 0 response in listing of a bucket with 5.15.3 #8050

Closed madhuthorat closed 4 months ago

madhuthorat commented 4 months ago

Environment info

Actual behavior

  1. From the customer: we are testing 5.15.3 on a standalone server to verify that the bucket-directory handling works. We first see a fix that interprets the '/' correctly in our use case, but we also see the following difference that should be investigated. If we run an `s3 ls s3://bucket/dir/` listing we see this:
    
    noobaa-core 5.15.3 S3:

```
[ps60@cbd00033 ~]$ alias-s3 s3api head-object --bucket dev-product-external-adobe --key testDirectoryIsDirectoryAfterMkdir
AcceptRanges: bytes
ContentLength: 0
ContentType: application/x-directory
ETag: '"mtime-d1999ufdnx1c-ino-1adj4"'
LastModified: '2024-05-14T09:21:49+00:00'
Metadata: {}
```

Minio (or AWS S3):

```
[ps60@cbd00033 ~]$ alias minio='AWS_CA_BUNDLE=/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem AWS_ACCESS_KEY_ID=hive AWS_SECRET_ACCESS_KEY=hivehive aws --endpoint https://minio-api.bdk8s.lan.huk-coburg.de/'
```

```
[ps60@cbd00033 ~]$ minio s3 ls dev-product-external-adobe/testDirectoryIsDirectoryAfterMkdir/
2024-04-26 15:07:56          0
2024-04-26 15:07:56          0 testfile
```

```
[ps60@cbd00033 ~]$ minio s3api head-object --bucket dev-product-external-adobe --key testDirectoryIsDirectoryAfterMkdir

An error occurred (404) when calling the HeadObject operation: Not Found
```

Clients like s3A, Iceberg, etc. have problems with this difference: listing a directory in a GPFS-based NSFS bucket reports a size of 4096, while other endpoints report 0.
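To illustrate why the reported size matters (a hypothetical sketch, not s3A's or NooBaa's actual code): S3 connectors commonly treat a zero-byte object whose key ends in `/` as a directory marker, so a marker reported with the filesystem stat size (4096 here) is mis-classified as a regular data object:

```python
# Hypothetical helper showing how a connector such as s3A classifies
# directory markers in a listing; not NooBaa or s3A source code.
def is_directory_marker(key: str, size: int) -> bool:
    # A zero-byte object whose key ends with '/' is treated as a
    # directory marker rather than as data.
    return size == 0 and key.endswith("/")

# Listing as returned by MinIO / AWS S3: the marker has size 0.
aws_listing = [("testDirectoryIsDirectoryAfterMkdir/", 0),
               ("testDirectoryIsDirectoryAfterMkdir/testfile", 0)]

# Listing as returned by NooBaa NSFS on GPFS: the marker carries the
# directory inode's stat size (4096), so the marker check fails.
nsfs_listing = [("testDirectoryIsDirectoryAfterMkdir/", 4096),
                ("testDirectoryIsDirectoryAfterMkdir/testfile", 0)]

print([is_directory_marker(k, s) for k, s in aws_listing])   # [True, False]
print([is_directory_marker(k, s) for k, s in nsfs_listing])  # [False, False]
```

With the NSFS-style sizes the directory entry is never recognized as a directory, which matches the failing `isdir` assertion in the Spark test log below.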

### Expected behavior
1. Clients like MinIO, s3A, etc. should see a size of 0 for directory entries in the bucket listing output, not the filesystem stat size.

### Steps to reproduce
1.

### More information - Screenshots / Logs / Other output
With logs:

```

Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties 24/05/14 11:30:06 INFO SparkContext: Running Spark version 3.5.1 24/05/14 11:30:06 INFO SparkContext: OS info Windows 10, 10.0, amd64 24/05/14 11:30:06 INFO SparkContext: Java version 11.0.6 24/05/14 11:30:07 WARN Shell: Did not find winutils.exe: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems 24/05/14 11:30:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 24/05/14 11:30:07 INFO ResourceUtils: ============================================================== 24/05/14 11:30:07 INFO ResourceUtils: No custom resources configured for spark.driver. 24/05/14 11:30:07 INFO ResourceUtils: ============================================================== 24/05/14 11:30:07 INFO SparkContext: Submitted application: Spark-Test 24/05/14 11:30:07 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0) 24/05/14 11:30:07 INFO ResourceProfile: Limiting resource is cpu 24/05/14 11:30:07 INFO ResourceProfileManager: Added ResourceProfile id: 0 24/05/14 11:30:07 INFO SecurityManager: Changing view acls to: ps60 24/05/14 11:30:07 INFO SecurityManager: Changing modify acls to: ps60 24/05/14 11:30:07 INFO SecurityManager: Changing view acls groups to: 24/05/14 11:30:07 INFO SecurityManager: Changing modify acls groups to: 24/05/14 11:30:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: ps60; groups with view permissions: EMPTY; users with modify permissions: ps60; groups with modify permissions: EMPTY 24/05/14 11:30:08 INFO 
Utils: Successfully started service 'sparkDriver' on port 58218. 24/05/14 11:30:08 INFO SparkEnv: Registering MapOutputTracker WARNING: An illegal reflective access operation has occurred WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Wks/Eclipse/Maven/repository/org/apache/spark/spark-unsafe_2.13/3.5.1/spark-unsafe_2.13-3.5.1.jar) to constructor java.nio.DirectByteBuffer(long,int) WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations WARNING: All illegal access operations will be denied in a future release 24/05/14 11:30:08 INFO SparkEnv: Registering BlockManagerMaster 24/05/14 11:30:08 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 24/05/14 11:30:08 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 24/05/14 11:30:08 INFO SparkEnv: Registering BlockManagerMasterHeartbeat 24/05/14 11:30:08 INFO DiskBlockManager: Created local directory at C:\Users\ps60\AppData\Local\Temp\blockmgr-0725fd5d-5579-4958-93f6-596fa9419175 24/05/14 11:30:08 INFO MemoryStore: MemoryStore started with capacity 2.2 GiB 24/05/14 11:30:08 INFO SparkEnv: Registering OutputCommitCoordinator 24/05/14 11:30:08 INFO JettyUtils: Start Jetty 0.0.0.0:4040 for SparkUI 24/05/14 11:30:08 INFO Utils: Successfully started service 'SparkUI' on port 4040. 24/05/14 11:30:08 INFO Executor: Starting executor ID driver on host VEN10648.lan.huk-coburg.de 24/05/14 11:30:08 INFO Executor: OS info Windows 10, 10.0, amd64 24/05/14 11:30:08 INFO Executor: Java version 11.0.6 24/05/14 11:30:08 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): '' 24/05/14 11:30:08 INFO Executor: Created or updated repl class loader org.apache.spark.util.MutableURLClassLoader@1de4285e for default. 
24/05/14 11:30:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58229. 24/05/14 11:30:08 INFO NettyBlockTransferService: Server created on VEN10648.lan.huk-coburg.de:58229 24/05/14 11:30:08 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 24/05/14 11:30:08 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58229, None) 24/05/14 11:30:08 INFO BlockManagerMasterEndpoint: Registering block manager VEN10648.lan.huk-coburg.de:58229 with 2.2 GiB RAM, BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58229, None) 24/05/14 11:30:08 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58229, None) 24/05/14 11:30:08 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58229, None) 24/05/14 11:30:10 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties 24/05/14 11:30:10 INFO MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 24/05/14 11:30:10 INFO MetricsSystemImpl: s3a-file-system metrics system started 24/05/14 11:30:11 INFO DirectoryPolicyImpl: Directory markers will be kept isdir: true url: s3a://dev-product-external-adobe/ 24/05/14 11:30:11 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:11 INFO SparkUI: Stopped Spark web UI at http://ven10648.lan.huk-coburg.de:4040/ 24/05/14 11:30:11 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 24/05/14 11:30:11 INFO MemoryStore: MemoryStore cleared 24/05/14 11:30:11 INFO BlockManager: BlockManager stopped 24/05/14 11:30:11 INFO BlockManagerMaster: BlockManagerMaster stopped 24/05/14 11:30:11 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 
24/05/14 11:30:11 INFO SparkContext: Successfully stopped SparkContext 24/05/14 11:30:11 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:11 INFO SparkContext: SparkContext already stopped. 24/05/14 11:30:11 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:11 INFO SparkContext: SparkContext already stopped. 24/05/14 11:30:11 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:11 INFO SparkContext: SparkContext already stopped. shutdown() 24/05/14 11:30:11 INFO SparkContext: Running Spark version 3.5.1 24/05/14 11:30:11 INFO SparkContext: OS info Windows 10, 10.0, amd64 24/05/14 11:30:11 INFO SparkContext: Java version 11.0.6 24/05/14 11:30:11 INFO ResourceUtils: ============================================================== 24/05/14 11:30:11 INFO ResourceUtils: No custom resources configured for spark.driver. 24/05/14 11:30:11 INFO ResourceUtils: ============================================================== 24/05/14 11:30:11 INFO SparkContext: Submitted application: Spark-Test 24/05/14 11:30:11 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0) 24/05/14 11:30:11 INFO ResourceProfile: Limiting resource is cpu 24/05/14 11:30:11 INFO ResourceProfileManager: Added ResourceProfile id: 0 24/05/14 11:30:11 INFO SecurityManager: Changing view acls to: ps60 24/05/14 11:30:11 INFO SecurityManager: Changing modify acls to: ps60 24/05/14 11:30:11 INFO SecurityManager: Changing view acls groups to: 24/05/14 11:30:11 INFO SecurityManager: Changing modify acls groups to: 24/05/14 11:30:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: ps60; groups with view permissions: EMPTY; users with modify 
permissions: ps60; groups with modify permissions: EMPTY 24/05/14 11:30:11 INFO Utils: Successfully started service 'sparkDriver' on port 58237. 24/05/14 11:30:11 INFO SparkEnv: Registering MapOutputTracker 24/05/14 11:30:11 INFO SparkEnv: Registering BlockManagerMaster 24/05/14 11:30:11 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 24/05/14 11:30:11 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 24/05/14 11:30:11 INFO SparkEnv: Registering BlockManagerMasterHeartbeat 24/05/14 11:30:11 INFO DiskBlockManager: Created local directory at C:\Users\ps60\AppData\Local\Temp\blockmgr-45d3b868-80f2-4506-a79d-81753c3fcd66 24/05/14 11:30:11 INFO MemoryStore: MemoryStore started with capacity 2.2 GiB 24/05/14 11:30:11 INFO SparkEnv: Registering OutputCommitCoordinator 24/05/14 11:30:11 INFO JettyUtils: Start Jetty 0.0.0.0:4040 for SparkUI 24/05/14 11:30:11 INFO Utils: Successfully started service 'SparkUI' on port 4040. 24/05/14 11:30:11 INFO Executor: Starting executor ID driver on host VEN10648.lan.huk-coburg.de 24/05/14 11:30:11 INFO Executor: OS info Windows 10, 10.0, amd64 24/05/14 11:30:11 INFO Executor: Java version 11.0.6 24/05/14 11:30:11 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): '' 24/05/14 11:30:11 INFO Executor: Created or updated repl class loader org.apache.spark.util.MutableURLClassLoader@44a44a04 for default. 24/05/14 11:30:11 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58248. 
24/05/14 11:30:11 INFO NettyBlockTransferService: Server created on VEN10648.lan.huk-coburg.de:58248 24/05/14 11:30:11 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 24/05/14 11:30:11 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58248, None) 24/05/14 11:30:11 INFO BlockManagerMasterEndpoint: Registering block manager VEN10648.lan.huk-coburg.de:58248 with 2.2 GiB RAM, BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58248, None) 24/05/14 11:30:11 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58248, None) 24/05/14 11:30:11 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, VEN10648.lan.huk-coburg.de, 58248, None) mkdir state: true for s3a://dev-product-external-adobe/testDirectoryIsDirectoryAfterMkdir created: s3a://dev-product-external-adobe/testDirectoryIsDirectoryAfterMkdir/testfile isdir: false url: s3a://dev-product-external-adobe/testDirectoryIsDirectoryAfterMkdir/testfile isdir: false url: s3a://dev-product-external-adobe/testDirectoryIsDirectoryAfterMkdir 24/05/14 11:30:12 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:12 INFO SparkUI: Stopped Spark web UI at http://ven10648.lan.huk-coburg.de:4040/ 24/05/14 11:30:12 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 24/05/14 11:30:12 INFO MemoryStore: MemoryStore cleared 24/05/14 11:30:12 INFO BlockManager: BlockManager stopped 24/05/14 11:30:12 INFO BlockManagerMaster: BlockManagerMaster stopped 24/05/14 11:30:12 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 24/05/14 11:30:12 INFO SparkContext: Successfully stopped SparkContext 24/05/14 11:30:12 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:12 INFO SparkContext: SparkContext already stopped. 
24/05/14 11:30:12 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:12 INFO SparkContext: SparkContext already stopped. 24/05/14 11:30:12 INFO SparkContext: SparkContext is stopping with exitCode 0. 24/05/14 11:30:12 INFO SparkContext: SparkContext already stopped. shutdown()

org.opentest4j.AssertionFailedError: Expected :true Actual :false

at org.junit.jupiter.api.AssertionFailureBuilder.build(AssertionFailureBuilder.java:151) at org.junit.jupiter.api.AssertionFailureBuilder.buildAndThrow(AssertionFailureBuilder.java:132) at org.junit.jupiter.api.AssertTrue.failNotTrue(AssertTrue.java:63) at org.junit.jupiter.api.AssertTrue.assertTrue(AssertTrue.java:36) at org.junit.jupiter.api.AssertTrue.assertTrue(AssertTrue.java:31) at org.junit.jupiter.api.Assertions.assertTrue(Assertions.java:180) at de.huk.bigdata.s3.dirs.NewNoobaDirCheckerTest.testDirectoryIsDirectoryAfterMkdir(NewNoobaDirCheckerTest.java:39) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:727) at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131) at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:156) at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:147) at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:86) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(InterceptingExecutableInvoker.java:103) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.lambda$invoke$0(InterceptingExecutableInvoker.java:93) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) at 
org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:92) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:86) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:217) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:213) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:138) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:68) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at 
java.base/java.util.ArrayList.forEach(ArrayList.java:1540) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1540) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at 
org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:147) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:127) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:90) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:55) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:102) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:54) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86) at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:53) at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:57) at com.intellij.rt.junit.IdeaTestRunner$Repeater$1.execute(IdeaTestRunner.java:38) at com.intellij.rt.execution.junit.TestsRepeater.repeat(TestsRepeater.java:11) at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:35) at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:232) at 
com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:55) 24/05/14 11:30:12 INFO ShutdownHookManager: Shutdown hook called 24/05/14 11:30:12 INFO ShutdownHookManager: Deleting directory C:\Users\ps60\AppData\Local\Temp\spark-97323503-31ed-4704-b598-431a7a1659fc 24/05/14 11:30:12 INFO MetricsSystemImpl: Stopping s3a-file-system metrics system... 24/05/14 11:30:12 INFO MetricsSystemImpl: s3a-file-system metrics system stopped. 24/05/14 11:30:12 INFO MetricsSystemImpl: s3a-file-system metrics system shutdown complete. ```
guymguym commented 4 months ago

@romayalon This is a followup to #7974

romayalon commented 4 months ago

@madhuthorat A few questions -

  1. Which command did you use to create the directory? Does the directory have content?
  2. You mentioned that the problem you see is a 0 response in the listing, while NooBaa lists 4096.
     2.1. Can you please attach NooBaa's list-objects response?
     2.2. Does "0 response" mean the size is 0, while listing on NooBaa shows size 4096? (When trying it myself I see size 64, so I'm not sure what you see.)
  3. The difference I observe is actually in the head-object result; I'm providing a fix for it, but please provide more info about the list-objects result.
  4. NooBaa logs are always welcome.

@guymguym why is this a follow-up to #7974? I think it's related to directory objects, no?

ps20renar commented 4 months ago

Hello Romy, here are the answers to your questions:

  1. The directory was created from the Spark client via s3A. Yes, it has an entry.
  2. The main difference with NooBaa is that the head command on a directory on Scale (GPFS) returns a size of 4096, while all other S3 endpoints return 0:
    
    noobaa S3 with GPFS as the backend FS:

    ```
    [ps60@cbd00032 ~]$ renar s3 ls dev-product-external-adobe/testDirectoryIsDirectoryAfterMkdir/
    2024-05-14 11:21:49       4096
    2024-05-14 11:21:49          0 testfile
    ```

Minio on HDFS (the same with Ozone or AWS S3):

```
[ps60@cbd00032 ~]$ minio s3 ls dev-product-external-adobe/testDirectoryIsDirectoryAfterMkdir/
2024-04-26 15:07:56          0
2024-04-26 15:07:56          0 testfile
```



3. You are right.
4. Logs are coming.
ps20renar commented 4 months ago

@romayalon, here the logs

romayalon commented 4 months ago

@ps20renar Thanks Renar for the detailed info. I found the issue and fixed it for list objects as well. @guymguym I would be happy to get a review if you are available.
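A minimal sketch of the fix direction discussed here (in Python for illustration; the actual NooBaa NSFS code is Node.js, and this is not the real patch): when building a list-objects or head-object response for a directory entry, report size 0, matching AWS S3 and MinIO, instead of the filesystem stat size:

```python
import os
import stat

def s3_size_for_entry(path: str) -> int:
    """Size to report in an S3 listing for a filesystem entry (sketch)."""
    st = os.stat(path)
    # Directories become zero-byte markers (as on AWS S3 / MinIO) instead
    # of exposing the stat size, which is e.g. 4096 on ext4 or GPFS.
    return 0 if stat.S_ISDIR(st.st_mode) else st.st_size
```

With this rule, `s3 ls` on a directory object shows `0` on every backend, so connectors like s3A classify it as a directory again.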

madhuthorat commented 4 months ago

@romayalon After review, hoping the fix gets merged in 5.15.4. Thanks.

romayalon commented 4 months ago

Got approval on Slack that it was validated; removing the label.