2023-11-25 07:30:32.149 ERROR [nio-8080-exec-5] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: org.apache.kafka.common.KafkaException: Received exception when fetching the next record from Share-0. If needed, please seek past the record to continue consumption.] with root cause
java.lang.ExceptionInInitializerError: Exception org.xerial.snappy.SnappyError: pure-java snappy requires access to java.nio.Buffer raw address field [in thread "parallel-1"]
at org.xerial.snappy.pure.UnsafeUtil.<clinit>(UnsafeUtil.java:49)
at org.xerial.snappy.pure.SnappyRawDecompressor.getUnsignedByteSafe(SnappyRawDecompressor.java:323)
at org.xerial.snappy.pure.SnappyRawDecompressor.readUncompressedLength(SnappyRawDecompressor.java:288)
at org.xerial.snappy.pure.SnappyRawDecompressor.getUncompressedLength(SnappyRawDecompressor.java:42)
at org.xerial.snappy.pure.PureJavaSnappy.uncompressedLength(PureJavaSnappy.java:238)
at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:638)
at org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:145)
at org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:99)
at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:59)
at org.apache.kafka.common.compress.SnappyFactory.wrapForInput(SnappyFactory.java:44)
at org.apache.kafka.common.record.CompressionType$3.wrapForInput(CompressionType.java:94)
at org.apache.kafka.common.record.DefaultRecordBatch.recordInputStream(DefaultRecordBatch.java:276)
at org.apache.kafka.common.record.DefaultRecordBatch.compressedIterator(DefaultRecordBatch.java:280)
at org.apache.kafka.common.record.DefaultRecordBatch.streamingIterator(DefaultRecordBatch.java:364)
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.nextFetchedRecord(Fetcher.java:1619)
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1656)
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1900(Fetcher.java:1497)
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:717)
at org.apache.kafka.clients.consumer.internals.Fetcher.collectFetch(Fetcher.java:683)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1287)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1243)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1216)
at cn.typesafe.km.service.MessageService.lambda$liveData$3(MessageService.java:137)
at reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:106)
at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:113)
at reactor.core.publisher.FluxInterval$IntervalRunnable.run(FluxInterval.java:125)
at reactor.core.scheduler.PeriodicWorkerTask.call(PeriodicWorkerTask.java:59)
at reactor.core.scheduler.PeriodicWorkerTask.run(PeriodicWorkerTask.java:73)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.runAndReset(Unknown Source)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
The remote AWS Kafka cluster connects fine, but this error is thrown as soon as the consumer tries to fetch records. What is causing this?
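For context, this `SnappyError` typically appears when snappy-java cannot load its native library and falls back to the pure-Java decompressor, which uses reflection to read `java.nio.Buffer`'s internal address field; on JDK 16+ the module system blocks that reflective access. A commonly suggested workaround (an assumption based on the error message, not something confirmed in this log) is to open `java.nio` to the unnamed module when launching the app:

```shell
# Hypothetical sketch: open java.base/java.nio so snappy-java's pure-Java
# fallback can access Buffer's raw address field under JDK 16+ strong
# encapsulation. Replace app.jar with your actual Spring Boot artifact.
java --add-opens=java.base/java.nio=ALL-UNNAMED -jar app.jar
```

Alternatively, making sure the native snappy library loads on the target OS/architecture (so the pure-Java fallback is never used), or producing to the topic with a different `compression.type`, would avoid the reflective access entirely.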