apache / plc4x

PLC4X: The Industrial IoT adapter
https://plc4x.apache.org/
Apache License 2.0

ADS connection issue, Help wanted #598

Closed: ottlukas closed this issue 6 months ago

ottlukas commented 1 year ago

I ran the TwinCAT simulator on my local host machine, which sits on the 192.168.x.x subnetwork, and the simulator has the IP address 172.21.97.81. I then used the ADS connection string ads:tcp://localhost/172.21.97.81.1.1:851, but the connection is not established and I receive the error message shown in the logs below. Can someone point out what the problem or the bug is?
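For reference, a minimal sketch of how a connection with this string would typically be opened through the PLC4X 0.6.x API (PlcDriverManager and PlcConnection as seen in the logs below). The standalone test class and its name are illustrative assumptions, not the reporter's actual scraper code:

```java
import org.apache.plc4x.java.PlcDriverManager;
import org.apache.plc4x.java.api.PlcConnection;

public class AdsConnectionCheck {
    public static void main(String[] args) throws Exception {
        // ads:tcp://<router host>/<target AMS Net Id>:<target AMS port>
        // Target AMS Net Id 172.21.97.81.1.1, AMS port 851 (TwinCAT 3 PLC runtime 1).
        String url = "ads:tcp://localhost/172.21.97.81.1.1:851";
        try (PlcConnection connection = new PlcDriverManager().getConnection(url)) {
            System.out.println("Connected: " + connection.isConnected());
        }
    }
}
```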


Best Regards

Vikram Gopu 

```
[main] INFO org.apache.plc4x.java.PlcDriverManager - Instantiating new PLC Driver Manager with class loader jdk.internal.loader.ClassLoaders$AppClassLoader@2626b418
[main] INFO org.apache.plc4x.java.PlcDriverManager - Instantiating new PLC Driver Manager with class loader jdk.internal.loader.ClassLoaders$AppClassLoader@2626b418
[main] INFO org.apache.plc4x.java.PlcDriverManager - Registering available drivers...
[main] INFO org.apache.plc4x.java.PlcDriverManager - Registering driver for Protocol modbus (Modbus (TCP / Serial))
[main] INFO org.apache.plc4x.java.PlcDriverManager - Registering driver for Protocol s7 (Siemens S7 (Basic))
[main] INFO org.apache.plc4x.java.PlcDriverManager - Registering driver for Protocol ads (Beckhoff Twincat ADS)
[main] INFO org.apache.plc4x.java.scraper.config.triggeredscraper.ScraperConfigurationTriggeredImpl - Assuming job as triggered job because triggerConfig has been set
[main] INFO org.apache.plc4x.java.scraper.triggeredscraper.TriggeredScraperImpl - Starting jobs...
[main] INFO org.apache.plc4x.java.scraper.triggeredscraper.TriggeredScraperImpl - Task TriggeredScraperTask{driverManager=org.apache.plc4x.java.utils.connectionpool.PooledPlcDriverManager@4b9e255, jobName='ScheduleJob', connectionAlias='DeviceSource', connectionString='ads:tcp://localhost/172.21.97.81.1.1:851', requestTimeoutMs=1000, executorService=java.util.concurrent.ThreadPoolExecutor@5e57643e[Running, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0], resultHandler=eu.cloudplug.cpe.plc4x.PLC4XScrapper$$Lambda$67/0x0000000800bcac40@133e16fd, triggerHandler=org.apache.plc4x.java.scraper.triggeredscraper.triggerhandler.TriggerHandlerImpl@51b279c9} added to scheduling
[triggeredscraper-scheduling-thread-1] WARN org.apache.plc4x.java.scraper.triggeredscraper.TriggeredScraperTask - Exception during scraping of Job ScheduleJob, Connection-Alias DeviceSource: Error-message: null - for stack-trace change logging to DEBUG
[nioEventLoopGroup-3-1] WARN io.netty.channel.DefaultChannelPipeline - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
io.netty.handler.codec.DecoderException: java.lang.IndexOutOfBoundsException
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:98)
    at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1421)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:697)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:632)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:549)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:511)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:918)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:830)
Caused by: java.lang.IndexOutOfBoundsException
    at io.netty.buffer.EmptyByteBuf.readUnsignedIntLE(EmptyByteBuf.java:594)
    at org.apache.plc4x.java.ads.api.util.UnsignedIntLEByteValue.<init>(UnsignedIntLEByteValue.java:53)
    at org.apache.plc4x.java.ads.api.commands.types.Result.<init>(Result.java:43)
    at org.apache.plc4x.java.ads.api.commands.types.Result.of(Result.java:59)
    at org.apache.plc4x.java.ads.protocol.Ads2PayloadProtocol.handleADSReadWriteCommand(Ads2PayloadProtocol.java:367)
    at org.apache.plc4x.java.ads.protocol.Ads2PayloadProtocol.decode(Ads2PayloadProtocol.java:135)
    at org.apache.plc4x.java.ads.protocol.Ads2PayloadProtocol.decode(Ads2PayloadProtocol.java:42)
    at io.netty.handler.codec.MessageToMessageCodec$2.decode(MessageToMessageCodec.java:81)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
    ... 22 more
```
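The bottom of the stack trace is where the failure actually originates: Ads2PayloadProtocol.handleADSReadWriteCommand builds a Result value by reading a 4-byte field from a buffer that has no readable bytes left (an EmptyByteBuf). A minimal sketch of that underlying Netty behaviour, using plain Netty rather than PLC4X code, reproduces the same exception:

```java
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;

public class EmptyBufferRead {
    public static void main(String[] args) {
        // Unpooled.EMPTY_BUFFER is an EmptyByteBuf: zero readable bytes.
        ByteBuf empty = Unpooled.EMPTY_BUFFER;
        // Reading a 4-byte little-endian unsigned int from it fails the same way
        // as the trace above: EmptyByteBuf.readUnsignedIntLE() -> IndexOutOfBoundsException.
        long result = empty.readUnsignedIntLE();
        System.out.println(result); // never reached
    }
}
```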

Imported from Jira PLC4X-217. Original Jira may contain additional context. Reported by: vikram919.

chrisdutz commented 6 months ago

The ADS driver has been completely rewritten since the 0.6.0 release this issue was reported against.