paulianttila opened this issue 7 years ago
Although this is really strange behaviour, does it help to split the 5k line channel-type definition into smaller chunks? Does parsing on start up succeed every time, and parsing during runtime deployment fail every time?
> Does parsing on start up succeed every time, and parsing during runtime deployment fail every time?
Yes, I have tested it over 10 times. I first met this problem half a year ago, when I started to migrate my OH1 system to OH2. I updated to the latest snapshot yesterday with the same result. I run my "production" system on a NUC (Celeron J3455). It is a pretty busy production system: currently 28 bindings plus 23 UI, persistence, transformation, and miscellaneous add-ons, and around 1000 items.
I tried to reproduce the problem with a clean OH install (latest snapshot) and just this one problematic binding. The problem does not occur in that environment (MacBook Pro, 2.5 GHz Intel Core i7).
> does it help to split the 5k line channel-type definition into smaller chunks?
I have not tested that. But I remember that when I first met the problem, I quickly generated a thin version which only had the channels I really needed (around 25 channels); the full file contains over 700 channels. With the thin version I didn't see this problem.
Just a wild guess, but this may also be a memory issue: you could try to give your openHAB instance some more memory.
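Just to sketch how that could be done (assuming an apt/openHABian-style install; the path and values below are illustrative, adjust to the NUC's RAM):

```sh
# /etc/default/openhab2 -- raise the JVM heap via extra options (values illustrative)
EXTRA_JAVA_OPTS="-Xms512m -Xmx2048m"

# restart the service so the new options take effect
sudo systemctl restart openhab2.service
```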
btw: what Java version does your openHAB run on? Could you post the output of `java -version` from the NUC system?
I have installed openHAB via openHABian, and Java has been updated many times. I will increase the memory next time I need to restart OH, even though I have never seen any OutOfMemory exceptions.
```
openhab> shell:info                                                     19:20:18
Karaf
  Karaf version               4.1.2
  Karaf home                  /usr/share/openhab2/runtime
  Karaf base                  /var/lib/openhab2
  OSGi Framework              org.eclipse.osgi-3.11.3.v20170209-1843
JVM
  Java Virtual Machine        Java HotSpot(TM) 64-Bit Server VM version 25.144-b01
  Version                     1.8.0_144
  Vendor                      Oracle Corporation
  Pid                         10308
  Uptime                      4 days 22 hours
  Total compile time          11 minutes
Threads
  Live threads                271
  Daemon threads              130
  Peak                        274
  Total started               2516093
Memory
  Current heap size           195,858 kbytes
  Maximum heap size           974,848 kbytes
  Committed heap size         249,856 kbytes
  Pending objects             0
  Garbage collector           Name = 'G1 Young Generation', Collections = 151779, Time = 43 minutes
  Garbage collector           Name = 'G1 Old Generation', Collections = 0, Time = 0.000 seconds
Classes
  Current classes loaded      24,429
  Total classes loaded        1,285,010
  Total classes unloaded      1,260,581
Operating system
  Name                        Linux version 4.4.0-96-generic
  Architecture                amd64
  Processors                  4
```
Yes, memory looks good from the logs. But the difference between parsing at startup and at runtime points to some environmental effect. What about splitting the channel definitions into several files? If this fixes the issue right now, maybe it's the easiest way to overcome this situation. Is there a logical difference in the channel types which allows sane splitting?
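For illustration, such a split could look roughly like this, assuming the definitions live under `ESH-INF/thing/`, where every `*.xml` file in the folder is read (file names, binding id, channel-type ids, and the namespace URI below are all illustrative, not taken from the real binding):

```xml
<!-- ESH-INF/thing/f1x45-types-1.xml : first batch of channel types -->
<thing:thing-descriptions bindingId="example"
    xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0">
  <channel-type id="compressortemperature">
    <item-type>Number</item-type>
    <label>Compressor Temperature</label>
  </channel-type>
  <!-- ...more channel types... -->
</thing:thing-descriptions>
```

A second file (say `f1x45-types-2.xml`) with the same root element would hold the rest; since channel types are registered by id for the whole binding, thing types should be able to reference channel types defined in any of the files.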
Just increased the OH memory:

```
Memory
  Current heap size           594,944 kbytes
  Maximum heap size           2,097,152 kbytes
  Committed heap size         1,048,576 kbytes
```
It didn't solve the problem. If I remove and then add the jar file to the addons folder, I see the same parsing failure. After the failure, if I stop and then start the bundle from the OSGi console, it starts without parsing problems.
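For reference, that stop/start workaround from the Karaf console looks roughly like this (the bundle name is a placeholder, and `<id>` is the bundle id reported by `bundle:list`):

```
openhab> bundle:list | grep -i <binding-name>   # find the bundle id
openhab> bundle:stop <id>
openhab> bundle:start <id>
```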
Really weird.
I have developed an OH binding which has a pretty big thing channel type definition file; the XML file contains over 5000 lines. The whole file can be found here: f1x45-types.xml. During OH start-up the binding is successfully loaded and all thing type files are correctly parsed, but if I deploy the binding to OH at runtime (adding the jar to the addons directory when OH is already running), XML file parsing fails.
The line number reported in the parse error varies between attempts.
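To rule out a problem in the file content itself, the XML can be parsed outside openHAB with any standard parser. A minimal sketch in Python, where the embedded sample (element names, ids, and namespace URI) is an illustrative stand-in for the real 5000-line file:

```python
import xml.etree.ElementTree as ET

# Illustrative fragment mimicking a thing-description file;
# the real f1x45-types.xml defines over 700 channel types.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<thing:thing-descriptions bindingId="example"
    xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0">
  <channel-type id="temperature">
    <item-type>Number</item-type>
    <label>Temperature</label>
  </channel-type>
  <channel-type id="mode">
    <item-type>String</item-type>
    <label>Mode</label>
  </channel-type>
</thing:thing-descriptions>
"""

def count_channel_types(xml_text):
    """Parse the document and count top-level channel-type definitions.

    A parse error here would raise ET.ParseError with a line number,
    which can be compared against the error openHAB reports.
    """
    root = ET.fromstring(xml_text)
    # Unprefixed child elements carry no namespace in this document,
    # so their tag is the bare element name.
    return sum(1 for child in root if child.tag == "channel-type")

print(count_channel_types(SAMPLE))  # prints 2
```

If this parses cleanly every time while openHAB's runtime deployment fails at a varying line, that points away from the file and toward the deployment path.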