ControlSystemStudio / phoebus

A framework and set of tools to monitor and operate large scale control systems, such as the ones in the accelerator community.
http://phoebus.org/
Eclipse Public License 1.0

Menu Problem with X11 Forwarding on Windows #956

Open dxmaxwell opened 4 years ago

dxmaxwell commented 4 years ago

When displaying Phoebus remotely via X11 forwarding, the menus are glitchy. Not only does the menu appear "detached" from the menu bar, as shown in the screenshot below, but it is also tricky to make the menu appear at all: the user must click down on the menu title, then move the mouse below the menu title and release the click.

(Screenshot: Phoebus_Menu_Glitch_VcXsrv)

Client: Windows 10 with PuTTY and VcXsrv (also tried with Xming and Cygwin/X)
Host: Debian Buster with a recent build of Phoebus

Note that X11 forwarding from a Linux Mint client instead of Windows 10 works as expected.

From the log output it appears the UI is freezing:

maxwelld@csstudio2:~$ /usr/lib/jvm/java-11-openjdk-amd64/bin/java -Dprism.verbose=true  -jar /usr/share/phoebus/product-4.6.0-SNAPSHOT.jar -nosplash -settings /etc/phoebus/phoebus_settings.ini
2019-11-18 16:57:04 INFO [org.phoebus.product.Launcher] Loading settings from /etc/phoebus/phoebus_settings.ini
2019-11-18 16:57:04 INFO [org.phoebus.product.Launcher] Phoebus (PID 6167)
Prism pipeline init order: es2 sw
Using Double Precision Marlin Rasterizer
Using dirty region optimizations
Not using texture mask for primitives
Not forcing power of 2 sizes for textures
Using hardware CLAMP_TO_ZERO mode
Opting in for HiDPI pixel scaling
Prism pipeline name = com.sun.prism.es2.ES2Pipeline
Loading ES2 native library ... prism_es2
        succeeded.
GLFactory using com.sun.prism.es2.X11GLFactory
(X) Got class = class com.sun.prism.es2.ES2Pipeline
Failed Graphics Hardware Qualifier check.
System GPU doesn't meet the es2 pipe requirement
GraphicsPipeline.createPipeline: error initializing pipeline com.sun.prism.es2.ES2Pipeline
*** Fallback to Prism SW pipeline
Prism pipeline name = com.sun.prism.sw.SWPipeline
(X) Got class = class com.sun.prism.sw.SWPipeline
Initialized prism pipeline: com.sun.prism.sw.SWPipeline
 vsync: true vpipe: false
2019-11-18 16:57:05 INFO [org.phoebus.channelfinder.ChannelFinderClient] Creating a channelfinder client to : http://localhost:8080/ChannelFinder
2019-11-18 16:57:06 INFO [org.phoebus.applications.alarm.logging.ui.AlarmLogTableApp] ES Sniff feature is enabled
16:57:06.051 [es_rest_client_sniffer[T#1]] DEBUG org.apache.http.impl.nio.client.MainClientExec - [exchange: 1] start execution
16:57:06.071 [es_rest_client_sniffer[T#1]] DEBUG org.apache.http.client.protocol.RequestAddCookies - CookieSpec selected: default
16:57:06.081 [es_rest_client_sniffer[T#1]] DEBUG org.apache.http.client.protocol.RequestAuthCache - Re-using cached 'basic' auth scheme for http://facility-es01.cts:9200
16:57:06.081 [es_rest_client_sniffer[T#1]] DEBUG org.apache.http.client.protocol.RequestAuthCache - No credentials for preemptive authentication
16:57:06.082 [es_rest_client_sniffer[T#1]] DEBUG org.apache.http.impl.nio.client.InternalHttpAsyncClient - [exchange: 1] Request connection for {}->http://facility-es01.cts:9200
16:57:06.083 [es_rest_client_sniffer[T#1]] DEBUG org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager - Connection request: [route: {}->http://facility-es01.cts:9200][total kept alive: 0; route allocated: 0 of 10; total allocated: 0 of 30]
16:57:06.115 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager - Connection leased: [id: http-outgoing-0][route: {}->http://facility-es01.cts:9200][total kept alive: 0; route allocated: 1 of 10; total allocated: 1 of 30]
16:57:06.117 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalHttpAsyncClient - [exchange: 1] Connection allocated: CPoolProxy{http-outgoing-0 [ACTIVE]}
16:57:06.117 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][r:]: Set attribute http.nio.exchange-handler
16:57:06.117 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][rw:]: Event set [w]
16:57:06.117 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][rw:]: Set timeout 0
16:57:06.117 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalIODispatch - http-outgoing-0 [ACTIVE]: Connected
16:57:06.118 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][rw:]: Set attribute http.nio.http-exchange-state
16:57:06.118 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalHttpAsyncClient - Start connection routing
16:57:06.119 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - Connection route established
16:57:06.119 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - [exchange: 1] Attempt 1 to execute request
16:57:06.120 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - Target auth state: UNCHALLENGED
16:57:06.120 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - Proxy auth state: UNCHALLENGED
16:57:06.120 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][rw:]: Set timeout 30000
16:57:06.120 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 >> GET /_nodes/http?timeout=1000ms HTTP/1.1
16:57:06.121 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 >> Content-Length: 0
16:57:06.121 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 >> Host: facility-es01.cts:9200
16:57:06.121 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 >> Connection: Keep-Alive
16:57:06.121 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 >> User-Agent: Apache-HttpAsyncClient/4.1.2 (Java/11.0.4)
16:57:06.122 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][rw:]: Event set [w]
16:57:06.122 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - [exchange: 1] Request completed
16:57:06.123 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][rw:w]: 173 bytes written
16:57:06.123 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 >> "GET /_nodes/http?timeout=1000ms HTTP/1.1[\r][\n]"
16:57:06.123 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 >> "Content-Length: 0[\r][\n]"
16:57:06.123 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 >> "Host: facility-es01.cts:9200[\r][\n]"
16:57:06.123 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 >> "Connection: Keep-Alive[\r][\n]"
16:57:06.123 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 >> "User-Agent: Apache-HttpAsyncClient/4.1.2 (Java/11.0.4)[\r][\n]"
16:57:06.124 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 >> "[\r][\n]"
16:57:06.124 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalIODispatch - http-outgoing-0 [ACTIVE] Request ready
16:57:06.124 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][r:w]: Event cleared [w]
16:57:06.124 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][r:r]: 2115 bytes read
16:57:06.125 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 << "HTTP/1.1 200 OK[\r][\n]"
16:57:06.125 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 << "content-type: application/json; charset=UTF-8[\r][\n]"
16:57:06.125 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 << "content-length: 2027[\r][\n]"
16:57:06.125 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 << "[\r][\n]"
16:57:06.125 [I/O dispatcher 1] DEBUG org.apache.http.wire - http-outgoing-0 << "{"_nodes":{"total":4,"successful":4,"failed":0},"cluster_name":"facility-escluster","nodes":{"L2ptQ0MbScaYSs0H1IcaGw":{"name":"facility-es03.cts","transport_address":"10.41.8.53:9300","host":"10.41.8.53","ip":"10.41.8.53","version":"6.8.2","build_flavor":"default","build_type":"deb","build_hash":"b506955","roles":["master","data","ingest"],"attributes":{"ml.machine_memory":"8366428160","ml.max_open_jobs":"20","xpack.installed":"true","ml.enabled":"true"},"http":{"bound_address":["[::]:9200"],"publish_address":"10.41.8.53:9200","max_content_length_in_bytes":104857600}},"KlxBwuxFRDSOuNkYLFwPWQ":{"name":"facility-kibana.cts","transport_address":"10.41.8.54:9300","host":"10.41.8.54","ip":"10.41.8.54","version":"6.8.2","build_flavor":"default","build_type":"deb","build_hash":"b506955","roles":[],"attributes":{"ml.machine_memory":"4138766336","ml.max_open_jobs":"20","xpack.installed":"true","ml.enabled":"true"},"http":{"bound_address":["127.0.0.1:9200","[::1]:9200"],"publish_address":"localhost/127.0.0.1:9200","max_content_length_in_bytes":104857600}},"q5MAaLsfRwGTwcc_TpSrDw":{"name":"facility-es01.cts","transport_address":"10.41.8.51:9300","host":"10.41.8.51","ip":"10.41.8.51","version":"6.8.2","build_flavor":"default","build_type":"deb","build_hash":"b506955","roles":["master","data","ingest"],"attributes":{"ml.machine_memory":"8366419968","xpack.installed":"true","ml.max_open_jobs":"20","ml.enabled":"true"},"http":{"bound_address":["[::]:9200"],"publish_address":"10.41.8.51:9200","max_content_length_in_bytes":104857600}},"rMjWA1qoRKWk2Rj3ZjEqbA":{"name":"facility-es02.cts","transport_address":"10.41.8.52:9300","host":"10.41.8.52","ip":"10.41.8.52","version":"6.8.2","build_flavor":"default","build_type":"deb","build_hash":"b506955","roles":["master","data","ingest"],"attributes":{"ml.machine_memory":"8366428160","ml.max_open_jobs":"20","xpack.installed":"true","ml.enabled":"true"},"http":{"
bound_address":["[::]:9200"],"publish_address":"10.41.8.52:9200","max_content_length_in_bytes":104857600}}}}"
16:57:06.129 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 << HTTP/1.1 200 OK
16:57:06.129 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 << content-type: application/json; charset=UTF-8
16:57:06.129 [I/O dispatcher 1] DEBUG org.apache.http.headers - http-outgoing-0 << content-length: 2027
16:57:06.130 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalIODispatch - http-outgoing-0 [ACTIVE(2027)] Response received
16:57:06.130 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - [exchange: 1] Response received HTTP/1.1 200 OK
16:57:06.133 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalIODispatch - http-outgoing-0 [ACTIVE(2027)] Input ready
16:57:06.133 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - [exchange: 1] Consume content
16:57:06.133 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalHttpAsyncClient - [exchange: 1] Connection can be kept alive indefinitely
16:57:06.134 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.MainClientExec - [exchange: 1] Response processed
16:57:06.134 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalHttpAsyncClient - [exchange: 1] releasing connection
16:57:06.135 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][r:r]: Remove attribute http.nio.exchange-handler
16:57:06.135 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager - Releasing connection: [id: http-outgoing-0][route: {}->http://facility-es01.cts:9200][total kept alive: 0; route allocated: 1 of 10; total allocated: 1 of 30]
16:57:06.135 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager - Connection [id: http-outgoing-0][route: {}->http://facility-es01.cts:9200] can be kept alive indefinitely
16:57:06.135 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.ManagedNHttpClientConnectionImpl - http-outgoing-0 10.41.200.11:32894<->10.41.8.51:9200[ACTIVE][r:r]: Set timeout 0
16:57:06.135 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager - Connection released: [id: http-outgoing-0][route: {}->http://facility-es01.cts:9200][total kept alive: 1; route allocated: 1 of 10; total allocated: 1 of 30]
16:57:06.136 [I/O dispatcher 1] DEBUG org.elasticsearch.client.RestClient - request [GET http://facility-es01.cts:9200/_nodes/http?timeout=1000ms] returned [HTTP/1.1 200 OK]
16:57:06.147 [I/O dispatcher 1] DEBUG org.apache.http.impl.nio.client.InternalIODispatch - http-outgoing-0 [ACTIVE] [content length: 2027; pos: 2027; completed: true]
16:57:06.149 [es_rest_client_sniffer[T#1]] DEBUG org.elasticsearch.client.sniff.Sniffer - sniffed nodes: [[host=http://10.41.8.53:9200, bound=[http://[::]:9200], name=facility-es03.cts, version=6.8.2, roles=mdi, attributes={ml.machine_memory=[8366428160], ml.max_open_jobs=[20], xpack.installed=[true], ml.enabled=[true]}], [host=http://localhost, bound=[http://127.0.0.1:9200, http://[::1]:9200], name=facility-kibana.cts, version=6.8.2, roles=, attributes={ml.machine_memory=[4138766336], ml.max_open_jobs=[20], xpack.installed=[true], ml.enabled=[true]}], [host=http://10.41.8.51:9200, bound=[http://[::]:9200], name=facility-es01.cts, version=6.8.2, roles=mdi, attributes={ml.machine_memory=[8366419968], ml.max_open_jobs=[20], xpack.installed=[true], ml.enabled=[true]}], [host=http://10.41.8.52:9200, bound=[http://[::]:9200], name=facility-es02.cts, version=6.8.2, roles=mdi, attributes={ml.machine_memory=[8366428160], ml.max_open_jobs=[20], xpack.installed=[true], ml.enabled=[true]}]]
2019-11-18 16:57:55 SEVERE [org.phoebus.ui.application.PhoebusApplication] UI Freezeup

"main" prio=5 Id=1 WAITING on java.util.concurrent.CountDownLatch$Sync@151be9f3
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.CountDownLatch$Sync@151be9f3
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.park(LockSupport.java:194)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:885)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1039)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1345)
        at java.base@11.0.4/java.util.concurrent.CountDownLatch.await(CountDownLatch.java:232)
        at app//com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:213)
        at app//com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:156)
        ...

"Reference Handler" daemon prio=10 Id=2 RUNNABLE
        at java.base@11.0.4/java.lang.ref.Reference.waitForReferencePendingList(Native Method)
        at java.base@11.0.4/java.lang.ref.Reference.processPendingReferences(Reference.java:241)
        at java.base@11.0.4/java.lang.ref.Reference$ReferenceHandler.run(Reference.java:213)

"Finalizer" daemon prio=8 Id=3 WAITING on java.lang.ref.ReferenceQueue$Lock@7e1bf607
        at java.base@11.0.4/java.lang.Object.wait(Native Method)
        -  waiting on java.lang.ref.ReferenceQueue$Lock@7e1bf607
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:176)
        at java.base@11.0.4/java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:170)

"Signal Dispatcher" daemon prio=9 Id=4 RUNNABLE

"Common-Cleaner" daemon prio=8 Id=10 TIMED_WAITING on java.lang.ref.ReferenceQueue$Lock@678d5e7d
        at java.base@11.0.4/java.lang.Object.wait(Native Method)
        -  waiting on java.lang.ref.ReferenceQueue$Lock@678d5e7d
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
        at java.base@11.0.4/jdk.internal.ref.CleanerImpl.run(CleanerImpl.java:148)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)
        at java.base@11.0.4/jdk.internal.misc.InnocuousThread.run(InnocuousThread.java:134)

"Timer-0" daemon prio=5 Id=12 TIMED_WAITING on java.util.TaskQueue@5bcbbeba
        at java.base@11.0.4/java.lang.Object.wait(Native Method)
        -  waiting on java.util.TaskQueue@5bcbbeba
        at java.base@11.0.4/java.util.TimerThread.mainLoop(Timer.java:553)
        at java.base@11.0.4/java.util.TimerThread.run(Timer.java:506)

"JavaFX-Launcher" prio=5 Id=14 WAITING on java.util.concurrent.CountDownLatch$Sync@11559041
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.CountDownLatch$Sync@11559041
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.park(LockSupport.java:194)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:885)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1039)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1345)
        at java.base@11.0.4/java.util.concurrent.CountDownLatch.await(CountDownLatch.java:232)
        at app//com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:856)
        at app//com.sun.javafx.application.LauncherImpl.lambda$launchApplication$2(LauncherImpl.java:195)
        ...

"QuantumRenderer-0" daemon prio=5 Id=15 WAITING on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@305fe7a2
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@305fe7a2
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.park(LockSupport.java:194)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2081)
        at java.base@11.0.4/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at app//com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.run(QuantumRenderer.java:125)
        ...

"InvokeLaterDispatcher" daemon prio=5 Id=17 WAITING on java.lang.StringBuilder@3d2c2962
        at java.base@11.0.4/java.lang.Object.wait(Native Method)
        -  waiting on java.lang.StringBuilder@3d2c2962
        at java.base@11.0.4/java.lang.Object.wait(Object.java:328)
        at app//com.sun.glass.ui.InvokeLaterDispatcher.run(InvokeLaterDispatcher.java:127)

*********************************
*** JavaFX Application Thread ***
*********************************
"JavaFX Application Thread" prio=5 Id=18 RUNNABLE
        at app//com.sun.glass.ui.gtk.GtkApplication._runLoop(Native Method)
        at app//com.sun.glass.ui.gtk.GtkApplication.lambda$runLoop$11(GtkApplication.java:277)
        at app//com.sun.glass.ui.gtk.GtkApplication$$Lambda$86/0x0000000800163040.run(Unknown Source)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"JobManager" daemon prio=5 Id=19 TIMED_WAITING on java.util.concurrent.SynchronousQueue$TransferStack@3b147e13
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.SynchronousQueue$TransferStack@3b147e13
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1053)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        ...

"JobManager2" daemon prio=5 Id=20 TIMED_WAITING on java.util.concurrent.SynchronousQueue$TransferStack@3b147e13
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.SynchronousQueue$TransferStack@3b147e13
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1053)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        ...

"pool-3-thread-1" prio=5 Id=21 RUNNABLE (in native)
        at java.base@11.0.4/sun.nio.ch.EPoll.wait(Native Method)
        at java.base@11.0.4/sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:120)
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        -  locked sun.nio.ch.Util$2@6fe36a8
        -  locked sun.nio.ch.EPollSelectorImpl@1568e453
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:340)
        at app//org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:192)
        at app//org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"I/O dispatcher 1" prio=5 Id=22 RUNNABLE (in native)
        at java.base@11.0.4/sun.nio.ch.EPoll.wait(Native Method)
        at java.base@11.0.4/sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:120)
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        -  locked sun.nio.ch.Util$2@d14f40b
        -  locked sun.nio.ch.EPollSelectorImpl@7aa8c877
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:255)
        at app//org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
        at app//org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"I/O dispatcher 2" prio=5 Id=23 RUNNABLE (in native)
        at java.base@11.0.4/sun.nio.ch.EPoll.wait(Native Method)
        at java.base@11.0.4/sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:120)
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        -  locked sun.nio.ch.Util$2@2eb62eb3
        -  locked sun.nio.ch.EPollSelectorImpl@7c2e0d4a
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:255)
        at app//org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
        at app//org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"I/O dispatcher 3" prio=5 Id=24 RUNNABLE (in native)
        at java.base@11.0.4/sun.nio.ch.EPoll.wait(Native Method)
        at java.base@11.0.4/sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:120)
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        -  locked sun.nio.ch.Util$2@56301a68
        -  locked sun.nio.ch.EPollSelectorImpl@6a679ba9
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:255)
        at app//org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
        at app//org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"I/O dispatcher 4" prio=5 Id=25 RUNNABLE (in native)
        at java.base@11.0.4/sun.nio.ch.EPoll.wait(Native Method)
        at java.base@11.0.4/sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:120)
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        -  locked sun.nio.ch.Util$2@75bbe3e0
        -  locked sun.nio.ch.EPollSelectorImpl@79f3f440
        at java.base@11.0.4/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:255)
        at app//org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
        at app//org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"es_rest_client_sniffer[T#1]" daemon prio=5 Id=26 TIMED_WAITING on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@2f46424c
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@2f46424c
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123)
        at java.base@11.0.4/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182)
        at java.base@11.0.4/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        ...

"JobManager3" daemon prio=5 Id=27 TIMED_WAITING on java.util.concurrent.SynchronousQueue$TransferStack@3b147e13
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.SynchronousQueue$TransferStack@3b147e13
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@11.0.4/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1053)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        ...

"JobViewer" daemon prio=5 Id=28 TIMED_WAITING on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@79aa2ede
        at java.base@11.0.4/jdk.internal.misc.Unsafe.park(Native Method)
        -  waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@79aa2ede
        at java.base@11.0.4/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@11.0.4/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123)
        at java.base@11.0.4/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182)
        at java.base@11.0.4/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114)
        at java.base@11.0.4/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        ...

"Prism Font Disposer" daemon prio=10 Id=29 WAITING on java.lang.ref.ReferenceQueue$Lock@53a466e8
        at java.base@11.0.4/java.lang.Object.wait(Native Method)
        -  waiting on java.lang.ref.ReferenceQueue$Lock@53a466e8
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:176)
        at app//com.sun.javafx.font.Disposer.run(Disposer.java:93)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"Disposer" daemon prio=10 Id=30 WAITING on java.lang.ref.ReferenceQueue$Lock@3513d928
        at java.base@11.0.4/java.lang.Object.wait(Native Method)
        -  waiting on java.lang.ref.ReferenceQueue$Lock@3513d928
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:176)
        at app//com.sun.webkit.Disposer.run(Disposer.java:122)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)

"Cleaner-0" daemon prio=8 Id=34 TIMED_WAITING on java.lang.ref.ReferenceQueue$Lock@285aa9e4
        at java.base@11.0.4/java.lang.Object.wait(Native Method)
        -  waiting on java.lang.ref.ReferenceQueue$Lock@285aa9e4
        at java.base@11.0.4/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
        at java.base@11.0.4/jdk.internal.ref.CleanerImpl.run(CleanerImpl.java:148)
        at java.base@11.0.4/java.lang.Thread.run(Thread.java:834)
        at java.base@11.0.4/jdk.internal.misc.InnocuousThread.run(InnocuousThread.java:134)

2019-11-18 16:57:55 SEVERE [org.phoebus.ui.application.PhoebusApplication] UI Updates resume
kasemir commented 4 years ago

Nice, concise description! Maybe duplicate of #640?

kasemir commented 4 years ago

Or https://github.com/kasemir/org.csstudio.display.builder/issues/226?

dxmaxwell commented 4 years ago

@kasemir Yes, very concise...LOL still learning to use GitHub I guess.

dxmaxwell commented 4 years ago

Looks similar to #640, but I don't see the problem with x2go, only with X forwarding, and it's only on Windows clients.

kasemir commented 4 years ago

Any difference using GTK2 vs. GTK3, i.e. -Djdk.gtk.version=2 resp. 3?

In general, X forwarding is slower for us than 'Thinlinc'-type remote X access in the web browser, which in turn is MUCH more convenient for MS Windows users, since there's no need to dink with installing PuTTY and an X server.
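For anyone who wants to try the GTK version comparison suggested above, a launch sketch using the jar and settings paths from the original report (adjust for your installation):

```shell
# Launch with the GTK2 glass backend (paths from the original report;
# adjust for your install)
java -Djdk.gtk.version=2 \
     -jar /usr/share/phoebus/product-4.6.0-SNAPSHOT.jar \
     -settings /etc/phoebus/phoebus_settings.ini

# ...then compare against GTK3, the default on recent JavaFX builds
java -Djdk.gtk.version=3 \
     -jar /usr/share/phoebus/product-4.6.0-SNAPSHOT.jar \
     -settings /etc/phoebus/phoebus_settings.ini
```

These are command fragments that need an installed Phoebus jar and an X display, so they are illustrative rather than directly runnable.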

dxmaxwell commented 4 years ago

@shroffk mentions a problem with X forwarding in #436

dxmaxwell commented 4 years ago

As mentioned in other issues (and across the internet), I tried various options: -Dprism.order=sw -Djdk.gtk.version=2 -Dprism.forceGPU=true

But no change. It seems to use software rendering by default anyway, and that works perfectly well when the client machine is Linux.
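For reference, the combined invocation described above, keeping -Dprism.verbose=true from the original launch so the startup log confirms which pipeline actually loads:

```shell
# All three flags from the comment above, plus verbose pipeline logging
# (paths from the original report; adjust for your install)
java -Dprism.verbose=true \
     -Dprism.order=sw \
     -Djdk.gtk.version=2 \
     -Dprism.forceGPU=true \
     -jar /usr/share/phoebus/product-4.6.0-SNAPSHOT.jar \
     -settings /etc/phoebus/phoebus_settings.ini
# With -Dprism.order=sw the startup log should report
# "Prism pipeline name = com.sun.prism.sw.SWPipeline",
# matching the fallback already seen in the log above.
```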

dxmaxwell commented 4 years ago

@kasemir I did not know about Thinlinc, thanks. I will look into that more, but I would need to convince the IT dept to support it.

kasemir commented 4 years ago

Some comments on Java UIs being slow via X forwarding, going back to Swing, are at https://stackoverflow.com/questions/26002948/workaround-for-slow-java-swing-menus.

For me it helps to add ssh settings that disable compression or use a faster (less secure) encryption method, but buying Thinlinc is overall a more convenient option if you need remote access to a Linux desktop.
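The ssh tweaks mentioned above can be sketched as `~/.ssh/config` entries. The host name is hypothetical, and which ciphers are available depends on your OpenSSH version (`ssh -Q cipher` lists them):

```shell
# ~/.ssh/config -- speed up X forwarding to a hypothetical host
Host csstudio2
    ForwardX11 yes
    # Compression adds CPU latency on fast links; turn it off
    Compression no
    # Prefer fast ciphers; availability varies by OpenSSH version
    Ciphers aes128-gcm@openssh.com,aes128-ctr
```

This is a configuration fragment, not a runnable script; the right cipher choice is a trade-off between speed and your site's security policy.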

kasemir commented 4 years ago

I'm afraid we have to mark this as "won't fix". We acknowledge that X forwarding is slow and doesn't always work 100%. X forwarding basically sends every drawing command over the network. That was OK back when X11 Athena Widgets only had a black rectangle around a text field; it's slow for our present reality, where every widget comes with shaded outlines and backgrounds.

I can confirm that menus can take a long time (seconds) to show up when running via one or more ssh -X hops. We have several reports of other quirks, as already linked from this issue. I don't think we have a way to fix them, because the issue is in the underlying X/ssh mechanism. For example, I get the same sluggishness from "gedit" via remote X.

If a site depends on running via remote desktops, X11-via-ssh can be used to some extent, but Thinlinc or VNC are now a better approach.

Alternatively, Phoebus builds for Windows, Mac, and Linux, so users can run it locally with full performance. EPICS offers CA/PVA gateways to allow (read) access to data, and we support http://.. URLs for display access. That does of course require site support for installing CS-Studio on end user computers.

Finally, we have a web display runtime for (read) access to most display features from plain web browsers.

dxmaxwell commented 4 years ago

I think that running Phoebus locally and then using an SSH tunnel to access the desired network could work well for us. Recently I became aware of the EPICS_CA_NAME_SERVERS variable, which makes that a lot easier. Do you know if JCA supports this option?

Or, if Phoebus could support a SOCKS proxy for CA, that would be a very clean solution. It looks like the community has done some work in that direction, but it's still a work in progress.
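The tunnel idea above can be sketched as follows, assuming a hypothetical login host and CA gateway. EPICS_CA_NAME_SERVERS makes the client resolve channel names over TCP through the forwarded port (default CA server port 5064), so no UDP broadcasts need to cross the tunnel:

```shell
# Forward the CA TCP port through a hypothetical login host
# (host names are placeholders, not from the original report)
ssh -N -L 5064:ca-gateway.example.org:5064 user@login.example.org &

# Point the CA client at the tunnel and disable UDP broadcast searches
export EPICS_CA_NAME_SERVERS=localhost:5064
export EPICS_CA_AUTO_ADDR_LIST=NO

# Run Phoebus locally; whether the bundled CA library honors
# EPICS_CA_NAME_SERVERS is exactly the open question in this comment
java -jar /usr/share/phoebus/product-4.6.0-SNAPSHOT.jar
```

This is a sketch under stated assumptions, not a confirmed recipe; as the comment notes, JCA support for name servers was the unresolved question at the time.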

kasemir commented 4 years ago

See #988 for the recently added preference setting to configure EPICS_CA_NAME_SERVERS.