macielbombonato / bigbluebutton

Automatically exported from code.google.com/p/bigbluebutton

java.lang.OutOfMemoryError: GC overhead limit exceeded after server crashed during desktop sharing #910

Closed. GoogleCodeExporter closed this issue 9 years ago

GoogleCodeExporter commented 9 years ago
The following error appears in red5.log and error.log when the server crashed
during voice stress testing.

The server halted when Fred started desktop sharing (full screen).

The log below shows red5 disconnecting clients because it can no longer ping
them.

2011-03-23 20:08:39,647 [Red5_Scheduler_Worker-8] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 45804 to test.blindsidenetworks.com (in: 6727 out 6944 ), with 
id 2042255314, due to too much inactivity (60839ms), last ping sent 1000ms ago
2011-03-23 20:08:39,647 [Red5_Scheduler_Worker-8] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:39,862 [Red5_Scheduler_Worker-11] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 36565 to test.blindsidenetworks.com (in: 6166 out 6384 ), with 
id 115465687, due to too much inactivity (60789ms), last ping sent 1000ms ago
2011-03-23 20:08:39,862 [Red5_Scheduler_Worker-11] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:40,220 [Red5_Scheduler_Worker-8] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 45809 to test.blindsidenetworks.com (in: 6620 out 6839 ), with 
id 427294246, due to too much inactivity (60839ms), last ping sent 999ms ago
2011-03-23 20:08:40,220 [Red5_Scheduler_Worker-8] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:40,295 [Red5_Scheduler_Worker-7] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 45812 to test.blindsidenetworks.com (in: 7156 out 453262 ), 
with id 892588822, due to too much inactivity (60801ms), last ping sent 1001ms 
ago
2011-03-23 20:08:40,295 [Red5_Scheduler_Worker-7] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:40,579 [Red5_Scheduler_Worker-12] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 36567 to test.blindsidenetworks.com (in: 6313 out 6484 ), with 
id 541485771, due to too much inactivity (60819ms), last ping sent 998ms ago
2011-03-23 20:08:40,579 [Red5_Scheduler_Worker-12] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:40,829 [Red5_Scheduler_Worker-5] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 36566 to test.blindsidenetworks.com (in: 7837 out 171061 ), 
with id 1880546796, due to too much inactivity (60820ms), last ping sent 1000ms 
ago
2011-03-23 20:08:40,829 [Red5_Scheduler_Worker-5] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:40,939 [Red5_Scheduler_Worker-4] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 45810 to test.blindsidenetworks.com (in: 8761 out 184691 ), 
with id 640016254, due to too much inactivity (60836ms), last ping sent 1004ms 
ago
2011-03-23 20:08:40,939 [Red5_Scheduler_Worker-4] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:40,964 [Red5_Scheduler_Worker-11] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 45811 to test.blindsidenetworks.com (in: 6785 out 6953 ), with 
id 1269496319, due to too much inactivity (60838ms), last ping sent 1005ms ago
2011-03-23 20:08:40,964 [Red5_Scheduler_Worker-11] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:41,127 [Red5_Scheduler_Worker-10] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 45806 to test.blindsidenetworks.com (in: 6892 out 7058 ), with 
id 1417891761, due to too much inactivity (60847ms), last ping sent 1000ms ago
2011-03-23 20:08:41,127 [Red5_Scheduler_Worker-10] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:41,207 [Red5_Scheduler_Worker-4] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 45807 to test.blindsidenetworks.com (in: 1417979 out 1359342 ), 
with id 1950104233, due to too much inactivity (60851ms), last ping sent 999ms 
ago
2011-03-23 20:08:41,208 [Red5_Scheduler_Worker-4] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:48,423 [Red5_Scheduler_Worker-3] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 36568 to test.blindsidenetworks.com (in: 6388 out 6976 ), with 
id 1648090122, due to too much inactivity (60843ms), last ping sent 1000ms ago
2011-03-23 20:08:48,423 [Red5_Scheduler_Worker-3] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!
2011-03-23 20:08:48,462 [Red5_Scheduler_Worker-11] WARN  
o.r.server.net.rtmp.RTMPConnection - Closing RTMPMinaConnection from 
80.218.18.180 : 36569 to test.blindsidenetworks.com (in: 6648 out 452821 ), 
with id 1374951357, due to too much inactivity (60838ms), last ping sent 1000ms 
ago
2011-03-23 20:08:48,462 [Red5_Scheduler_Worker-11] WARN  
o.r.server.net.rtmp.RTMPConnection - This often happens if YOUR Red5 
application generated an exception on start-up. Check earlier in the log for 
that exception first!

2011-03-23 20:12:52,405 [http-8088-ClientPoller-6] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: Java heap space
2011-03-23 20:12:52,405 [http-8088-ClientPoller-4] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:12:52,405 [http-8088-ClientPoller-0] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:12:52,406 [http-8088-exec-3] ERROR 
o.a.c.c.C.[.[0.0.0.0].[/].[rtmpt] - Servlet.service() for servlet rtmpt threw 
exception
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:12:52,406 [http-8088-exec-1] ERROR 
o.a.c.c.C.[.[0.0.0.0].[/].[rtmpt] - Servlet.service() for servlet rtmpt threw 
exception
java.lang.OutOfMemoryError: Java heap space
2011-03-23 20:12:52,407 [http-8088-exec-4] ERROR 
o.a.c.c.C.[.[0.0.0.0].[/].[rtmpt] - Servlet.service() for servlet rtmpt threw 
exception
java.lang.OutOfMemoryError: GC overhead limit exceeded
        at java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:57) [na:1.6.0_20]
        at java.nio.ByteBuffer.allocate(ByteBuffer.java:329) [na:1.6.0_20]
        at org.apache.mina.core.buffer.SimpleBufferAllocator.allocateNioBuffer(SimpleBufferAllocator.java:44) [mina-core-2.0.0-RC1.jar:na]
        at org.apache.mina.core.buffer.AbstractIoBuffer.capacity(AbstractIoBuffer.java:185) [mina-core-2.0.0-RC1.jar:na]
        at org.apache.mina.core.buffer.AbstractIoBuffer.expand(AbstractIoBuffer.java:289) [mina-core-2.0.0-RC1.jar:na]
        at org.apache.mina.core.buffer.AbstractIoBuffer.expand(AbstractIoBuffer.java:263) [mina-core-2.0.0-RC1.jar:na]
        at org.apache.mina.core.buffer.AbstractIoBuffer.autoExpand(AbstractIoBuffer.java:2530) [mina-core-2.0.0-RC1.jar:na]
        at org.apache.mina.core.buffer.AbstractIoBuffer.put(AbstractIoBuffer.java:553) [mina-core-2.0.0-RC1.jar:na]
        at org.apache.mina.core.buffer.AbstractIoBuffer.put(AbstractIoBuffer.java:1099) [mina-core-2.0.0-RC1.jar:na]
        at org.red5.server.net.rtmpt.BaseRTMPTConnection.foldPendingMessages(BaseRTMPTConnection.java:275) [red5.jar:na]
        at org.red5.server.net.rtmpt.RTMPTConnection.getPendingMessages(RTMPTConnection.java:165) [red5.jar:na]
        at org.red5.server.net.rtmpt.RTMPTServlet.returnPendingMessages(RTMPTServlet.java:255) [red5.jar:na]
        at org.red5.server.net.rtmpt.RTMPTServlet.handleIdle(RTMPTServlet.java:418) [red5.jar:na]
        at org.red5.server.net.rtmpt.RTMPTServlet.service(RTMPTServlet.java:460) [red5.jar:na]
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) [javaee-api-5.1.1.jar:5.1.1]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) [catalina-6.0.24.jar:na]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [catalina-6.0.24.jar:na]
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233) [catalina-6.0.24.jar:na]
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) [catalina-6.0.24.jar:na]
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:465) [catalina-6.0.24.jar:na]
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) [catalina-6.0.24.jar:na]
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) [catalina-6.0.24.jar:na]
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) [catalina-6.0.24.jar:na]
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298) [catalina-6.0.24.jar:na]
        at org.apache.coyote.http11.Http11NioProcessor.process(Http11NioProcessor.java:883) [tomcat-coyote-6.0.24.jar:na]
        at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:721) [tomcat-coyote-6.0.24.jar:na]
        at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:2258) [tomcat-coyote-6.0.24.jar:na]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) [na:1.6.0_20]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) [na:1.6.0_20]
        at java.lang.Thread.run(Thread.java:636) [na:1.6.0_20]
2011-03-23 20:12:53,598 [http-8088-exec-5] INFO  
o.r.s.stream.ClientBroadcastStream - Close
2011-03-23 20:12:53,598 [http-8088-exec-5] INFO  
o.r.s.stream.ClientBroadcastStream - Provider disconnect
2011-03-23 20:12:53,598 [http-8088-exec-5] INFO  
o.r.s.stream.ClientBroadcastStream - Provider disconnect
2011-03-23 20:13:12,861 [NioProcessor-1] WARN  o.a.m.c.f.DefaultIoFilterChain - 
Unexpected exception from exceptionCaught handler.
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:13:28,149 [http-8088-exec-1] ERROR 
o.a.c.c.C.[.[0.0.0.0].[/].[rtmpt] - Servlet.service() for servlet rtmpt threw 
exception
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:13:29,948 [http-8088-ClientPoller-7] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:13:31,527 [http-8088-exec-5] INFO  
o.r.s.stream.ClientBroadcastStream - Close
2011-03-23 20:13:31,527 [http-8088-exec-5] INFO  
o.r.s.stream.ClientBroadcastStream - Close
2011-03-23 20:13:32,840 [http-8088-exec-4] ERROR 
o.a.c.c.C.[.[0.0.0.0].[/].[rtmpt] - Servlet.service() for servlet rtmpt threw 
exception
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:13:40,822 [http-8088-exec-3] ERROR 
o.a.c.c.C.[.[0.0.0.0].[/].[rtmpt] - Servlet.service() for servlet rtmpt threw 
exception
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:13:44,846 [http-8088-ClientPoller-6] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:13:57,504 [http-8088-ClientPoller-2] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: Java heap space
2011-03-23 20:13:57,504 [http-8088-ClientPoller-4] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: Java heap space
2011-03-23 20:13:57,504 [http-8088-ClientPoller-7] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:13:57,504 [http-8088-ClientPoller-0] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: Java heap space
2011-03-23 20:13:56,784 [http-8088-ClientPoller-3] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:14:05,111 [pool-70214-thread-1] ERROR 
o.r.s.m.InMemoryPushPushPipe - Exception when pushing message to consumer
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:14:18,540 [http-8088-exec-3] INFO  
o.red5.server.net.rtmp.RTMPHandler - Connecting to: [FAILED toString()]
2011-03-23 20:14:23,158 [http-8088-exec-4] ERROR 
o.a.c.c.C.[.[0.0.0.0].[/].[rtmpt] - Servlet.service() for servlet rtmpt threw 
exception
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:14:20,860 [NioProcessor-10] WARN  o.a.m.c.f.DefaultIoFilterChain 
- Unexpected exception from exceptionCaught handler.
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:14:28,630 [NioBlockingSelector.BlockPoller-1] ERROR 
o.a.t.util.net.NioBlockingSelector -
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-03-23 20:14:34,264 [http-8088-ClientPoller-5] ERROR 
o.a.tomcat.util.net.NioEndpoint -
java.lang.OutOfMemoryError: Java heap space

Original issue reported on code.google.com by ritza...@gmail.com on 24 Mar 2011 at 1:40
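
The stack trace above bottoms out in org.apache.mina.core.buffer.AbstractIoBuffer.autoExpand, called from BaseRTMPTConnection.foldPendingMessages. A plausible reading (an assumption, not verified against the red5 source) is that messages queued for a tunnelled RTMPT client keep getting folded into an auto-expanding buffer, and when the client stops draining them the buffer grows until the heap is exhausted. The sketch below is a minimal, standalone illustration of that failure mode against the same MINA buffer API; the packet size and loop count are made up for the example.

import org.apache.mina.core.buffer.IoBuffer;

/**
 * Minimal sketch (not red5 code) of how an auto-expanding MINA IoBuffer
 * grows without bound when data is appended faster than it is consumed,
 * eventually triggering java.lang.OutOfMemoryError.
 */
public class PendingBufferGrowthSketch {

    public static void main(String[] args) {
        // setAutoExpand(true) mirrors the AbstractIoBuffer.autoExpand() path in the trace.
        IoBuffer pending = IoBuffer.allocate(4096).setAutoExpand(true);

        // Hypothetical 64 KB payload standing in for a deskshare/video packet.
        byte[] fakePacket = new byte[64 * 1024];

        // Simulate a client that never reads: we only ever append.
        for (int i = 0; i < 1000000; i++) {
            pending.put(fakePacket); // capacity is re-allocated upward; nothing is ever drained
            if (i % 100 == 0) {
                System.out.println("queued=" + i + " capacity=" + pending.capacity() + " bytes");
            }
        }
        // With a default-sized heap this dies with "GC overhead limit exceeded"
        // or "Java heap space" long before the loop completes.
    }
}

If that reading is right, the OutOfMemoryError is a symptom of slow or stalled viewers during desktop sharing rather than a leak in the usual sense.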

GoogleCodeExporter commented 9 years ago
We are seeing the same issue.

Server: Ubuntu 10.04.2 LTS

bigbluebutton: 0.70

The issue cropped up when we had enabled desktop sharing in a session with 9
users all sharing webcams.  Specifically, it happened as we began using the
tools to draw on a presentation on the shared desktop.

The java process running red5 spiked to 100% of a CPU core and we saw the 
following in the red5 log:

2011-05-26 10:10:57,360 [NioProcessor-1] INFO  
o.r.s.stream.ClientBroadcastStream - Consumer connect
2011-05-26 10:10:57,488 [NioProcessor-1] INFO  
o.red5.server.net.rtmp.RTMPHandler - Remembering client buffer on stream: 0
2011-05-26 10:10:57,490 [NioProcessor-1] INFO  
o.r.s.stream.ClientBroadcastStream - Consumer connect
2011-05-26 10:14:59,664 [NioProcessor-1] INFO  
o.red5.server.net.rtmp.RTMPHandler - Remembering client buffer on stream: 0
2011-05-26 10:14:59,712 [NioProcessor-1] INFO  
o.red5.server.net.rtmp.RTMPHandler - Remembering client buffer on stream: 0
2011-05-26 10:14:59,765 [NioProcessor-1] INFO  
o.red5.server.net.rtmp.RTMPHandler - Remembering client buffer on stream: 0
2011-05-26 10:14:59,900 [NioProcessor-1] INFO  
o.red5.server.net.rtmp.RTMPHandler - Remembering client buffer on stream: 0
2011-05-26 10:17:26,919 [http-8088-ClientPoller-3] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-05-26 10:17:26,919 [http-8088-ClientPoller-2] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: GC overhead limit exceeded
2011-05-26 10:17:26,919 [http-8088-ClientPoller-0] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:26,923 [NioProcessor-1] ERROR 
o.r.s.n.r.codec.RTMPProtocolDecoder - Last header null not new, headerSize: 2, 
channelId 47
2011-05-26 10:17:26,924 [NioProcessor-1] WARN  
o.r.s.n.r.codec.RTMPProtocolDecoder - Unknown object type: 0
2011-05-26 10:17:27,923 [http-8088-ClientPoller-1] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:28,807 [http-8088-ClientPoller-2] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:29,247 [http-8088-ClientPoller-3] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:29,689 [http-8088-ClientPoller-0] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:29,689 [http-8088-ClientPoller-1] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:30,573 [http-8088-ClientPoller-2] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:33,263 [http-8088-ClientPoller-0] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:33,263 [http-8088-ClientPoller-2] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:33,263 [http-8088-ClientPoller-1] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:33,264 [http-8088-ClientPoller-3] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:34,244 [NioProcessor-1] ERROR 
o.r.s.n.r.codec.RTMPProtocolDecoder - Last header null not new, headerSize: 3, 
channelId 24
2011-05-26 10:17:34,245 [NioProcessor-1] WARN  
o.r.s.n.r.codec.RTMPProtocolDecoder - Unknown object type: 0
2011-05-26 10:17:36,466 [http-8088-ClientPoller-0] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:36,466 [http-8088-ClientPoller-2] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:36,467 [http-8088-ClientPoller-1] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:36,467 [http-8088-ClientPoller-3] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space
2011-05-26 10:17:36,909 [NioProcessor-1] ERROR 
o.r.s.n.r.codec.RTMPProtocolDecoder - Last header null not new, headerSize: 2, 
channelId 17
2011-05-26 10:17:36,910 [NioProcessor-1] WARN  
o.r.s.n.r.codec.RTMPProtocolDecoder - Unknown object type: 0
2011-05-26 10:17:36,910 [NioProcessor-1] ERROR 
o.r.s.n.r.codec.RTMPProtocolDecoder - Last header null not new, headerSize: 2, 
channelId 23
2011-05-26 10:17:36,910 [NioProcessor-1] WARN  
o.r.s.n.r.codec.RTMPProtocolDecoder - Unknown object type: 0
2011-05-26 10:17:36,910 [NioProcessor-1] ERROR 
o.r.s.n.r.codec.RTMPProtocolDecoder - Last header null not new, headerSize: 3, 
channelId 56
2011-05-26 10:17:36,910 [NioProcessor-1] WARN  
o.r.s.n.r.codec.RTMPProtocolDecoder - Unknown object type: 0
2011-05-26 10:17:36,910 [NioProcessor-1] ERROR 
o.r.s.n.r.codec.RTMPProtocolDecoder - Last header null not new, headerSize: 2, 
channelId 25
2011-05-26 10:17:36,910 [NioProcessor-1] WARN  
o.r.s.n.r.codec.RTMPProtocolDecoder - Unknown object type: 0
2011-05-26 10:17:36,910 [NioProcessor-1] ERROR 
o.r.s.n.r.codec.RTMPProtocolDecoder - Last header null not new, headerSize: 1, 
channelId 54
2011-05-26 10:17:38,721 [http-8088-ClientPoller-1] ERROR 
o.a.tomcat.util.net.NioEndpoint - 
java.lang.OutOfMemoryError: Java heap space

The errors continue and get progressively worse.

We had to restart the red5 server and then restart the meeting.  Existing
users in the meeting who had not tried reconnecting were able to come back in.

Original comment by hollan...@hollandco.com on 26 May 2011 at 4:30
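
Not a fix, but possibly useful while chasing this: by the time "GC overhead limit exceeded" shows up, the JVM is already thrashing, so it can help to get a warning while the heap is merely under pressure. The sketch below is a generic JMX heap watchdog, offered as an assumption about how one might instrument the red5 JVM rather than anything that ships with BigBlueButton; the 80% threshold is arbitrary.

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryNotificationInfo;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryType;

import javax.management.Notification;
import javax.management.NotificationEmitter;
import javax.management.NotificationListener;

/**
 * Hypothetical heap-pressure watchdog (not part of BigBlueButton or red5).
 * Logs a warning when any heap pool crosses 80% usage, which normally
 * happens well before "GC overhead limit exceeded" starts appearing.
 */
public class HeapPressureWatchdog {

    public static void install() {
        // The platform MemoryMXBean is also a NotificationEmitter, so we can
        // subscribe to the JVM's own MEMORY_THRESHOLD_EXCEEDED notifications.
        NotificationEmitter emitter =
                (NotificationEmitter) ManagementFactory.getMemoryMXBean();

        emitter.addNotificationListener(new NotificationListener() {
            public void handleNotification(Notification n, Object handback) {
                if (MemoryNotificationInfo.MEMORY_THRESHOLD_EXCEEDED.equals(n.getType())) {
                    System.err.println("WARN heap usage threshold exceeded: " + n.getMessage());
                }
            }
        }, null, null);

        // Arm an 80% usage threshold on every heap pool that supports one.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getType() == MemoryType.HEAP && pool.isUsageThresholdSupported()) {
                long max = pool.getUsage().getMax();
                if (max > 0) {
                    pool.setUsageThreshold((long) (max * 0.8));
                }
            }
        }
    }
}

Calling HeapPressureWatchdog.install() once at startup is enough; the JVM emits the notification when a pool crosses the threshold. Running the red5 JVM with -XX:+HeapDumpOnOutOfMemoryError as well would leave a heap dump behind when the crash happens, which should show what is actually holding the memory.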

GoogleCodeExporter commented 9 years ago

Original comment by ffdixon@gmail.com on 13 Sep 2011 at 3:27

GoogleCodeExporter commented 9 years ago
This happened again during a desktop sharing session with 25 people.  Looking in

  /usr/share/red5/log/deskshare-slf.2011-09-09.log

we see the

  java.lang.OutOfMemoryError: GC overhead limit exceeded

message.  Here's an excerpt from the log:

2011-09-09 16:02:14,503 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting DecoderConfiguration
2011-09-09 16:02:14,504 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting keyFrame
2011-09-09 16:02:20,410 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting DecoderConfiguration
2011-09-09 16:02:20,410 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting keyFrame
2011-09-09 16:02:20,750 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting DecoderConfiguration
2011-09-09 16:02:20,750 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting keyFrame
2011-09-09 16:02:23,725 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting DecoderConfiguration
2011-09-09 16:02:23,725 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting keyFrame
2011-09-09 16:02:59,267 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting DecoderConfiguration
2011-09-09 16:02:59,267 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting keyFrame
2011-09-09 16:03:29,046 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting DecoderConfiguration
2011-09-09 16:03:29,046 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting keyFrame
2011-09-09 16:03:30,817 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting DecoderConfiguration
2011-09-09 16:03:30,817 [NioProcessor-1] DEBUG o.b.deskshare.server.ScreenVideo 
- getting keyFrame
2011-09-09 16:03:40,502 [NioProcessor-7] WARN  
o.b.d.s.s.BlockStreamEventMessageHandler - java.lang.OutOfMemoryError: GC 
overhead limit exceeded
 GC overhead limit exceeded
2011-09-09 16:03:42,950 [NioProcessor-6] WARN  
o.b.d.s.s.BlockStreamEventMessageHandler - java.lang.OutOfMemoryError: Java 
heap space
 Java heap space
2011-09-09 16:04:39,359 [NioProcessor-6] WARN  
o.b.d.s.s.BlockStreamEventMessageHandler - java.lang.OutOfMemoryError: GC 
overhead limit exceeded
 GC overhead limit exceeded
2011-09-09 16:05:12,282 [NioProcessor-6] DEBUG 
o.b.d.s.s.BlockStreamEventMessageHandler - IDLE 1
2011-09-09 16:05:12,527 [NioProcessor-6] WARN  
o.b.d.s.s.BlockStreamEventMessageHandler - java.lang.OutOfMemoryError: Java 
heap space
 Java heap space
2011-09-09 16:05:13,058 [NioProcessor-6] WARN  
o.b.d.s.s.BlockStreamEventMessageHandler - java.lang.OutOfMemoryError: GC 
overhead limit exceeded
 GC overhead limit exceeded
2011-09-09 16:14:01,168 [main] DEBUG ROOT - Starting up context deskshare
2011-09-09 16:22:38,030 [NioProcessor-9] DEBUG 
o.b.d.s.s.BlockStreamEventMessageHandler - Session Created
2011-09-09 16:22:38,030 [NioProcessor-9] DEBUG 
o.b.d.s.s.BlockStreamEventMessageHandler - Session Opened.
2011-09-09 16:22:38,030 [NioProcessor-9] DEBUG 
o.b.d.s.s.BlockStreamEventMessageHandler - Session Closed.
2011-09-09 16:22:38,030 [NioProcessor-9] WARN  
o.b.d.s.s.BlockStreamEventMessageHandler - Closing session for a NULL room
2011-09-09 16:22:40,240 [RMI TCP Connection(5)-76.74.239.202] DEBUG ROOT - 
Shutting down context deskshare
2011-09-09 16:24:23,289 [main] DEBUG ROOT - Starting up context deskshare
2011-09-09 16:31:22,691 [NioProcessor-7] DEBUG 
o.b.d.s.s.BlockStreamEventMessageHandler - Session Created
2011-09-09 16:31:22,691 [NioProcessor-7] DEBUG 
o.b.d.s.s.BlockStreamEventMessageHandler - Session Opened.
2011-09-09 16:31:22,868 [NioProcessor-8] DEBUG 
o.b.d.s.s.BlockStreamEventMessageHandler - Session Created

Original comment by ffdixon@gmail.com on 13 Sep 2011 at 3:33
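
The deskshare log above is consistent with screen-capture blocks being produced faster than stalled viewers consume them. I have not checked the deskshare server source, so this is only the generic defensive pattern: bound the per-viewer queue and drop (or coalesce down to the latest keyframe) when it fills, instead of buffering indefinitely on the heap. A sketch with made-up names:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

/**
 * Generic bounded per-viewer queue for encoded screen blocks (hypothetical
 * names, not taken from the BigBlueButton deskshare code). When a viewer
 * stops draining, new blocks are dropped instead of piling up on the heap.
 */
public class BoundedBlockQueue {

    private final BlockingQueue<byte[]> blocks;

    public BoundedBlockQueue(int maxQueuedBlocks) {
        this.blocks = new ArrayBlockingQueue<byte[]>(maxQueuedBlocks);
    }

    /** Producer side: returns false (and the caller drops the block) if the viewer is behind. */
    public boolean offerBlock(byte[] encodedBlock) {
        return blocks.offer(encodedBlock);
    }

    /** Consumer side: waits for the next block to send to the viewer. */
    public byte[] takeBlock() throws InterruptedException {
        return blocks.take();
    }
}

Dropping blocks costs a stale region on a slow viewer's screen until the next keyframe, which seems like a better failure mode than the whole red5 JVM running out of heap.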

GoogleCodeExporter commented 9 years ago

Original comment by ffdixon@gmail.com on 22 Nov 2011 at 2:16