Closed dingluo closed 9 months ago
Are all these connect messages from other clients that you have? Or is the one client disconnecting and reconnecting? Any errors or stack traces in the log?
This is a single client disconnecting and reconnecting. If I send a string/dict instead of bytes, the server raises an error when the broken message is received: "json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 10 (char 9)". Otherwise there are no errors from either the server or the client.
I cannot reproduce your error here. On my system all the packets arrive with a 40000 size payload. What OS is this? I'm doing it on a Mac.
I tested on both a PC with Windows 7 and another PC with Ubuntu and saw similar behavior. Both systems run Python 3.6.
I think this is probably a limitation of the network stack. You are sending 40000 x 20 = 800000 bytes all at about the same time. When the connection cannot cope with the traffic it just closes, and that's when you see a partial packet and a reconnection right after.
Sorry, I did not make myself clear when I said there's a single client disconnecting and reconnecting. There's no "automatic" reconnection; I was manually restarting the client.
Anyway, I think you could be correct. After adding a pause of 1 s between messages, all 20 messages are received, although some are still cropped. I'm not entirely convinced, because I used quite recent PC hardware and both server and client run on the same machine through localhost. Can you stress your Mac by increasing the size of the message? If the problem is with the network stack, you should be able to reach the limit on your Mac as well, right? And in that case, is there a way to find out what exactly the limit is, e.g. max size per message or max throughput per second?
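If the transport really is choking on bursts, one workaround is to split large payloads and pace the emits. This is only a sketch under that assumption; `chunk_message` and `emit_paced` are hypothetical helpers, not part of the socketio API:

```python
import asyncio

def chunk_message(data: bytes, chunk_size: int = 16000) -> list:
    # Split one large payload into smaller pieces.
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

async def emit_paced(emit, event, data, chunk_size=16000, pause=0.05):
    # emit is expected to be a coroutine such as sio.emit; the short sleep
    # between chunks gives the event loop and the transport a chance to drain.
    for part in chunk_message(data, chunk_size):
        await emit(event, part)
        await asyncio.sleep(pause)
```

The receiving side would of course need to reassemble the chunks, so this trades simplicity for reliability under load.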
I made the same test on a MacBook today and got the same problem (dropped messages after one cropped message). After several rounds of restarting the client (always sending 20 messages in a row), the server raises an error:
Traceback (most recent call last):
File "/Users/Ding/anaconda3/lib/python3.7/site-packages/engineio/asyncio_server.py", line 264, in handle_request
await socket.handle_post_request(environ)
File "/Users/Ding/anaconda3/lib/python3.7/site-packages/engineio/asyncio_socket.py", line 106, in handle_post_request
p = payload.Payload(encoded_payload=body)
File "/Users/Ding/anaconda3/lib/python3.7/site-packages/engineio/payload.py", line 15, in __init__
self.decode(encoded_payload)
File "/Users/Ding/anaconda3/lib/python3.7/site-packages/engineio/payload.py", line 61, in decode
raise ValueError('Too many packets in payload')
ValueError: Too many packets in payload
After that, the server fails multiple times without raising any error, until the next time it raises the same error.
This error is new; it was released yesterday. My guess is that your previous tests on Windows and Ubuntu used an older release that didn't have this check in place.
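For reference, the check behind this error caps how many packets the server will decode from a single request body. A minimal pure-Python sketch of such a guard (not engineio's actual code; the default value here is an assumption):

```python
MAX_DECODE_PACKETS = 16  # assumed default, mirroring engineio's limit

def check_packet_count(packets):
    # Refuse request bodies that bundle too many packets at once.
    if len(packets) > MAX_DECODE_PACKETS:
        raise ValueError('Too many packets in payload')
    return packets
```

If the cap itself is what you are hitting, raising `engineio.payload.Payload.max_decode_packets` before creating the server is a commonly suggested workaround, though it treats the symptom rather than the burst of traffic.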
What I've also noticed is that this problem happens only with an async server (with aiohttp). With a synchronous server (e.g. eventlet), messages are not dropped.
Same problem with aiohttp here; sometimes I get this error:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/engineio/asyncio_server.py", line 420, in _trigger_event
ret = await self.handlers[event](*args)
File "/usr/local/lib/python3.7/dist-packages/socketio/asyncio_server.py", line 534, in _handle_eio_message
pkt = packet.Packet(encoded_packet=data)
File "/usr/local/lib/python3.7/dist-packages/socketio/packet.py", line 43, in __init__
self.attachment_count = self.decode(encoded_packet)
File "/usr/local/lib/python3.7/dist-packages/socketio/packet.py", line 113, in decode
self.data = self.json.loads(ep)
File "/usr/lib/python3.7/json/__init__.py", line 348, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.7/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.7/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 700 (char 699)
I don't know if this has any connection, but the times between my sending and receiving are extremely wonky. Here is my server:
from aiohttp import web
import socketio

sio = socketio.AsyncServer(async_mode='aiohttp', async_handlers=True)
app = web.Application()
sio.attach(app)

@sio.event
def connect(sid, environ):
    print('connect ', sid)

@sio.on("msg")
async def my_message(sid, data):
    print(my_message, data)
    running = True
    while running:
        message = input("Message: ")
        if message == "quit":
            running = False
        await sio.emit("msg", {"data": message})

@sio.event
def disconnect(sid):
    print('disconnect ', sid)

if __name__ == '__main__':
    web.run_app(app, host="localhost", port=5000)
Client:
from aiohttp import web
import socketio

sio = socketio.AsyncServer(async_mode='aiohttp', async_handlers=True)
app = web.Application()
sio.attach(app)

@sio.event
def connect(sid, environ):
    print('connect ', sid)

@sio.on("msg")
async def my_message(sid, data):
    print(my_message, data)
    running = True
    while running:
        message = input("Message: ")
        if message == "quit":
            running = False
        await sio.emit("msg", {"data": message})

@sio.event
def disconnect(sid):
    print('disconnect ', sid)

if __name__ == '__main__':
    web.run_app(app, host="localhost", port=5000)
When I send "1" as the message on the server, the client receives nothing. Then I send "2", and the client receives "1". Then I send "3", and the client receives "2". Other times it follows different patterns. I think this problem may have something to do with emitting in loops. I also don't see this behavior when I add a delay between the emits.
@simplyrohan you can't use input() in an async program. That is a blocking function.
Ok, sorry, mistake on my end.
Closing this, since there hasn't been any activity in several years.
The message sent from the client to an async server randomly drops (quietly, if the message is bytes) when the message gets big.
The server prints the length of the message:
The client sends 20 messages consecutively with the same message (length of 40000):
When running the client multiple times, the server receives a varying number of messages, with the last received message cropped. If the message goes through JSON (e.g. sending a string instead of bytes), the server raises a JSON error saying the end of the string is missing (understandable, as the latter part is cropped). Otherwise it fails silently. No messages after the broken one are received.
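The JSON error is easy to reproduce in isolation; this sketch simply truncates a serialized message the way a cropped packet would be:

```python
import json

payload = json.dumps({"data": "x" * 40000})
cropped = payload[:-100]  # simulate the tail of the message being cropped

try:
    json.loads(cropped)
except json.JSONDecodeError as exc:
    # The string value is cut off before its closing quote, hence an
    # "Unterminated string starting at: ..." error, matching the traceback above.
    print(exc)
```

With bytes there is no decode step, which explains why the same cropping fails silently.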
Below is the output from the server:
The problem gradually gets worse as the message size increases, and it is gone when the message is small enough. For example, with a message of 1000 bytes, the server prints "message 1000" 20 times.
I experience this behaviour on both a PC with Windows 7 and a PC with Ubuntu. I also made a similar test with ZeroMQ, and this problem does not happen. I'm quite new to socketio; am I missing something?