davidstump / SwiftPhoenixClient

Connect your Phoenix and iOS applications through WebSockets!
MIT License

Prevent crash caused by non-exclusive access to the Channels array #255

Closed · ejensen closed this 4 months ago

ejensen commented 4 months ago

This PR fixes a threading issue where the Socket.channels array is mutated while it is being iterated. It wraps channels in a SynchronizedArray to resolve the issue.

Crash stack trace:

SIGTRAP

Crashed: com.apple.NSURLSession-delegate
0  Frame.io                   0x1779e18 specialized _ArrayProtocol.filter(_:) + 714 (Socket.swift:714)
1  Frame.io                   0x17765a0 Socket.onConnectionMessage(_:) + 713 (Socket.swift:713)
2  Frame.io                   0x1777d84 protocol witness for PhoenixTransportDelegate.onMessage(message:) in conformance Socket + 860 (Socket.swift:860)
3  Frame.io                   0x1768e34 closure #1 in URLSessionTransport.receive() + 283 (PhoenixTransport.swift:283)
4  Foundation                 0x3aadec closure #1 in NSURLSessionWebSocketTask.receive(completionHandler:) + 144
5  Foundation                 0x3aaf80 thunk for @escaping @callee_guaranteed @Sendable (@guaranteed NSURLSessionWebSocketMessage?, @guaranteed Error?) -> () + 84
6  libdispatch.dylib          0x213c _dispatch_call_block_and_release + 32
7  libdispatch.dylib          0x3dd4 _dispatch_client_callout + 20
8  libdispatch.dylib          0xb400 _dispatch_lane_serial_drain + 748
9  libdispatch.dylib          0xbf64 _dispatch_lane_invoke + 432
10 libdispatch.dylib          0x16cb4 _dispatch_root_queue_drain_deferred_wlh + 288
11 libdispatch.dylib          0x16528 _dispatch_workloop_worker_thread + 404
12 libsystem_pthread.dylib    0x1f20 _pthread_wqthread + 288
13 libsystem_pthread.dylib    0x1fc0 start_wqthread + 8
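
For context, a thread-safe array wrapper of this kind is commonly built on a concurrent DispatchQueue, with reads running concurrently and writes submitted with a barrier so no iteration can overlap a mutation. The sketch below is illustrative only; the SynchronizedArray actually added by this PR may differ in name, API, and implementation details.

```swift
import Foundation

/// Minimal sketch of a reader/writer array wrapper (hypothetical; not the
/// exact type shipped in this PR). Reads run concurrently on the queue,
/// writes use a barrier so they exclude all in-flight reads.
final class SynchronizedArray<Element> {
    private var storage: [Element] = []
    private let queue = DispatchQueue(label: "synchronized-array",
                                      attributes: .concurrent)

    /// Concurrent read: safe to call while other reads are in progress.
    func filter(_ isIncluded: (Element) -> Bool) -> [Element] {
        return queue.sync { storage.filter(isIncluded) }
    }

    /// Concurrent read.
    func forEach(_ body: (Element) -> Void) {
        queue.sync { storage.forEach(body) }
    }

    /// Barrier write: waits for outstanding reads, blocks new ones until done.
    func append(_ element: Element) {
        queue.async(flags: .barrier) { self.storage.append(element) }
    }

    /// Barrier write.
    func removeAll(where shouldRemove: @escaping (Element) -> Bool) {
        queue.async(flags: .barrier) { self.storage.removeAll(where: shouldRemove) }
    }
}
```

With this pattern, the filter call seen in the crashing frame can no longer observe the array mid-mutation, because any concurrent append/remove is serialized behind the barrier.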
ejensen commented 1 month ago

@dsrees Would it be possible to create a 5.3.3 tag with this fix included?

dsrees commented 1 month ago

Yes, sorry. I'll try to get to that soon.

dsrees commented 3 weeks ago

@ejensen Thank you for your patience. 5.3.3 should now be available.