one-d-wide closed this pull request 7 months ago
Thanks! This looks good to me - there is just one thing:
This breaks the existing example - it crashes with:
[2024-03-05T07:37:14Z DEBUG bleps::l2cap] L2CAP [03, 00, 04, 00, 0a, 08, 00]
[2024-03-05T07:37:14Z TRACE bleps::attribute_server] att: ReadReq { handle: 8 }
[2024-03-05T07:37:14Z DEBUG bleps::attribute_server] src_handle 1
[2024-03-05T07:37:14Z DEBUG bleps::attribute_server] data [b, 0, 0]
[2024-03-05T07:37:14Z TRACE bleps::attribute_server] encoded_l2cap [3, 0, 4, 0, b, 0, 0]
[2024-03-05T07:37:14Z TRACE bleps::attribute_server] writing [2, 1, 20, 7, 0, 3, 0, 4, 0, b, 0, 0]
[2024-03-05T07:37:14Z TRACE bleps::attribute_server] polled: Some(Event(NumberOfCompletedPackets { number_of_connection_handles: 1, connection_handles: 1, completed_packets: 1 }))
[2024-03-05T07:37:15Z WARN bleps::event] Ignoring unknown le-meta event 03 data = [00, 01, 00, 30, 00, 00, 00, c0, 03]
[2024-03-05T07:37:15Z TRACE bleps::attribute_server] polled: Some(Event(Unknown))
[2024-03-05T07:37:17Z DEBUG bleps::acl] raw handle 00000001 00100000 - boundary FirstAutoFlushable
[2024-03-05T07:37:17Z DEBUG bleps::acl] read len 9
[2024-03-05T07:37:17Z DEBUG bleps] Wanted = 5, actual = 9
[2024-03-05T07:37:17Z TRACE bleps::attribute_server] polled: Some(AsyncData(AclPacket { handle: 1, boundary_flag: FirstAutoFlushable, bc_flag: PointToPoint, data: [5, 0, 4, 0, 12, 8, 0, 1, 0] }))
[2024-03-05T07:37:17Z DEBUG bleps::l2cap] L2CAP [05, 00, 04, 00, 12, 08, 00, 01, 00]
[2024-03-05T07:37:17Z TRACE bleps::attribute_server] att: WriteReq { handle: 8, data: [1, 0] }
[2024-03-05T07:37:17Z DEBUG bleps::attribute_server] src_handle 1
[2024-03-05T07:37:17Z DEBUG bleps::attribute_server] data [13]
[2024-03-05T07:37:17Z TRACE bleps::attribute_server] encoded_l2cap [1, 0, 4, 0, 13]
[2024-03-05T07:37:17Z TRACE bleps::attribute_server] writing [2, 1, 20, 5, 0, 1, 0, 4, 0, 13]
[2024-03-05T07:37:17Z TRACE bleps::attribute_server] polled: Some(Event(NumberOfCompletedPackets { number_of_connection_handles: 1, connection_handles: 1, completed_packets: 1 }))
notify if enabled
thread 'main' panicked at C:\projects\review\one-d-wide\bleps\bleps\src\attribute_server.rs:86:1:
index out of bounds: the len is 8 but the index is 8
stack backtrace:
0: 0x7ff76d798e32 - <unknown>
1: 0x7ff76d7a848b - <unknown>
2: 0x7ff76d796b61 - <unknown>
3: 0x7ff76d798c5a - <unknown>
4: 0x7ff76d79ac8f - <unknown>
5: 0x7ff76d79a93b - <unknown>
6: 0x7ff76d79b1b4 - <unknown>
7: 0x7ff76d79b089 - <unknown>
8: 0x7ff76d7994c9 - <unknown>
9: 0x7ff76d79ad46 - <unknown>
10: 0x7ff76d7b51c7 - <unknown>
11: 0x7ff76d7b5394 - <unknown>
12: 0x7ff76d69851c - <unknown>
13: 0x7ff76d695ec8 - <unknown>
14: 0x7ff76d699796 - <unknown>
15: 0x7ff76d6997ac - <unknown>
16: 0x7ff76d792e32 - <unknown>
17: 0x7ff76d696a8c - <unknown>
18: 0x7ff76d7ade40 - <unknown>
19: 0x7ffe0e8f257d - BaseThreadInitThunk
20: 0x7ffe0f1caa58 - RtlUserThreadStart
I guess it happens in `get_characteristic_value` in `attribute_server.rs`. Most likely the value of `my_characteristic_handle` is now correct while it wasn't before. Looking at that function, it seems to assume that the handle matches the index, which isn't true. So this change uncovered an existing problem. Probably `self.attributes[(handle as usize) - 1]` would be good enough.
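To illustrate the suspected off-by-one, here is a minimal sketch (hypothetical types, not bleps's actual `Attribute` struct): ATT handles are 1-based while the attribute table is a 0-based slice, so indexing it directly with the handle reads one past the end for the last attribute (len is 8, handle is 8, hence the panic).

```rust
// Hypothetical minimal model of the suspected bug: ATT handles are
// 1-based, while the attribute table is indexed from 0.
struct Attribute {
    handle: u16,
}

// Buggy lookup: assumes handle == index, which fails for the last handle.
fn value_by_handle_buggy(attrs: &[Attribute], handle: u16) -> Option<&Attribute> {
    attrs.get(handle as usize)
}

// Fixed lookup: map the 1-based handle to a 0-based index, as suggested above.
fn value_by_handle_fixed(attrs: &[Attribute], handle: u16) -> Option<&Attribute> {
    attrs.get((handle as usize).checked_sub(1)?)
}

fn main() {
    let attrs: Vec<Attribute> = (1..=8).map(|h| Attribute { handle: h }).collect();
    // Handle 8 is the last valid attribute, but the buggy lookup misses it
    // (and a plain `attrs[handle as usize]` would panic, as in the log above).
    assert!(value_by_handle_buggy(&attrs, 8).is_none());
    assert_eq!(value_by_handle_fixed(&attrs, 8).unwrap().handle, 8);
    println!("fixed lookup finds handle 8");
}
```

Using `get` with `checked_sub` also avoids a fresh panic for the reserved handle 0, instead of just shifting the out-of-bounds index.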
Fixed it. The example runs normally now.
Added
Fixed: `bleps`'s attribute server assumed that only the first descriptor of a characteristic could be the "Client Characteristic Configuration" descriptor, which wasn't consistent with the `gatt` macro (`AttributeServer::get_characteristic_value`).
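The fix described above amounts to searching a characteristic's descriptors for the CCCD by its assigned UUID (0x2902) rather than assuming it is the first descriptor. A minimal sketch with hypothetical types (not bleps's actual descriptor representation):

```rust
// Client Characteristic Configuration Descriptor UUID (Bluetooth assigned number).
const CCCD_UUID: u16 = 0x2902;

// Hypothetical descriptor type for illustration only.
struct Descriptor {
    uuid: u16,
    handle: u16,
}

// Find the CCCD by UUID instead of taking the first descriptor blindly.
fn find_cccd(descriptors: &[Descriptor]) -> Option<&Descriptor> {
    descriptors.iter().find(|d| d.uuid == CCCD_UUID)
}

fn main() {
    // The CCCD is the second descriptor here, so "take the first one" would
    // return the User Description descriptor instead.
    let descs = vec![
        Descriptor { uuid: 0x2901, handle: 9 },  // Characteristic User Description
        Descriptor { uuid: 0x2902, handle: 10 }, // CCCD
    ];
    assert_eq!(find_cccd(&descs).unwrap().handle, 10);
    println!("found CCCD at handle 10");
}
```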