Closed sandlbn closed 3 years ago
The maximum NDN packet size is 8800 bytes, including all fields (name, signature, etc.).
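A minimal sketch of enforcing this limit on the application side (plain Python, not python-ndn API; the function name is hypothetical): checking the encoded packet size before sending gives a descriptive error instead of a `ConnectionResetError` from the forwarder.

```python
MAX_NDN_PACKET_SIZE = 8800  # includes name, signature, and all other fields

def check_packet_size(wire: bytes) -> None:
    """Raise a clear error for encoded packets that exceed the NDN size limit."""
    if len(wire) > MAX_NDN_PACKET_SIZE:
        raise ValueError(
            f'encoded packet is {len(wire)} bytes, which exceeds the '
            f'{MAX_NDN_PACKET_SIZE}-byte NDN packet size limit'
        )
```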
Thanks a lot for the fast reply. Is there an automatic way in the library to fragment packets, or does this logic have to be implemented in both the producer and the consumer?
Also, it would be good to provide better error handling in this case, instead of a raw ConnectionResetError.
> Is there an automatic way in the library to fragment packets, or does this logic have to be implemented in both the producer and the consumer?
I'll leave this question to @zjkmxy
One note on terminology: we usually call segmentation the process of dividing a large piece of application content into smaller chunks (this happens at the application layer or in a library). The term fragmentation instead usually indicates the splitting of network-layer packets (e.g. NDN Data packets) into several link-layer packets (e.g. NDNLP fragments).
> Also, it would be good to provide better error handling in this case, instead of a raw ConnectionResetError.
That's possible (ndn-cxx does it). One drawback of doing so is that we'd need to hardcode a specific size limit in the library, as opposed to not caring about it in the library and letting the forwarder reject the packet if it's too large. Again, I'll leave this to @zjkmxy.
> Thanks a lot for the fast reply. Is there an automatic way in the library to fragment packets, or does this logic have to be implemented in both the producer and the consumer?
No, python-ndn does not currently support fragmentation. For segmentation, you can refer to https://github.com/named-data/python-ndn/blob/9934c50671550188b04bb379825a5c4f2e4181c2/examples/putchunks.py#L40-L48
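To illustrate the approach taken by the linked putchunks example, here is a minimal sketch (plain Python, not python-ndn API; `SEGMENT_SIZE` is an assumed value): the producer cuts the content into fixed-size chunks, each of which becomes its own Data packet named with a segment number, and the index of the last segment is advertised as the FinalBlockId so consumers know when to stop fetching.

```python
SEGMENT_SIZE = 4400  # assumed chunk size, comfortably under the 8800-byte packet limit

def make_segments(content: bytes, seg_size: int = SEGMENT_SIZE):
    """Split application content into seg_size-byte chunks.

    Returns the list of chunks and the last segment index (the FinalBlockId).
    """
    seg_cnt = (len(content) + seg_size - 1) // seg_size  # ceiling division
    segments = [content[i * seg_size:(i + 1) * seg_size] for i in range(seg_cnt)]
    final_block_id = seg_cnt - 1
    return segments, final_block_id
```

In python-ndn each chunk would then be published as a separate Data packet with the segment number appended to the name, as the linked example shows.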
> Also, it would be good to provide better error handling in this case, instead of a raw ConnectionResetError.
Agreed. I never tried to send large packets, so I didn't know this error existed. I will wrap it in a more descriptive exception.
I tried to implement a producer and serve an image. However, when the image exceeds 9000 bytes, I get the error below.
Is there a limit on content size? Should I serve the content in smaller chunks?