Closed: bagrel closed this issue 2 years ago
IPv4 decoding is working (at least for me):
In [1]: from pycrate_ether.IP import *; from binascii import *
In [2]: p = IPv4()
In [3]: p.from_bytes(unhexlify('4f00007c000040004001fd307f0000017f00000186280000000101220001ae0000000000000000000000000000000000000000000000000000000001'))
In [4]: show(p)
### IPv4 ###
<vers : 4>
<hdr_wlen : 15>
<precedence : 0 (Routine)>
<delay : 0 (Normal)>
<throughput : 0 (Normal)>
<DSC : 0 (Normal)>
<ECN : 0b00>
<len : 124>
<id : 0>
<res_2 : 0b0>
<DF : 1 (do not fragment)>
<MF : 0 (last fragment)>
<frag_off : 0>
<TTL : 64>
<proto : 1 (ICMP)>
<hdr_cs : 0xfd30>
<src : 0x7f000001>
<dst : 0x7f000001>
### opt ###
### opts ###
### IPv4Option ###
<CCN : 134 (CommercialSecurity)>
<len : 40>
<val : 0x0000000101220001ae0000000000000000000000000000000000000000000000000000000001>
<pad : b''>
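For anyone following along, the type/length/value split pycrate reports for that option can be verified by hand with just the standard library. This is a sketch over the exact sample bytes above, not pycrate API:

```python
from binascii import unhexlify

# the exact packet bytes from the decode above (header only; 60 bytes)
pkt = unhexlify(
    '4f00007c000040004001fd307f0000017f000001'   # 20-byte fixed header
    '8628'                                        # option: type 0x86 = 134, length 0x28 = 40
    '0000000101220001ae'                          # start of the option value
    + '00' * 28 + '01'                            # remainder of the 38-byte value
)

ihl = (pkt[0] & 0x0f) * 4      # IHL is in 32-bit words: 15 * 4 = 60 bytes
opts = pkt[20:ihl]             # everything past the fixed 20 bytes is options

opt_type, opt_len = opts[0], opts[1]   # TLV: type, total length, then value
opt_val = opts[2:opt_len]

print(opt_type, opt_len, len(opt_val))   # 134 40 38
```

The numbers line up with the `CCN`, `len`, and `val` fields pycrate prints above.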
Please provide your full code extract, and all required information regarding your environment if you want me to help.
```
X = IPv4()
X.from_bytes(some_byte)
```

When running the pseudocode above, the `opt` field holding the IPv4 options isn't decoded.
Even when doing a reverse encoding like:

```
X = IPv4()
# fill up X
X['opt'].set_val(some_IPv4OptionByteBuf)
Y = IPv4()
Y.from_bytes(X.to_bytes())
```

`Y` will return the same IP header info but will lose the IPv4 option bytes.