Closed by onesuper 2 years ago
A backwards jump (where the destination offset is less than the current offset) would indicate that bytes that have already been decoded should be decoded again, and for a different purpose. As a security precaution and sanity check, I have not been allowing this.
Could you contact the creator of the data to determine if they really intend to do this and if so, why?
Specifically, they are reusing the same offset value (2848) for two distinct items (index 7 and index 8). I assume they are doing it intentionally (to save gas?), but it is best to be sure.
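To make the layout concrete: in ABI encoding, a tuple's dynamic elements are encoded as a head section of 32-byte offset words followed by the tails those offsets point to. Below is a minimal stdlib-only sketch (a toy encoding with made-up values, not headlong's API and not the real 2848-byte calldata) of two elements whose head words hold the same offset, so they share one tail:

```python
# Toy illustration of the head/tail layout discussed above.
# Two dynamic elements whose head words hold the SAME tail offset,
# which a strict "offsets must strictly increase" check would reject
# even though decoding succeeds. All values here are hypothetical.

def word(n: int) -> bytes:
    """Encode an int as a 32-byte big-endian ABI word."""
    return n.to_bytes(32, "big")

# Tuple of two dynamic `bytes` elements sharing one tail.
shared_tail = word(3) + b"abc".ljust(32, b"\x00")   # length 3, payload "abc"
head = word(64) + word(64)                          # both heads point to offset 64
encoded = head + shared_tail

def read_bytes_at(buf: bytes, offset: int) -> bytes:
    """Read a dynamic `bytes` value (length word + payload) at `offset`."""
    length = int.from_bytes(buf[offset:offset + 32], "big")
    return buf[offset + 32:offset + 32 + length]

offsets = [int.from_bytes(encoded[i:i + 32], "big") for i in (0, 32)]
values = [read_bytes_at(encoded, off) for off in offsets]
print(offsets)  # [64, 64] -- the second offset does not advance past the first
print(values)   # [b'abc', b'abc']
```

Decoding the second element revisits bytes already consumed by the first, which is exactly the "backwards jump" the sanity check flags.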
I'm glad you replied. I will contact the author to find out the intention.
Should we consider relaxing the restriction only when the decoded result is valid? I'm not sure that is a good choice from an API perspective, though; this case (reusing the same offset) may not be very common.
We have some customers who are hitting this issue, which raises the question: how can I patch the code to avoid the error? Perhaps I can craft a temporary workaround.
I added a commit which should allow any number of elements to share the same offset, but which will not allow backwards jumps in other circumstances: https://github.com/esaulpaugh/headlong/commit/e042a122499dec0a040f64cbd3780112a4e17e21
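My reading of that rule, sketched as a hedged stand-in (this is not headlong's actual Java code from the linked commit, just the described behavior in stdlib Python): an element may repeat the previous element's offset, while any other backwards jump is still rejected.

```python
# Hedged sketch of the relaxed check described above (my reading of
# the linked commit, not headlong's actual implementation):
# an element may reuse the previous element's offset (shared tail),
# but an offset smaller than the previous one is still a backwards
# jump and is rejected.

def check_offsets(offsets: list[int]) -> None:
    prev = -1
    for i, off in enumerate(offsets):
        if off < prev:  # strictly smaller than the previous offset: illegal
            raise ValueError(
                f"illegal backwards jump at index {i}: {off} < {prev}"
            )
        prev = off  # equal offsets pass, so any run of elements may share a tail

check_offsets([64, 128, 2848, 2848])    # shared offset: accepted
try:
    check_offsets([64, 128, 2848, 96])  # true backwards jump: rejected
except ValueError as e:
    print("rejected:", e)
```

Note the comparison is against the immediately preceding offset only, so any number of consecutive elements can legally point at the same tail.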
If you are able to test this change on the full dataset, please do.
In any case, I will most likely release a new version with this change included soon.
Cool! I will test it right now and report back with the results. The patch looks great to me.
Hi @esaulpaugh,
Thanks a lot for having done such a great job.
We use headlong as the core decoding facility in our open-source analytics library: https://github.com/datawaves-xyz/blockchain-spark. It is super fast, well maintained, and suits our use case.
Recently, we found some decoding errors in our server logs. Decoding failed on almost half of OpenSea's 'atomicMatch' function calls (address:
0x7f268357a8c2552623316e2562d90e642bb538e5
). I tried to debug the program but could not fix it by myself, so I am asking for your help. Here's the stack trace:
I retrieved the data from our server. When I decode it with Web3.py on my laptop, it works. The following code snippet reproduces the problem locally.
To get you there faster: the error happens when parsing the 8th element of the tuple:
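One quick way to spot which element is involved is to dump the tuple's head words and look for a duplicate or non-increasing offset. A hedged stdlib-only helper (the sample head section below is synthetic, not real atomicMatch calldata, and head words of static elements are values rather than offsets, so interpret the dump with the tuple's type signature in hand):

```python
# Hedged debugging helper: print each 32-byte head word of an
# ABI-encoded tuple as an integer so a repeated offset (like the
# one shared by index 7 and index 8) stands out at a glance.
# The sample bytes below are synthetic, for illustration only.

def dump_heads(encoded: bytes, n_elements: int) -> list[int]:
    """Return (and print) the first n_elements 32-byte head words."""
    heads = []
    for i in range(n_elements):
        w = int.from_bytes(encoded[32 * i:32 * (i + 1)], "big")
        heads.append(w)
        print(f"index {i}: head word = {w}")
    return heads

# Synthetic head section where index 2 and index 3 share an offset:
sample = b"".join(n.to_bytes(32, "big") for n in [128, 256, 2848, 2848])
heads = dump_heads(sample, 4)
```

Running this over the real calldata (with the correct element count) should show the repeated 2848 at the indices mentioned above.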
Thanks again!
Yichao