QuRyu closed this issue 5 years ago
I have not pushed the code above to GitHub because it still contains bugs, but if anyone needs it as a reference I am happy to share it.
Realized this is actually an issue with my conduit composition, not a problem with the binary library per se.
Thanks for looking into this!
I am streaming a pcap-format file with the `conduit` library, combined with binary's `Decoder` to parse the data. My parser `parseB6034` builds upon `parsePPacket` to read a specific kind of data; the `unsafePerformIO` and `IORef` parts are only there for debugging. I also have a `parseGHeader` function that parses the first chunk of the file; it runs perfectly fine, so I have not pasted it here.

The place where I use `Decoder` is in the following two functions.
`conduitParseMany` turns a `Get` monad into a `Decoder` and runs it until there is no more input from upstream. There is also a function `conduitParseOnce`, which does the same conversion but runs the `Decoder` only once and returns (rather than yields) the output. I then combine these two functions with the two parsers above to parse the file; the pipeline function returns the parsed data given a file path.
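For reference, here is a minimal sketch of the conversion I'd expect `conduitParseMany` to perform, with the conduit stream replaced by a plain list of chunks so it is self-contained. `parseMany` and its internals are my own names for illustration, not the original code:

```haskell
import Data.Binary.Get
import qualified Data.ByteString as BS

-- Run a Get repeatedly over a list of input chunks, feeding each
-- Partial continuation from the remaining chunks.  Leftover bytes
-- from one record are carried into the next parse.
parseMany :: Get a -> [BS.ByteString] -> Either String [a]
parseMany g = go (runGetIncremental g)
  where
    go (Fail _ _ err) _      = Left err
    -- A short chunk yields Partial; feed it the next chunk.
    go (Partial k)    (c:cs) = go (k (Just c)) cs
    -- No chunks left: signal end of input.  binary guarantees the
    -- decoder resolves to Done or Fail after seeing Nothing.
    go (Partial k)    []     = go (k Nothing) []
    go (Done rest _ x) cs
      | BS.null rest && null cs = Right [x]
      | otherwise = (x :) <$> go (runGetIncremental g) (rest : cs)

main :: IO ()
main =
  -- Three big-endian Word16s split so one read crosses a chunk boundary.
  print (parseMany getWord16be [BS.pack [0,1,0], BS.pack [2,0,3]])
  -- prints Right [1,2,3]
```

Feeding `Nothing` to a `Partial` continuation tells binary the input has ended; doing that while upstream still has chunks is the classic way to get a premature "not enough bytes".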
My parser functions worked fine with `readFile` from `Data.ByteString.Lazy`. But in the incremental processing code above, the parse fails with "not enough bytes" after exactly 115 packets, even though there are unconsumed bytes left. Specifically, the error is reported when the program runs the line `content <- getByteString 209`.

My speculation is that the `Decoder` tries to read a ByteString of length 209 at once, but the conduit supplies a chunk shorter than 209 bytes, so the `Decoder` just reports the error. Is this guess correct, though?
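For what it's worth, a chunk shorter than 209 bytes does not by itself make binary fail: the `Decoder` returns `Partial` and asks for more input, and it only reports "not enough bytes" once it is told the input is exhausted. A small self-contained check (the demo is mine, using only the `binary` API):

```haskell
import Data.Binary.Get
import qualified Data.ByteString as BS

main :: IO ()
main = do
  -- Ask for 209 bytes but push only a 100-byte chunk.
  let d = runGetIncremental (getByteString 209) `pushChunk` BS.replicate 100 0
  case d of
    Partial _ -> putStrLn "short chunk: decoder is Partial, not Fail"
    _         -> error "unexpected decoder state"
  -- The failure only happens once end of input is signalled.
  case pushEndOfInput d of
    Fail _ _ msg -> putStrLn ("after end of input: " ++ msg)
    _            -> error "unexpected decoder state"
```

So the error suggests the `Partial` continuation was handed end-of-input while upstream still had bytes, which matches the closing comment about the conduit composition.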