It sounds like you are dealing with a file format made of consecutive chunks. Let's then split up the data: parse each chunk's header as its own unit, and treat the chunk's payload as a separate, opaque span. I recommend this model of data processing because it separates framing from content: you can then skip ahead to future chunks without reading in all of the intervening payload bytes. Now, you mentioned […]. If there is a data model for […]
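A minimal sketch of this chunked model, with plain slice arithmetic standing in for the winnow parsers. The 4-byte little-endian length header is a hypothetical layout for illustration, not the actual format from the question:

```rust
// Walk a hypothetical chunked format: each chunk is a 4-byte little-endian
// payload length followed by `length` payload bytes. Only headers are read;
// payloads are skipped by advancing the offset, never by copying the bytes.
fn chunk_offsets(data: &[u8]) -> Vec<(usize, usize)> {
    let mut out = Vec::new();
    let mut pos = 0;
    while pos + 4 <= data.len() {
        let len = u32::from_le_bytes(data[pos..pos + 4].try_into().unwrap()) as usize;
        let start = pos + 4;
        if start + len > data.len() {
            break; // truncated chunk; stop rather than read past the end
        }
        out.push((start, len)); // record where the payload lives
        pos = start + len;      // skip ahead without reading the payload
    }
    out
}

fn main() {
    // Two chunks: a 3-byte payload, then a 2-byte payload.
    let data = [3u8, 0, 0, 0, b'a', b'b', b'c', 2, 0, 0, 0, b'x', b'y'];
    assert_eq!(chunk_offsets(&data), vec![(4, 3), (11, 2)]);
}
```

With the payload locations in hand, a later pass can parse any individual chunk on demand without having materialized the others.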
I'm developing a parser for a binary format using winnow that has a layout kind of like: […]

The `data` chunks in some scenarios can be extremely large -- like 1.5 GB. I'm developing this parser in a sans-io manner with a `LocatingSlice<Partial<&[u8]>>` stream so I can facilitate both wasm and desktop environments. On desktop these large chunks are mostly no biggie because I'm just going to `mmap` the file and call it a day.

In my parser I'd like to be able to essentially `skip()` over these bytes without reading them. I'm aware that I can `take(N).void().parse_next(input)`, but I'm weakly confident that this would thrash the `Partial<&[u8]>` buffer and cause the upper state machine to think it needs to fill a buffer with `N` bytes for reading, when in reality I just need to simulate a seek.

Is there a recommended way to handle this type of scenario?

It's worth noting: […]
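For contrast, outside of the parser the "simulated seek" I'm after is just an advance on the underlying source. A small stdlib-only sketch (the `skip_chunk` helper and the 4-byte little-endian length header are hypothetical) of seeking past a payload instead of reading it:

```rust
use std::io::{Cursor, Read, Seek, SeekFrom};

// Read a 4-byte little-endian length header, then seek past the payload
// instead of reading it, returning the new stream position.
fn skip_chunk<R: Read + Seek>(src: &mut R) -> std::io::Result<u64> {
    let mut header = [0u8; 4];
    src.read_exact(&mut header)?;
    let len = u32::from_le_bytes(header) as i64;
    // Advance the cursor without touching the payload bytes.
    src.seek(SeekFrom::Current(len))
}

fn main() -> std::io::Result<()> {
    // One chunk with a 3-byte payload, followed by two unrelated bytes.
    let data = vec![3u8, 0, 0, 0, b'a', b'b', b'c', 9, 9];
    let mut src = Cursor::new(data);
    let pos = skip_chunk(&mut src)?;
    assert_eq!(pos, 7); // positioned at the byte after the payload
    Ok(())
}
```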