```sh
npm i @planet-a/avsc-zstandard-codec
```

or

```sh
yarn add @planet-a/avsc-zstandard-codec
```

```ts
import { finished } from "node:stream/promises";

import Avro from "avsc";
import {
  createDecoderMixin,
  createEncoderMixin,
  codecName,
} from "@planet-a/avsc-zstandard-codec";

const mySchema = Avro.Type.forSchema({ type: "string" });
{
  // encode: write two values into an Avro container file,
  // compressing blocks with the zstandard codec
  const fileEncoder = Avro.createFileEncoder("./my.avro", mySchema, {
    codec: codecName,
    codecs: {
      // keep avsc's built-in codecs and mix in the zstandard encoder
      ...Avro.streams.BlockEncoder.defaultCodecs(),
      ...createEncoderMixin(),
    },
  });
  // stream.write() returns a boolean, so the calls cannot be chained
  fileEncoder.write("Hello");
  fileEncoder.write("World");
  fileEncoder.end();
  await finished(fileEncoder);
}
{
  // decode: stream the values back out of the container file
  const fileDecoder = Avro.createFileDecoder("./my.avro", {
    codecs: {
      // use the decoder defaults here, not the encoder's
      ...Avro.streams.BlockDecoder.defaultCodecs(),
      ...createDecoderMixin(),
    },
  }).on("data", console.log.bind(console));
  await finished(fileDecoder);
}
```

It uses the `@mongodb-js/zstd` package, which has a few advantages:
- The `decompress` function does not need the uncompressed buffer size in advance, a restriction most other WASM-based implementations have that renders them unusable for this task (see the sketch after this list).
- It works with `Buffer`. Whilst a `Uint8Array` implementation would be more portable (I am looking at you, Deno), `avsc` itself is using `Buffer`. mtth/avsc#452 has landed, so we might have more options of which packages to use once we drop support for the current `avsc` version.
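For illustration, here is a minimal sketch of what such a codec pair could look like when built directly on `@mongodb-js/zstd`. The mixin objects and their names below are illustrative, not this package's API; avsc expects codec functions with a `(buf, cb)` callback signature:

```ts
import { compress, decompress } from "@mongodb-js/zstd";

// avsc codec functions take the block buffer and a Node-style callback.
type AvscCodec = (
  buf: Buffer,
  cb: (err: Error | null, out?: Buffer) => void
) => void;

// Illustrative sketch only — use createEncoderMixin()/createDecoderMixin()
// from this package instead.
const zstdEncoderSketch: Record<string, AvscCodec> = {
  zstandard: (buf, cb) => {
    compress(buf).then((out) => cb(null, out), cb);
  },
};

const zstdDecoderSketch: Record<string, AvscCodec> = {
  zstandard: (buf, cb) => {
    // decompress() needs no uncompressed-size hint, which is what
    // makes @mongodb-js/zstd usable for Avro blocks.
    decompress(buf).then((out) => cb(null, out), cb);
  },
};
```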
You'll see that the current implementation uses the defaults from the Avro repository, namely:
- the codec name (if you don't adhere to `zstandard`, the file won't be readable at all)
- whether to use a checksum or not (with a checksum, the metadata will still be readable, but reading the data will yield an error: `Could not read file`)
The reason for that is that, in order to make the Avro export as portable as possible, none of these settings should need to be specified by consumers. A prime example is Snowflake's Avro support (`COPY INTO`): if you alter the codec name and/or the checksum flag, you won't be able to use the generated Avro files with their product.
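If you want to verify what a generated file actually declares, avsc can read the container header without decoding any data blocks. A quick sanity check, assuming the `./my.avro` file from the usage example above:

```ts
import Avro from "avsc";

// Reads only the container header: magic bytes, metadata, sync marker.
const header = Avro.extractFileHeader("./my.avro");

// Metadata values are Buffers; "avro.codec" holds the codec name.
console.log(header.meta["avro.codec"].toString()); // "zstandard"
```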