The final feature to wrap up vector search for now: `BFLOAT16` support.
It's very similar to `FLOAT16`, the difference being that `BFLOAT16` is easier to convert to/from `FLOAT32`, so operations are faster. The speed comes at the cost of precision, though: `BFLOAT16` has only 8 bits of mantissa precision, which is pretty low (`FLOAT16` has 11).
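To illustrate why the conversion is cheap: a `BFLOAT16` is essentially the top 16 bits of a `FLOAT32`, so narrowing is a shift (plus optional rounding) and widening is a shift back. A small Python sketch (this uses round-to-nearest-even when narrowing; the actual implementation in this PR may truncate instead):

```python
import struct

def f32_to_bf16(x: float) -> int:
    """Narrow a float32 to bfloat16 bits: keep the top 16 bits,
    rounding to nearest even."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    rounded = bits + 0x7FFF + ((bits >> 16) & 1)
    return (rounded >> 16) & 0xFFFF

def bf16_to_f32(h: int) -> float:
    """Widen bfloat16 back to float32: just shift the bits up by 16."""
    (x,) = struct.unpack("<f", struct.pack("<I", (h & 0xFFFF) << 16))
    return x

# Exponent width matches float32, so magnitudes survive the round trip;
# only the low mantissa bits are lost.
print(bf16_to_f32(f32_to_bf16(3.14159265)))  # -> 3.140625
```

Because the exponent field is the same width as in `FLOAT32`, no range checks or exponent rebiasing are needed, unlike `FLOAT16` conversion.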
I chose the `B16` suffix for the new vector type to keep it consistent with all the other types (`BFLOAT16` might read a bit more naturally in isolation, but I think it's pretty nice to have all types follow the same structure: `FLOAT{suffix}(N)`).
Changes
- Add `vectorb16` conversion function
- Support `FLOATB16` / `FB16_BLOB` column types
- Support `compress_neighbors=floatb16` vector index parameter
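Taken together, usage should mirror the existing `FLOAT{suffix}(N)` types; a rough sketch of what the new surface looks like (table and column names are made up for illustration):

```sql
-- Hypothetical example exercising the new type, function, and index parameter.
CREATE TABLE movies (
  title TEXT,
  embedding FLOATB16(4)  -- stored as an FB16_BLOB
);

INSERT INTO movies VALUES ('Inception', vectorb16('[0.1, 0.2, 0.3, 0.4]'));

-- Neighbor vectors in the index compressed to bfloat16,
-- trading precision for space and speed.
CREATE INDEX movies_idx
  ON movies (libsql_vector_idx(embedding, 'compress_neighbors=floatb16'));
```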
Simple brute-force benchmarks (more benchmarks to come later):