Currently we disallow compressing buffers over a certain size:

distributed/distributed/protocol/compression.py
Line 161 in 7bd6442

    if len(payload) > 2**31:  # Too large, compression libraries often fail
This traces back to issue #366 and PR #367. AIUI the check was added to work around a Blosc issue, and Blosc support has since been dropped (#6027). However, LZ4 has a similar problem, as was discovered in Numcodecs (zarr-developers/numcodecs#81).

As noted in #6273 (comment), this may be due to the use of int32 for buffer sizes in compression algorithms. It is not entirely clear whether that is a hard technical limit or a practical one (2 GB is a pretty big buffer).

It might be worth investigating whether compressors still have this limitation (a probe is sketched below) and, if so, deciding how we want to handle it. For example, if the limit still exists, we could break large buffers into smaller chunks and compress each chunk separately, as in the second sketch below.
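As a quick way to check whether the limit is still real, here is a minimal probe, assuming the python-lz4 package is installed. The 0x7E000000 constant mirrors LZ4_MAX_INPUT_SIZE from lz4.h; the exact exception raised for oversized inputs may vary between lz4 versions, so the probe catches broadly:

```python
import lz4.block

# LZ4's C API uses int for buffer sizes; lz4.h caps inputs at
# LZ4_MAX_INPUT_SIZE = 0x7E000000 (just under 2 GiB).
LZ4_MAX_INPUT_SIZE = 0x7E000000

# Note: this allocates roughly 2 GiB of zeros, so run it with memory headroom.
payload = bytes(LZ4_MAX_INPUT_SIZE + 1)

try:
    lz4.block.compress(payload)
    print("no size limit hit")
except Exception as exc:  # python-lz4 rejects oversized inputs
    print(f"still limited: {exc!r}")
```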
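And here is a minimal sketch of the chunking workaround, again assuming lz4 as the codec. The helper names compress_large and decompress_large are hypothetical, and the 1 GiB chunk size is an arbitrary value chosen to sit safely under the int32 limit:

```python
import lz4.block

CHUNK_SIZE = 2**30  # 1 GiB: comfortably below the ~2 GiB int32 limit

def compress_large(payload: bytes) -> list[bytes]:
    """Compress a large buffer as a list of independently compressed chunks."""
    view = memoryview(payload)  # slice without copying the underlying bytes
    return [
        lz4.block.compress(view[i : i + CHUNK_SIZE])
        for i in range(0, len(payload), CHUNK_SIZE)
    ]

def decompress_large(chunks: list[bytes]) -> bytes:
    """Reassemble the original buffer from its compressed chunks."""
    return b"".join(lz4.block.decompress(c) for c in chunks)

data = b"x" * (10 * 2**20)  # small here; the point is the >2 GiB case
assert decompress_large(compress_large(data)) == data
```

One nice property of lz4.block's default mode is that each chunk carries its own uncompressed-size header, so the chunks can be decompressed independently; the receiving side only needs the list of frames, not a separate size table.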