Vast volumes of scientific data cannot be stored and transferred efficiently because of limited I/O bandwidth, network bandwidth, and storage capacity. Error-bounded lossy compressors are a promising solution for reducing scientific data volumes while also meeting users' data-fidelity requirements. For example, SZ, ZFP, and MGARD allow users to set an absolute error bound when performing lossy compression, such that the difference between the original data and the reconstructed data is bounded by that threshold. Climate scientists have verified that the reconstructed data generated by error-bounded lossy compressors are acceptable for post hoc analysis.
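The absolute-error-bound guarantee described above can be sketched as a simple pointwise check. This is an illustrative snippet with toy arrays, not the API of SZ, ZFP, or MGARD; the function name is hypothetical:

```python
import numpy as np

def satisfies_error_bound(original, reconstructed, abs_eps):
    """Check the absolute-error-bound guarantee: every reconstructed
    value must lie within abs_eps of its original value."""
    return bool(np.all(np.abs(original - reconstructed) <= abs_eps))

# Toy values standing in for a scientific field and its decompressed copy.
original = np.array([1.00, 2.50, 3.75, 4.20])
reconstructed = np.array([1.01, 2.49, 3.74, 4.21])

print(satisfies_error_bound(original, reconstructed, abs_eps=0.02))   # True
print(satisfies_error_bound(original, reconstructed, abs_eps=0.005))  # False
```

With a tight bound of 0.005 the check fails because some values deviate by 0.01, which is exactly the kind of violation an error-bounded compressor must rule out.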
The SZ3 compression pipeline can be separated into four stages: prediction, quantization, Huffman coding, and lossless compression. We focus on the first two stages to improve performance and satisfy diversified user requirements. We designed several prediction algorithms, including Lorenzo, linear regression, and interpolation. Because the performance of a predictor depends strongly on the data patterns, we also designed methods that choose the most suitable predictor based on data sampling. We proposed range-based and region-based quantization, which vary the error bounds across different data ranges and regions to further improve the compression performance.
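The interplay of the first two stages can be sketched in one dimension: predict each value from the previous reconstructed value (a first-order Lorenzo predictor) and quantize the residual into bins of width 2*eps, which bounds the pointwise reconstruction error by eps. This is a minimal sketch under those assumptions, not the SZ3 implementation; the function name is illustrative:

```python
import numpy as np

def lorenzo_quantize_1d(data, eps):
    """Illustrative 1-D prediction + quantization (not the SZ3 API).
    Each value is predicted from the previous *reconstructed* value,
    and the residual is quantized with bin width 2*eps, so the
    pointwise error |data[i] - recon[i]| is bounded by eps."""
    quant = np.empty(len(data), dtype=np.int64)
    recon = np.empty(len(data))
    prev = 0.0  # predictor for the first element
    for i, v in enumerate(data):
        pred = prev
        q = int(round((v - pred) / (2 * eps)))  # quantization code
        quant[i] = q
        prev = pred + q * 2 * eps  # decompressor reproduces this exactly
        recon[i] = prev
    return quant, recon

data = np.array([0.0, 0.12, 0.31, 0.27])
quant, recon = lorenzo_quantize_1d(data, eps=0.05)
print(np.all(np.abs(data - recon) <= 0.05 + 1e-12))  # True
```

The key design point is that prediction uses reconstructed rather than original values, so the decompressor, which only has the quantization codes, makes identical predictions; the codes themselves cluster near zero for smooth data, which is what makes the subsequent Huffman coding effective.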