Data compression is essential in large-scale data centers to save both storage and network bandwidth. Current software-based methods suffer from high computational cost and limited performance. In this project, we migrate this fundamental workload of the computer system onto an FPGA accelerator, aiming for high throughput, high energy efficiency, and freeing up CPU resources.
The Xpress Compression Algorithm is a Microsoft compression format that combines dictionary-based LZ77 matching with Huffman encoding, similar to the popular GZIP format. Xpress9 is an advanced branch of the Xpress family of algorithms targeting a higher compression ratio, with more optimization in both stages.
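The two stages can be illustrated with a toy sketch: greedy LZ77 match finding over a sliding window, followed by Huffman code construction over the resulting literal symbols. The window size, match limits, and non-overlapping matches here are simplifications for clarity, not the actual Xpress9 parameters.

```python
# Toy two-stage LZ77 + Huffman pipeline (illustrative parameters,
# not the real Xpress9 configuration).
from collections import Counter
import heapq

def lz77_tokens(data, window=4096, min_match=3, max_match=255):
    """Greedy LZ77: emit ('match', offset, length) or ('lit', byte)."""
    i, tokens = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            l = 0
            while (l < max_match and i + l < len(data)
                   and data[j + l] == data[i + l]):
                l += 1
                if j + l >= i:   # simplification: no overlapping copies
                    break
            if l > best_len:
                best_len, best_off = l, i - j
        if best_len >= min_match:
            tokens.append(('match', best_off, best_len))
            i += best_len
        else:
            tokens.append(('lit', data[i]))
            i += 1
    return tokens

def lz77_decode(tokens):
    """Inverse of lz77_tokens, used to check the round trip."""
    out = bytearray()
    for t in tokens:
        if t[0] == 'lit':
            out.append(t[1])
        else:
            _, off, length = t
            start = len(out) - off
            out.extend(out[start:start + length])
    return bytes(out)

def huffman_lengths(freqs):
    """Huffman code lengths (bits per symbol) from a symbol->count map."""
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    heap = [(c, idx, (s,)) for idx, (s, c) in enumerate(freqs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in freqs}
    counter = len(heap)
    while len(heap) > 1:
        c1, _, s1 = heapq.heappop(heap)
        c2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:        # every merge deepens the merged symbols
            lengths[s] += 1
        heapq.heappush(heap, (c1 + c2, counter, s1 + s2))
        counter += 1
    return lengths

data = b"abracadabra abracadabra"
tokens = lz77_tokens(data)
assert lz77_decode(tokens) == data   # stage 1 round-trips
literal_freqs = Counter(t[1] for t in tokens if t[0] == 'lit')
code_lengths = huffman_lengths(literal_freqs)
```

The real algorithm also Huffman-codes match lengths and offsets; this sketch only covers the literal alphabet to keep the structure visible.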
The system architecture supports up to 128 multi-threaded compression contexts with a custom PCIe interface and queue management. We integrate up to seven compression engines on an Altera Stratix V (D5) FPGA, each of which accelerates the full feature set of the Xpress9 algorithm. A hardware scheduler maximizes the aggregate throughput of the engines.
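The scheduling idea can be sketched in software: many contexts share a small pool of engines, and a finished engine immediately picks up the next ready context. The engine count mirrors the number in the text; the FIFO dispatch policy is an assumption for illustration, not the actual hardware arbitration logic.

```python
# Minimal sketch of a multi-engine scheduler: contexts queue for a
# fixed pool of engines (7 here, matching the text); FIFO dispatch
# is an assumed policy, not the real hardware's.
from collections import deque

class Scheduler:
    def __init__(self, num_engines=7):
        self.free_engines = deque(range(num_engines))
        self.ready = deque()   # context ids with pending work
        self.running = {}      # engine id -> context id

    def submit(self, ctx_id):
        """A context requests compression work."""
        self.ready.append(ctx_id)
        self._dispatch()

    def _dispatch(self):
        """Pair free engines with ready contexts until one runs out."""
        while self.free_engines and self.ready:
            eng = self.free_engines.popleft()
            self.running[eng] = self.ready.popleft()

    def complete(self, eng):
        """An engine finishes; free it and dispatch the next context."""
        ctx = self.running.pop(eng)
        self.free_engines.append(eng)
        self._dispatch()
        return ctx

sched = Scheduler()
for ctx in range(10):          # 10 contexts contend for 7 engines
    sched.submit(ctx)
```

With 10 contexts submitted, 7 run immediately and 3 wait; each completion back-fills an engine, which is what keeps all engines busy under load.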
Results & Future Projection
The proposed hardware compressor achieved a 6% better compression ratio and 30x higher throughput than software GZIP compression at level 9 (best) optimization on a single Xeon core. We are also targeting other compression domains, such as low-compression/high-throughput and high-compression/low-throughput, to push the Pareto curve of compression ratio versus throughput with hardware acceleration.
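The software baseline's ratio/throughput trade-off can be observed directly with zlib (the DEFLATE library behind GZIP) by comparing its fastest and best levels; the sample input and the resulting numbers are purely illustrative and depend on the corpus and machine.

```python
# Illustrative measurement of the software GZIP-style baseline:
# zlib level 1 (fast) vs level 9 (best), showing the compression
# ratio vs throughput trade-off on sample data. Absolute numbers
# vary by input and hardware.
import time
import zlib

def measure(data, level):
    """Return (compression ratio, throughput in MB/s) for one level."""
    t0 = time.perf_counter()
    out = zlib.compress(data, level)
    dt = time.perf_counter() - t0
    return len(data) / len(out), len(data) / dt / 1e6

data = b"the quick brown fox jumps over the lazy dog " * 20000
for level in (1, 9):
    ratio, mbps = measure(data, level)
    print(f"level {level}: ratio {ratio:.2f}, {mbps:.1f} MB/s")
```

Level 9 searches harder for matches, trading throughput for ratio; the hardware compressor's claim is precisely that it escapes this trade-off on a single core.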
- Scott Hauck, University of Washington, as a visiting researcher
- Jinwook Oh, IBM TJ Watson Research Center, as an intern (Sep 2012)
- Janarbek Matai, UC San Diego, as an intern (June 2013)
- Janarbek Matai, Joo-Young Kim, and Ryan Kastner, "Energy Efficient Canonical Huffman Encoding," in the 25th IEEE International Conference on Application-specific Systems, Architectures and Processors, 18 June 2014.
- Joo-Young Kim, Scott Hauck, and Doug Burger, "A Scalable Multi-engine Xpress9 Compressor with Asynchronous Data Transfer," in the IEEE 22nd International Symposium on Field-Programmable Custom Computing Machines, 11 May 2014.