Modified Huffman Code for Bandwidth Optimization Through Lossless Compression

  • Alexander Hansen
  • Mark C. Lewis
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 738)


Abstract

In the interest of minimizing bandwidth usage, a modified Huffman code structure is proposed, with an accompanying algorithm, to achieve excellent lossless compression ratios while maintaining a fast compression and decompression process. This matters because internet bandwidth usage grows substantially each year, and existing compression models are either too slow or not efficient enough. We then implement this data structure and algorithm for English text compression and discuss its application to other data types. We conclude that if this algorithm were adopted by browsers and web servers, bandwidth usage could be reduced significantly, cutting costs and making the internet faster.
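The abstract does not specify the paper's modification to the Huffman structure, so as context only, the following is a minimal sketch of the standard Huffman coding baseline the paper builds on: character frequencies are counted, a prefix-free code is derived from a binary tree built greedily from the two lowest-frequency nodes, and decoding walks the inverse code table. All function names here are illustrative, not from the paper.

```python
import heapq
from collections import Counter

def build_codes(text):
    """Build a Huffman code table (symbol -> bit string) from character frequencies."""
    freq = Counter(text)
    # Heap entries are (frequency, unique tiebreaker, node); a node is either a
    # single-character symbol (leaf) or a (left, right) tuple (internal node).
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol input still needs one bit
        return {heap[0][2]: "0"}
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tiebreak, (left, right)))
        tiebreak += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(text, codes):
    """Concatenate the code word of each character into one bit string."""
    return "".join(codes[c] for c in text)

def decode(bits, codes):
    """Decode by matching prefix-free code words left to right."""
    inverse = {v: k for k, v in codes.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:  # prefix-freeness guarantees a unique match point
            out.append(inverse[buf])
            buf = ""
    return "".join(out)
```

Because the code is prefix-free, decoding needs no delimiters between code words; frequent characters receive shorter code words, which is the source of the compression gain on skewed distributions such as English text.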


Keywords

Compression · Bandwidth · Internet · Lossless · Space optimization

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

Trinity University, San Antonio, USA