Abstract
In the preceding chapter, we made sure that our users will enjoy our CLI, and we published our first release. Now we want to take a look at a more advanced topic: streams. Streams are a very powerful feature in Node.js for processing large amounts of data. With traditional buffering, we quickly run into memory problems because the entire data set often does not fit into the computer's memory. Streams enable us to process data in small slices. Node.js streams work like Unix streams on the terminal, where you pipe data from a producer into a consumer by using the pipe symbol (|). We will take a look at how piping works by exploring the cat command. Afterward, we will create our own streams and integrate them into our command-line client.
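To give a flavor of the idea before the chapter dives in, here is a minimal sketch (not the chapter's own example) of a Node.js stream pipeline: a file is read chunk by chunk, transformed, and written to stdout, much like cat file.txt piped through another program on the terminal. The file name data.txt and the upper-casing transform are placeholder assumptions for illustration.

const fs = require('fs');
const { Transform } = require('stream');

// A Transform stream processes data one small chunk at a time,
// so the whole file never has to be buffered in memory.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// Producer (file) | transform | consumer (stdout), analogous to a Unix pipe.
fs.createReadStream('data.txt')
  .pipe(upperCase)
  .pipe(process.stdout);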