Migrating Large Amounts of Data by Using Streams

Chapter in The CLI Book

Abstract

In the preceding chapter, we made sure that our users will enjoy our CLI, and we published our first release. Now we want to take a look at a more advanced topic: streams. Streams are a powerful feature in Node.js for processing large amounts of data. With traditional buffering, we quickly run into memory problems because the entire dataset often doesn't fit into the computer's memory. Streams enable us to process data in small slices. Node.js streams work like Unix streams on the terminal, where you pipe data from a producer into a consumer by using the pipe symbol (|). We will explore how piping works using the cat command. Afterward, we will create our own streams and integrate them into our command-line client.
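The chapter develops this in detail; as a rough illustration of the idea (not code from the book), here is a minimal Node.js sketch of the Unix-style pipe described above. A Transform stream processes the data one chunk at a time, so the whole file never has to be buffered in memory. The file name data.txt is a placeholder for this sketch.

    const fs = require('fs');
    const { Transform } = require('stream');

    // A Transform stream receives the data chunk by chunk, so we
    // process it in small slices instead of buffering everything.
    const upperCase = new Transform({
      transform(chunk, encoding, callback) {
        // Process one slice and pass it downstream.
        callback(null, chunk.toString().toUpperCase());
      }
    });

    // Roughly the stream equivalent of: cat data.txt | tr a-z A-Z
    fs.createReadStream('data.txt')
      .pipe(upperCase)
      .pipe(process.stdout);

Each .pipe() call connects a producer to a consumer, just like the | symbol on the terminal.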



Copyright information

© 2017 Robert Kowalski

About this chapter


Cite this chapter

Kowalski, R. (2017). Migrating Large Amounts of Data by Using Streams. In: The CLI Book. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-3177-7_4
