Understanding Streams in Node.js

Streams in Node.js have a reputation for being hard to work with, and even harder to understand.

Yet streams are arguably Node’s best, and most misunderstood, idea.

What are streams?

  • Streams are one of the fundamental concepts that power Node.js applications.
  • They are a data-handling method, used to read input and write output sequentially.
  • Streams are a way to handle reading/writing files, network communications, or any kind of end-to-end information exchange in an efficient way.
  • What makes streams unique is that instead of reading a file into memory all at once, as a program traditionally would, a stream reads chunks of data piece by piece, processing the content without keeping it all in memory.
  • This makes streams really powerful when working with large amounts of data. For example, a file can be larger than your free memory, making it impossible to read the whole file into memory in order to process it. That’s where streams come to the rescue!
  • Processing smaller chunks of data with streams makes it possible to work with files that are too large to fit in memory.

Example:

Let’s take streaming services such as YouTube or Netflix: these services don’t make you download the video and audio feed all at once. Instead, your browser receives the video as a continuous flow of chunks, allowing you to start watching and/or listening almost immediately.

However, streams are not only about working with media or big data. They also give us the power of ‘composability’ in our code. Designing with composability in mind means several smaller components can be combined to produce a larger result. In Node.js it’s possible to compose powerful pieces of code by piping data to and from other, smaller pieces of code, using streams.

Why streams?

Streams provide two major advantages over other data-handling methods:

1. Memory efficiency: you don’t need to load large amounts of data into memory before you are able to process it.

2. Time efficiency: it takes significantly less time to start processing data, since you can begin as soon as the first chunk arrives rather than waiting for the entire payload to be transmitted.

There are four types of streams in Node.js:

Writable: streams to which we can write data. For example, fs.createWriteStream() lets us write data to a file using streams.

Readable: streams from which data can be read. For example: fs.createReadStream() lets us read the contents of a file.

Duplex: streams that are both Readable and Writable. For example, net.Socket.

Transform: streams that can modify or transform the data as it is written and read. For example, in file compression you can write plain data into a gzip stream and read compressed data out of it.

Conclusion:

If you have already worked with Node.js, you may have come across streams. For example, in a Node.js based HTTP server, the request is a readable stream and the response is a writable stream. You might have used the fs module, which lets you work with both readable and writable file streams. Whenever you use Express, you are using streams to interact with the client. Streams are also used in every database connection driver you can work with, because TCP sockets, the TLS stack, and other connections are all based on Node.js streams.
