Read the updated version of this content and more about Node at /node-beyond-basics.

Node.js streams have a reputation for being hard to work with, and even harder to understand. Well, I've got good news for you: that's no longer the case.

Over the years, developers created lots of packages out there with the sole purpose of making working with streams easier. But in this article, I'm going to focus on the native Node.js stream API.

"Streams are Node's best and most misunderstood idea."

Streams are collections of data - just like arrays or strings. The difference is that streams might not be available all at once, and they don't have to fit in memory. This makes streams really powerful when working with large amounts of data, or data that's coming from an external source one chunk at a time.

However, streams are not only about working with big data. They also give us the power of composability in our code. Just like we can compose powerful Linux commands by piping other smaller Linux commands, we can do exactly the same in Node with streams.

Composability with Linux commands:

```
const grep = ... // A stream for the grep output
const wc = ...   // A stream for the wc input

grep.pipe(wc)
```

Many of the built-in modules in Node implement the streaming interface:

[Screenshot captured from my Pluralsight course - Advanced Node.js]

The list above has some examples of native Node.js objects that are also readable and writable streams. Some of these objects are both readable and writable streams, like TCP sockets, zlib streams, and crypto streams.

Notice that the objects are also closely related. While an HTTP response is a readable stream on the client, it's a writable stream on the server. This is because in the HTTP case, we basically read from one object (http.IncomingMessage) and write to the other (http.ServerResponse).

Also note how the stdio streams (stdin, stdout, stderr) have the inverse stream types when it comes to child processes.