Most things in Node.js are streams...
Streams are a fundamental concept in Node.js, allowing data to be read or written in chunks instead of loading everything into memory at once. This makes them especially useful for handling large files or continuous data.
Examples of streams in Node.js include:
- HTTP requests and responses: When a server handles an HTTP request, the incoming request is a readable stream, and the response it sends back is a writable stream.
- File operations: Reading from or writing to files with the fs module can be handled as streams.
- Network communications: Sockets use streams to send and receive data.
How Many Things in Node.js Are Streams?
Node.js provides a variety of streams, which are core building blocks for handling data flow. These streams are categorized into four main types: Readable, Writable, Duplex, and Transform.
1. Readable Streams
Streams from which data can be read.
- fs.createReadStream() for reading files.
- HTTP request (http.IncomingMessage).
- Process standard input (process.stdin).
- Network socket (net.Socket) in read mode.
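A readable stream emits 'data' events as chunks arrive and an 'end' event when the source is exhausted. Here is a minimal sketch; the file name largefile.csv is just a placeholder:

const fs = require('fs');

// 'largefile.csv' is a placeholder file name for illustration.
const readStream = fs.createReadStream('largefile.csv');

readStream.on('data', (chunk) => {
  // Each chunk is a Buffer (up to 64 KB by default for file streams).
  console.log(`Received ${chunk.length} bytes`);
});

readStream.on('end', () => {
  console.log('No more data.');
});

readStream.on('error', (err) => {
  console.error('Read failed:', err);
});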
2. Writable Streams
Streams to which data can be written.
- fs.createWriteStream() for writing to files.
- HTTP response (http.ServerResponse).
- Process standard output and error (process.stdout and process.stderr).
- Network socket (net.Socket) in write mode.
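As an illustration of a writable stream you did not create yourself, an HTTP response can be written to in chunks. A small sketch, where the port number and the text are arbitrary choices:

const http = require('http');

// req is a Readable stream; res is a Writable stream.
const server = http.createServer((req, res) => {
  res.write('Hello, ');
  res.write('world!\n');
  res.end(); // signal that nothing more will be written
});

// Port 3000 is an arbitrary choice for this sketch.
server.listen(3000);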
3. Duplex Streams
Streams that are both readable and writable.
- net.Socket (TCP socket connection).
- zlib compression streams (e.g., zlib.createGzip()).
- stream.Duplex for custom implementations.
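Because a net.Socket is both readable and writable, an echo server can simply pipe each socket back into itself. A minimal sketch; the port number is arbitrary:

const net = require('net');

// Each connection's socket is a Duplex stream: readable and writable.
const server = net.createServer((socket) => {
  // Echo everything the client sends straight back to the client.
  socket.pipe(socket);
});

// Port 4000 is an arbitrary choice for this sketch.
server.listen(4000);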
4. Transform Streams
Special duplex streams that can modify or transform data as it is written and read.
- zlib.createGzip() or zlib.createGunzip() for compression and decompression.
- crypto streams such as crypto.createCipheriv() or crypto.createDecipheriv().
- stream.Transform for custom transformations.
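Here is a quick sketch of a transform stream in action, compressing a file with zlib.createGzip(); the file names are placeholders:

const fs = require('fs');
const zlib = require('zlib');

// Gzip is a Transform stream: raw bytes go in, compressed bytes come out.
fs.createReadStream('largefile.csv')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('largefile.csv.gz'))
  .on('finish', () => console.log('Compression complete.'));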
Other Notable Stream Implementations
- File System (fs): Readable and writable streams for file operations.
- HTTP: Incoming requests (Readable streams) and server responses (Writable streams).
- Child Processes: child_process.spawn() and related methods provide streams for stdin, stdout, and stderr (see the sketch after this list).
- Streams in Libraries: Third-party libraries such as axios or request use streams for handling data.
- WebSocket Streams: Libraries such as ws or Socket.io use streams for real-time communication.
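As a small sketch of the child-process case, the streams exposed by child_process.spawn() can be consumed like any other streams; the ls command here assumes a Unix-like system:

const { spawn } = require('child_process');

// 'ls -lh' is just an example command (assumes a Unix-like system).
const child = spawn('ls', ['-lh']);

// stdout and stderr are Readable streams; stdin is a Writable stream.
child.stdout.on('data', (chunk) => {
  process.stdout.write(chunk);
});

child.stderr.on('data', (chunk) => {
  process.stderr.write(chunk);
});

child.on('close', (code) => {
  console.log(`Child exited with code ${code}`);
});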
While there is no single definitive number because streams can be custom-implemented, the core Node.js API has several dozen implementations of streams across various modules.
Why Use Streams for Large CSV Files?
Processing large CSV files can be memory-intensive if the entire file is read into memory at once. By using streams, you can process the file line by line or chunk by chunk, keeping memory usage low and improving performance.
Reading a Large CSV File
Here is an example of how to read a large CSV file using streams:
const fs = require('fs');
const readline = require('readline');

// The file is read in chunks; readline turns those chunks into lines.
const readStream = fs.createReadStream('largefile.csv');
const rl = readline.createInterface({ input: readStream });

rl.on('line', (line) => {
  console.log(`Line: ${line}`);
});

rl.on('close', () => {
  console.log('Finished reading the file.');
});
In this example, the fs.createReadStream method reads the file in chunks, and the readline module processes each line.
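If each line holds comma-separated fields, the line handler can also split them into values. The following is a naive sketch that assumes a simple CSV with no quoted fields containing commas; a dedicated CSV parsing library is safer for real-world data:

const fs = require('fs');
const readline = require('readline');

const rl = readline.createInterface({
  input: fs.createReadStream('largefile.csv')
});

rl.on('line', (line) => {
  // Naive split; does not handle quoted fields with embedded commas.
  const [name, age, location] = line.split(',');
  console.log(name, age, location);
});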
Writing a Large CSV File
Here is how you can write to a CSV file using streams:
const fs = require('fs');
const writeStream = fs.createWriteStream('output.csv');
writeStream.write('Name,Age,Location\n');
writeStream.write('John,30,New York\n');
writeStream.write('Jane,25,London\n');
writeStream.end(() => {
console.log('Finished writing to the file.');
});
The fs.createWriteStream method allows data to be written to the file in chunks.
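One detail worth knowing when writing a lot of data: write() returns false once the stream's internal buffer is full, and writing should pause until the 'drain' event fires. Here is a sketch of that pattern, with a made-up row count and contents purely for illustration:

const fs = require('fs');

const writeStream = fs.createWriteStream('output.csv');
const totalRows = 1000000; // arbitrary number of rows for illustration
let row = 0;

function writeRows() {
  let ok = true;
  while (row < totalRows && ok) {
    // write() returns false when the internal buffer is full.
    ok = writeStream.write(`row-${row},value\n`);
    row++;
  }
  if (row < totalRows) {
    // Resume writing once the buffer has drained.
    writeStream.once('drain', writeRows);
  } else {
    writeStream.end(() => console.log('Finished writing all rows.'));
  }
}

writeRows();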
Transforming Data with Streams
Sometimes, you may want to transform data while reading or writing. This can be done using transform streams:
const fs = require('fs');
const { Transform } = require('stream');
const readStream = fs.createReadStream('largefile.csv');
const writeStream = fs.createWriteStream('output.csv');
const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    // Convert each chunk to upper case before passing it along.
    const modifiedChunk = chunk.toString().toUpperCase();
    callback(null, modifiedChunk);
  }
});
readStream.pipe(transformStream).pipe(writeStream);
In this example, the transform stream converts all data to uppercase before writing it to the output file.
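One caveat: a plain .pipe() chain does not forward errors from one stream to the next, so a failure in one stream can leave the others open. stream.pipeline() (available since Node.js 10) handles error propagation and cleanup in one place. Here is a sketch of the same transformation using it:

const fs = require('fs');
const { Transform, pipeline } = require('stream');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// pipeline() destroys all streams on failure and reports the first error.
pipeline(
  fs.createReadStream('largefile.csv'),
  upperCase,
  fs.createWriteStream('output.csv'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);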
Benefits of Streams
- Efficient memory usage, since data is processed in chunks rather than loaded all at once
- Faster processing of large files, because work can start before the whole file is available
- Support for real-time data processing