Buffers, Streams, and the File System in Node.js
We'll walk through Buffers, the File System (fs), and Streams, not as isolated APIs, but as ideas that work together. I'll keep the language simple, examples practical, and explanations grounded in how real systems behave.
Why This Matters
Node.js is different from traditional backend runtimes because:
It is single-threaded
It is event-driven
It is built for I/O
Most Node apps spend their time:
Reading files
Writing files
Sending data over the network
Receiving data from the network
Buffers, fs, and Streams are Node's core tools for moving data efficiently.
Part 1: Buffers
What is a Buffer?
A Buffer is Node.js's way of handling binary data.
Think of it like: "A box of raw bytes."
This is important because:
Files are bytes
Network packets are bytes
Images, videos, PDFs all are bytes
JavaScript itself wasn't designed for this kind of low-level data. So Node introduced Buffer.
const buf = Buffer.from("hello");
console.log(buf);
// Output
<Buffer 68 65 6c 6c 6f>
What's happening here?
Buffer.from("hello") takes the string "hello" and converts it into a sequence of bytes
Each pair of characters (like 68, 65) represents one byte in hexadecimal format
68 in hex = 104 in decimal = the letter 'h' in ASCII/UTF-8
65 in hex = 101 in decimal = the letter 'e'
And so on for 6c (l), 6c (l), 6f (o)
// Convert the buffer back to a human-readable string
console.log(buf.toString());
//Output
"hello"
What's happening here?
buf.toString() decodes those raw bytes back into a string
By default, it assumes UTF-8 encoding
You could specify other encodings like buf.toString('base64') or buf.toString('hex')
Buffers Are Fixed-Size
const buf = Buffer.alloc(5);
buf.write("hello world");
console.log(buf.toString());
// output
"hello"
What's happening here?
Buffer.alloc(5) allocates exactly 5 bytes of memory, initialized to zeros
buf.write("hello world") attempts to write the full string
But the buffer only has room for 5 bytes, so it only writes "hello"
The rest (" world") is silently ignored
This is not an error; it's by design
Here's why:
Buffers have fixed memory
They do not auto-expand
This is intentional. It makes memory usage predictable and prevents runaway memory growth
Part 2: The File System (fs)
Now let's talk about fs. At a high level the fs module lets Node talk to your hard disk. But the way you talk to the disk matters a lot.
Sync vs Async
// sync file operations
const fs = require("fs");
const data = fs.readFileSync("file.txt", "utf8");
console.log(data);
What's happening here?
require("fs") loads Node's built-in file system module
fs.readFileSync("file.txt", "utf8") tells the operating system to read "file.txt"
The Sync suffix means synchronous: Node waits right here
While waiting, nothing else runs; the event loop is frozen
Only when the OS finishes reading does execution continue
"utf8" tells Node to decode the raw bytes into a UTF-8 string
This blocks the event loop.
// async file operation
const fs = require("fs");
fs.readFile("file.txt", "utf8", (err, data) => {
if (err) throw err;
console.log(data);
});
console.log("This prints first!");
What's happening here?
fs.readFile() is the async version, no Sync suffix
Node asks the OS for the file, then moves on immediately
The third argument is a callback function that runs later
When the OS finishes reading, Node puts the callback in the event loop queue
The callback receives two arguments:
err: an error object if something went wrong, otherwise null
data: the file contents as a string (because we specified "utf8")
Under the Hood
Node does not read the file itself. It:
Delegates to the OS
Registers a callback
Keeps the event loop free
This is why Node scales so well with I/O-heavy workloads.
But There's a Hidden Problem…
Both readFileSync and readFile do something dangerous:
They load the entire file into memory.
That's fine for:
Small files
JSON configs
Logs
But terrible for:
Large files
Videos
Data pipelines
Part 3: Streams
What is a Stream?
A stream is data that:
Arrives in chunks
Over time
Piece by piece
Think: Water flowing through a pipe, not a bucket dumped on your head.
Why Streams Exist
Imagine reading a 5GB file.
Option 1: Load everything into memory ❌
Option 2: Read 64KB at a time, process it, move on ✅
Streams are Option 2.
Types of Streams in Node.js
There are four types (the fourth, Duplex, can both read and write), but we'll focus on three:
Readable - produces data
Writable - consumes data
Transform - modifies data in between
Readable Streams
const fs = require("fs");
const stream = fs.createReadStream("bigfile.txt", {
encoding: "utf8",
});
stream.on("data", chunk => {
console.log("Received chunk:", chunk.length);
});
stream.on("end", () => {
console.log("Done reading file");
});
What's happening here?
fs.createReadStream("bigfile.txt", {...}) creates a stream object
Instead of reading the whole file, it opens it and prepares to read chunks
The encoding: "utf8" option means chunks will be strings, not Buffers
stream.on("data", ...) listens for the data event
Every time the OS delivers a chunk (usually ~64KB), the callback runs
chunk is the data, either a string (if encoding is set) or a Buffer
stream.on("end", ...) listens for the end event, which fires when done
Writable Streams
Writable streams consume data.
const fs = require("fs");
const writeStream = fs.createWriteStream("output.txt");
writeStream.write("Hello\n");
writeStream.write("World\n");
writeStream.end();
What's happening here?
fs.createWriteStream("output.txt") creates a stream that writes to "output.txt"
writeStream.write("Hello\n") puts data into the stream's internal buffer
Node will flush this to disk asynchronously
You can call write() multiple times
writeStream.end() tells Node "no more data coming"; it will finish writing and close the file
Mastering Buffers, the File System, and Streams in Node.js is essential for efficient data handling. Buffers manage the raw binary data that files and network packets are made of; fs lets you read and write that data synchronously or asynchronously; and Streams let you move data of any size through your program one chunk at a time, keeping memory usage flat no matter how large the file.