gotcha · javascript · Major
Node.js streams: pipe without error handling loses errors
Node.js 15+ for stream/promises
stream pipe, pipeline, error handling, backpressure, stream destroy, stream promises
nodejs
Problem
source.pipe(dest) forwards data but not errors. If the source stream errors, the destination is never closed, leaking file descriptors and memory; without an 'error' listener on the source, the error becomes an uncaught exception, and mid-chain errors are easily lost.
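To see the failure mode, here is a minimal sketch using in-memory streams (the error message is illustrative):

```javascript
import { Readable, Writable } from 'stream';

// A source that fails immediately
const source = new Readable({
  read() {
    this.destroy(new Error('disk read failed'));
  },
});
const dest = new Writable({
  write(chunk, enc, cb) { cb(); },
});

source.pipe(dest);

// .pipe() unpipes on error but leaves dest open — cleanup is on us
source.on('error', (err) => {
  console.log('source error:', err.message);
  console.log('dest still open:', !dest.destroyed);
});
```

Without that 'error' listener, the destroy above would crash the process with an uncaught exception.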
Solution
Use pipeline() from stream/promises (Node 15+) or handle errors manually:
import { pipeline } from 'stream/promises';
import { createReadStream, createWriteStream } from 'fs';
import { createGzip } from 'zlib';

// Modern: pipeline handles errors and cleanup
await pipeline(
  createReadStream('input.txt'),
  createGzip(),
  createWriteStream('output.gz')
);

// Manual error handling for .pipe()
const source = createReadStream('input.txt');
const dest = createWriteStream('output.txt');
source.on('error', (err) => dest.destroy(err));
dest.on('error', () => source.destroy());
source.pipe(dest);
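On Node versions before 15 (or where callback style fits better), the callback form of pipeline() from 'stream' gives the same error propagation and cleanup; a sketch with in-memory streams:

```javascript
import { pipeline, Readable, Writable } from 'stream';

const chunks = [];
const sink = new Writable({
  write(chunk, enc, cb) { chunks.push(chunk.toString()); cb(); },
});

// Callback form, available since Node 10
pipeline(Readable.from(['hello ', 'world']), sink, (err) => {
  if (err) {
    console.error('Pipeline failed:', err);
  } else {
    console.log('Received:', chunks.join(''));
  }
});
```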
Why
pipe() only forwards data, not errors; it predates Node's standardized error-handling patterns. pipeline() fixes this: an error anywhere in the chain surfaces in one place, and every stream is destroyed on failure. The callback form has been in 'stream' since Node 10; the promise form arrived in 'stream/promises' in Node 15.
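A sketch of that propagation, with an error injected mid-chain (stream names and the error message are illustrative):

```javascript
import { pipeline } from 'stream/promises';
import { Readable, Transform, Writable } from 'stream';

const source = Readable.from(['a', 'b', 'c']);
const boom = new Transform({
  transform(chunk, enc, cb) {
    cb(new Error('transform failed')); // error in the middle of the chain
  },
});
const sink = new Writable({ write(chunk, enc, cb) { cb(); } });

try {
  await pipeline(source, boom, sink);
} catch (err) {
  // The mid-chain error surfaces here, and every stream is destroyed
  console.log(err.message);       // 'transform failed'
  console.log(source.destroyed);  // true
  console.log(sink.destroyed);    // true
}
```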
Gotchas
- pipeline() destroys all streams on error — pipe() doesn't
- Backpressure is handled automatically by pipe/pipeline; hand-rolled 'data'/write() loops must check write()'s return value and wait for 'drain'
- stream.finished() callback tells you when a stream is done or errored
- The { highWaterMark } option controls internal buffer size; tune it when memory use or throughput matters
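The stream.finished() helper mentioned above also has a promise form in 'stream/promises'; a small sketch:

```javascript
import { finished } from 'stream/promises';
import { Readable } from 'stream';

const rs = Readable.from(['a', 'b', 'c']);
rs.resume(); // consume the data so the stream can end

// Resolves when the stream ends or is closed, rejects if it errors
await finished(rs);
console.log('stream done, ended:', rs.readableEnded);
```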
Code Snippets
Safe stream pipeline
import { pipeline } from 'stream/promises';

try {
  await pipeline(source, transform, destination);
  console.log('Done');
} catch (err) {
  console.error('Pipeline failed:', err);
  // All streams are automatically cleaned up
}
Context
When piping Node.js streams for file processing, compression, or HTTP
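For the HTTP case, a sketch of serving a gzipped file with pipeline() (the file name and port are placeholders):

```javascript
import { createServer } from 'http';
import { createReadStream } from 'fs';
import { createGzip } from 'zlib';
import { pipeline } from 'stream/promises';

const server = createServer(async (req, res) => {
  res.setHeader('Content-Encoding', 'gzip');
  try {
    await pipeline(createReadStream('large-file.txt'), createGzip(), res);
  } catch (err) {
    // pipeline has already destroyed the streams; client disconnects
    // surface here as ERR_STREAM_PREMATURE_CLOSE
    console.error('Response failed:', err.message);
  }
});

server.listen(3000);
```

If the file is missing or the client hangs up mid-transfer, the catch runs and nothing leaks.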