ERR_CHILD_PROCESS_STDIO_MAXBUFFER
Node.js · Error · Critical · Child Process · High confidence

The stdio buffer of a child process exceeded its maximum size.

Production Risk

High. This can cause the application to crash unexpectedly if a child process produces more output than anticipated. It's a sign of using the wrong tool (`exec`) for the job.

What this means

This error occurs when `child_process.exec()` or `child_process.execSync()` is used and the child process writes more to stdout or stderr than the configured `maxBuffer` can hold. These functions buffer the entire output in memory, so Node.js imposes a limit to prevent unbounded memory growth; it defaults to 1 MiB (1024 * 1024 bytes).

Why it happens
  1. A child process generates a very large amount of output.
  2. The `maxBuffer` option is set too low for a process that produces moderate output.
  3. A process that is not intended to be buffered (like a long-running server) is started with `exec`.
How to reproduce

This error is thrown when the internal buffer for a child process's stdout or stderr stream fills up and exceeds the `maxBuffer` limit.

trigger — this will error
const { exec } = require('child_process');

// `cat /dev/zero` produces an endless stream of bytes,
// so it is guaranteed to exceed the default 1 MiB buffer.
exec('cat /dev/zero', (err, stdout, stderr) => {
  if (err) {
    console.error(err.code);
  }
});

expected output

ERR_CHILD_PROCESS_STDIO_MAXBUFFER

Fix 1

Increase the maxBuffer Size

WHEN The process output is legitimately large but manageable.

Increase the maxBuffer Size
const { exec } = require('child_process');

// Set the maxBuffer to 10MB.
exec('some-command-with-large-output', { maxBuffer: 1024 * 1024 * 10 }, (err, stdout, stderr) => {
  // ...
});

Why this works

If you expect more output than the default, you can provide a larger `maxBuffer` value in the options object. Be careful not to set it so large that you risk running out of system memory.

Fix 2

Use spawn() Instead

WHEN The process output is very large, is a continuous stream, or does not need to be buffered.

Use spawn() Instead
const { spawn } = require('child_process');

const child = spawn('find', ['/']);

child.stdout.on('data', (data) => {
  console.log('Received chunk: ' + data);
});

child.on('close', (code) => {
  console.log('child process exited with code ' + code);
});

Why this works

`spawn()` does not buffer output. Instead, it provides streams for stdout and stderr that you can process in chunks. This is the most robust method for handling child processes with significant output.

Code examples
Trigger (js)
const { exec } = require('child_process');

// `cat /dev/zero` produces an endless stream of bytes,
// so it is guaranteed to exceed the default 1 MiB buffer.
exec('cat /dev/zero', (err, stdout, stderr) => {
  if (err) {
    console.error(err.code); // this triggers ERR_CHILD_PROCESS_STDIO_MAXBUFFER
  }
});
Handle in try/catch (js)
const { exec } = require('child_process');
const { promisify } = require('util');
const execAsync = promisify(exec);

async function run() {
  try {
    // promisify turns the callback error into a catchable rejection;
    // a plain try/catch around callback-style exec() would not catch it
    await execAsync('cat /dev/zero');
  } catch (err) {
    if (err.code === 'ERR_CHILD_PROCESS_STDIO_MAXBUFFER') {
      console.error('ERR_CHILD_PROCESS_STDIO_MAXBUFFER:', err.message);
    } else {
      throw err;
    }
  }
}
Defensive pattern to avoid it (js)
const { spawn } = require('child_process')
const child = spawn('find', ['/'])
child.stdout.on('data', chunk => console.log(chunk.toString()))
What not to do

Don't reach for `exec()` when the output does not need to be buffered: long-running processes and commands with unbounded output belong with `spawn()`. And don't raise `maxBuffer` to an enormous value as a blanket fix; that just trades this error for memory pressure.

Sources
Official documentation:

https://github.com/nodejs/node/blob/main/lib/child_process.js
