ERR_WORKER_OUT_OF_MEMORY
Node.js · CRITICAL · Common · Worker · HIGH confidence

Worker thread ran out of memory

Production Risk

Worker crashes; if not supervised, the task is lost — implement restart/retry logic.

What this means

Thrown when a worker thread exhausts its allocated memory. Worker threads have independent V8 heaps and can be given individual memory limits via resourceLimits. When the worker exceeds its limit (or the system has no more memory), this error is emitted.

Why it happens
  1. Worker holds large data structures in memory without releasing them
  2. Memory leak in the worker's event processing loop
  3. resourceLimits.maxOldGenerationSizeMb set too low for the workload
How to reproduce

Triggered when the worker's V8 heap allocation fails due to OOM.

trigger — this will error
const { Worker } = require('worker_threads');
const w = new Worker('./heavy-worker.js', {
  resourceLimits: { maxOldGenerationSizeMb: 16 }, // very low
});
w.on('error', (err) => {
  console.error(err.code); // ERR_WORKER_OUT_OF_MEMORY
});

expected output

Error [ERR_WORKER_OUT_OF_MEMORY]: Worker terminated due to reaching memory limit

Fix 1

Increase worker memory limits

WHEN legitimate workloads need more memory

const w = new Worker('./worker.js', {
  resourceLimits: {
    maxOldGenerationSizeMb: 512,
    maxYoungGenerationSizeMb: 128,
  },
});

Why this works

Higher limits give the worker heap room for larger workloads.

Fix 2

Stream large data instead of holding it all in memory

WHEN processing large datasets in a worker

// In the worker: process data in chunks
const { parentPort } = require('worker_threads');
parentPort.on('message', (chunk) => {
  const result = processChunk(chunk);
  parentPort.postMessage(result);
  // chunk and result are GC-eligible after this
});

Why this works

Processing data in chunks keeps peak memory usage low.

Code examples
Trigger (js)
const { Worker } = require('worker_threads');
const w = new Worker('./heavy-worker.js', {
  resourceLimits: { maxOldGenerationSizeMb: 16 }, // very low
});
w.on('error', (err) => {
  console.error(err.code); // this triggers ERR_WORKER_OUT_OF_MEMORY
});
Handle the error (js)
// ERR_WORKER_OUT_OF_MEMORY is delivered via the worker's 'error' event,
// not thrown synchronously, so listen there rather than using a bare
// try/catch around the Worker constructor:
w.on('error', (err) => {
  if (err.code === 'ERR_WORKER_OUT_OF_MEMORY') {
    console.error('ERR_WORKER_OUT_OF_MEMORY:', err.message);
  } else {
    throw err;
  }
});
Defensive pattern to avoid it (js)
// Bound the size of each payload handed to the worker, so a single
// oversized message cannot blow out the worker heap.
function safePostToWorker(worker, payload, maxItems = 10_000) {
  if (Array.isArray(payload) && payload.length > maxItems) {
    throw new RangeError(`payload too large: ${payload.length} items`);
  }
  worker.postMessage(payload);
}
What not to do

Hold large buffers or arrays in worker scope indefinitely

The worker heap is bounded; long-lived large objects fill it rapidly.

Sources

Official documentation: Node.js Error Codes Documentation

