ERR_FS_FILE_TOO_LARGE
Node.js · File System · Severity: Critical

A file is too large to be processed.

Production Risk

High. This error can crash the application when it encounters an unexpectedly large file. Using synchronous I/O for large files also blocks the event loop, hurting performance.

What this means

This error occurs when a file system operation attempts to process a file that exceeds a size limit. This is most commonly seen with `fs.readFileSync()` or `fs.readFile()`, which try to load the entire file into a single buffer in memory. Node.js has a built-in buffer size limit (around 2GB on 64-bit systems) to prevent applications from crashing due to memory exhaustion.

Why it happens
  1. Attempting to read a multi-gigabyte file into memory with `fs.readFile()`.
  2. A file that is expected to be small grows unexpectedly large.
  3. Running on a system with insufficient memory to hold the file's contents.
How to reproduce

This error is thrown by the `fs` module when it tries to allocate a buffer to hold a file's contents, but the file's size exceeds the maximum possible buffer size.

trigger — this will error
const fs = require('fs');
// Create a dummy large file (e.g., 3GB).
// This will throw because the file is too large to fit in a buffer.
try {
  const data = fs.readFileSync('my-large-file.bin');
} catch (err) {
  console.error(err.code);
}

expected output

ERR_FS_FILE_TOO_LARGE

Fix

Use Streams for Large Files

When: reading or writing files that are large or of unknown size.
const fs = require('fs');

const readStream = fs.createReadStream('my-large-file.bin');

readStream.on('data', (chunk) => {
  console.log('Received ' + chunk.length + ' bytes of data.');
  // Process the chunk here.
});

readStream.on('end', () => {
  console.log('Finished reading the file.');
});

readStream.on('error', (err) => {
  console.error('Read failed:', err);
});

Why this works

Instead of loading the entire file into memory, use `fs.createReadStream()` to read it in smaller, manageable chunks. This allows you to process files of any size with a small, constant amount of memory.

Code examples
Trigger (js)
const fs = require('fs');
// Create a dummy large file (e.g., 3GB).
// This will throw because the file is too large to fit in a buffer.
try {
  const data = fs.readFileSync('my-large-file.bin');
} catch (err) {  // this triggers ERR_FS_FILE_TOO_LARGE
  console.error(err.code);
}
Handle in try/catch (js)
try {
  // operation that may throw ERR_FS_FILE_TOO_LARGE
  riskyOperation()
} catch (err) {
  if (err.code === 'ERR_FS_FILE_TOO_LARGE') {
    console.error('ERR_FS_FILE_TOO_LARGE:', err.message)
  } else {
    throw err
  }
}
Defensive pattern to avoid it (js)
const fs = require('fs')
const stream = fs.createReadStream('./large-file')
stream.on('data', chunk => handleChunk(chunk)) // handleChunk: your per-chunk handler
Sources
Official documentation

https://github.com/nodejs/node/blob/main/src/node_file.cc
