A file is too large to be processed.
Production Risk
High. This error can crash the application when it encounters an unexpectedly large file. Using synchronous I/O for large files also blocks the event loop, hurting performance.
This error occurs when a file system operation attempts to process a file that exceeds a size limit. It is most commonly seen with `fs.readFileSync()` or `fs.readFile()`, which try to load the entire file into a single buffer in memory. Node.js caps the maximum buffer size (historically around 2 GiB on 64-bit systems; the exact limit varies by Node.js version and platform) to prevent applications from crashing due to memory exhaustion.
- Attempting to read a multi-gigabyte file into memory with `fs.readFile()`.
- A file that is expected to be small grows unexpectedly large.
- Running on a system with insufficient memory to hold the file's contents.
This error is thrown by the `fs` module when it tries to allocate a buffer to hold a file's contents, but the file's size exceeds the maximum possible buffer size.
const fs = require('fs');
// Create a dummy large file (e.g., 3GB).
// This will throw because the file is too large to fit in a buffer.
try {
  const data = fs.readFileSync('my-large-file.bin');
} catch (err) {
  console.error(err.code);
}

Expected output:
ERR_FS_FILE_TOO_LARGE
Fix
Use Streams for Large Files
When: reading or writing files that are large or of unknown size.
const fs = require('fs');
const readStream = fs.createReadStream('my-large-file.bin');
readStream.on('data', (chunk) => {
  console.log('Received ' + chunk.length + ' bytes of data.');
  // Process the chunk here.
});

readStream.on('end', () => {
  console.log('Finished reading the file.');
});

Why this works
Instead of loading the entire file into memory, use `fs.createReadStream()` to read it in smaller, manageable chunks. This allows you to process files of any size with a small, constant amount of memory.
To detect this error at runtime, check the `code` property on the thrown error and re-throw anything unexpected:

try {
  // Operation that may throw ERR_FS_FILE_TOO_LARGE,
  // e.g. fs.readFileSync() on a very large file.
  riskyOperation();
} catch (err) {
  if (err.code === 'ERR_FS_FILE_TOO_LARGE') {
    console.error('ERR_FS_FILE_TOO_LARGE:', err.message);
  } else {
    throw err;
  }
}

Better still, avoid the error entirely by streaming the file instead of reading it whole:

const fs = require('fs');
const stream = fs.createReadStream('./large-file');
stream.on('data', (chunk) => handleChunk(chunk)); // handleChunk: your per-chunk processing function
Content generated with AI assistance and reviewed for accuracy. Found an error? hello@errcodes.dev