csv.Error
CSV parsing or writing error
Quick Answer
Wrap csv.reader() iteration in try/except csv.Error; use csv.field_size_limit() to raise the limit for large fields.
Production Risk
Medium — malformed CSV from external sources is common; always handle csv.Error in ETL pipelines.
What this means
Raised by the csv module when a parsing or writing error occurs — typically input that violates the dialect rules (e.g. a NUL byte in the data, a newline inside an unquoted field, or, with strict=True, a stray character after a closing quote).
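A minimal reproduction sketch: with strict=True the reader rejects malformed quoting (here, a character after a closing quote) instead of silently guessing.

```python
import csv
import io

# '"a"b' has a stray character after the closing quote; with
# strict=True the reader raises csv.Error instead of guessing.
reader = csv.reader(io.StringIO('"a"b,c\n'), strict=True)
try:
    rows = list(reader)
except csv.Error as e:
    print(f'csv.Error: {e}')
```

Without strict=True the default dialect tolerates this input, which is why malformed files often parse "successfully" into garbage instead of failing.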
Why it happens
1. Unterminated quoted field in the CSV data
2. Field exceeds csv.field_size_limit() (default 131072 bytes)
3. Binary data, or a file opened in text mode with the wrong encoding
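Cause 2 is easy to reproduce by shrinking the limit. Note that csv.field_size_limit() returns the previous limit, so it can be restored afterwards:

```python
import csv
import io

old = csv.field_size_limit(100)   # set a tiny limit; returns the old one
try:
    list(csv.reader(io.StringIO('x' * 200)))
except csv.Error as e:
    print(f'csv.Error: {e}')      # field larger than field size limit (100)
finally:
    csv.field_size_limit(old)     # restore the previous limit
```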
Fix
Handle per-row and log bad lines
import csv

with open('data.csv', newline='', encoding='utf-8') as f:
    reader = csv.DictReader(f)
    while True:
        try:
            row = next(reader)   # csv.Error is raised here, during parsing
        except StopIteration:
            break
        except csv.Error as e:
            print(f'Line {reader.line_num}: {e}')
            continue
        process(row)

Why this works
reader.line_num reports the number of lines read from the source file, which can differ from the record count because quoted records may span multiple lines — use it for accurate error reporting. Note that the error must be caught around next(reader) itself: parsing errors are raised while reading the row, not while processing it.
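For example, a quoted field containing a newline makes one record span two physical lines, and line_num counts the physical lines consumed:

```python
import csv
import io

data = io.StringIO('a,"multi\nline",c\nd,e,f\n')
reader = csv.reader(data)

next(reader)             # first record spans two physical lines
print(reader.line_num)   # 2
next(reader)
print(reader.line_num)   # 3
```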
Code examples
Raise field size limit

import csv, sys

csv.field_size_limit(sys.maxsize)  # allow arbitrarily large fields
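One caveat: on platforms where the internal C long is 32-bit (notably Windows builds), passing sys.maxsize can raise OverflowError. A common workaround is to halve the value until it is accepted:

```python
import csv
import sys

# sys.maxsize can overflow the C long used internally on some
# platforms; back off until a value is accepted.
limit = sys.maxsize
while True:
    try:
        csv.field_size_limit(limit)
        break
    except OverflowError:
        limit //= 2
```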
Detect encoding issues
# Always open CSV files with explicit encoding and newline='':
with open('f.csv', newline='', encoding='utf-8-sig') as f:
    reader = csv.reader(f)
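Opening the file in binary mode is also reported as csv.Error, because the reader expects an iterable of strings, not bytes:

```python
import csv

try:
    # csv.reader accepts any iterable of lines, but they must be str;
    # bytes (e.g. from a file opened with 'rb') raise csv.Error.
    list(csv.reader([b'a,b,c']))
except csv.Error as e:
    print(f'csv.Error: {e}')
```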
Sources
Python Docs — csv module ↗
Content generated with AI assistance and reviewed for accuracy. Found an error? hello@errcodes.dev