Describe the bug
When parsing a CSV with `headers: true`, if the number of columns in a data row does not match the number of headers (e.g. too many values), an error like the following is emitted:

```
Unexpected Error: column header mismatch expected: 2 columns got: 3
```
However, in such cases, the stream never emits the end event after the error, which causes Promise-based consumers or those relying on end to hang indefinitely.
Parsing or Formatting?
Parsing.
To Reproduce
`sample.csv` contents:

```
header1,header2
value1,value2,value3
```
Reproduction code:

```js
import fs from 'fs';
import { parse } from '@fast-csv/parse';

const rows = [];
const errors = [];

fs.createReadStream('sample.csv')
  .pipe(parse({ headers: true }))
  .on('error', (err) => {
    console.error('Error:', err.message);
    errors.push(err);
  })
  .on('data', (row) => {
    rows.push(row);
  })
  .on('end', () => {
    console.log('Stream ended');
    console.log({ rows, errors });
  });
```
This will emit the `error` event, but `end` will never be called.
Expected behavior
In typical Node.js stream usage, consumers expect `end` (or at least `close`) to be emitted after an error, particularly when the stream can no longer continue; since Node 14, core streams default to `autoDestroy: true` and emit `close` after `error`. This ensures proper teardown for stream consumers.
Would it be reasonable for fast-csv to emit `end` (or auto-destroy the stream) after such a parsing error?
If so, I’d be happy to submit a PR to ensure the stream closes correctly after `error`.
Desktop (please complete the following information):
- OS: macOS
- OS Version: Sonoma 14.3
- Node Version: 22.9.0
Additional context
We encountered this while wrapping fast-csv parsing in a Promise-based function. When `end` never fires, the Promise is never resolved or rejected, causing test timeouts and unexpected hangs.
We worked around this locally by modifying `CsvParserStream.ts` to call `this.push(null)` or `this.destroy(err)` after emitting the error, which fixed the hang.
We’re mainly opening this issue to confirm whether the current behavior is intended, and to propose emitting `end` after such errors for better compatibility with stream consumers.