This module turns an array of tokens into an object.
Creates the line tokenizer and sets the options. The following options are supported: { "header": ['a', 'b', 'c', undefined, 'e'], "strict": false, "severity": 'skip_record', "skip_first_row": true }
opts (optional, default: {})
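For orientation, a minimal construction sketch is shown below. Only the option keys come from the list above; the export name `LineTokenizer` and the require path are assumptions.

```js
// Usage sketch only; the export name and require path are assumptions.
const { LineTokenizer } = require('./line-tokenizer');

const tokenizer = new LineTokenizer({
  header: ['a', 'b', 'c', undefined, 'e'], // example column names taken from the option list above
  strict: false,
  severity: 'skip_record',
  skip_first_row: true
});
```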
Reads the stream data and splits it into lines.
data, enc, cb — the incoming chunk, its encoding, and the completion callback.
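The line splitting described here maps onto Node's Transform stream API. The following is a minimal sketch under that assumption, not the module's actual implementation; the class and field names are illustrative.

```js
const { Transform } = require('stream');

// Illustrative sketch: buffers incoming chunks and emits complete lines.
class LineSplitter extends Transform {
  constructor(opts = {}) {
    super({ readableObjectMode: true });
    this.opts = opts;
    this.buffer = ''; // holds the trailing partial line between chunks
  }

  _transform(data, enc, cb) {
    this.buffer += data.toString();
    const lines = this.buffer.split(/\r?\n/);
    // keep the last (possibly incomplete) line for the next chunk
    this.buffer = lines.pop();
    for (const line of lines) {
      this.push(line);
    }
    cb();
  }

  _flush(cb) {
    // emit whatever is left when the stream ends
    if (this.buffer.length > 0) {
      this.push(this.buffer);
    }
    cb();
  }
}
```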
Adds an error to the stream data.
data — The current stream data; error — The error to be added.
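A possible shape for this step, assuming errors are collected on the data object rather than aborting the stream; the `errors` property name is an assumption.

```js
// Sketch only; the property used to collect errors is an assumption.
function addError(data, error) {
  data.errors = data.errors || [];
  data.errors.push(error);
  return data;
}
```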
Extends Interceptor
This interceptor takes care of handling the messages: it adds the hops and copies the messages.
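A rough sketch of such an interceptor, assuming the Interceptor base class is subclassed with a `handle(message)` hook; the hook name, the `hops` field, the hop value, and the require path are assumptions.

```js
// `Interceptor` is the base class referenced above; the require path is an assumption.
const { Interceptor } = require('./interceptor');

class HopInterceptor extends Interceptor {
  handle(message) { // hook name `handle` is an assumption
    // record that the message passed through this interceptor
    message.hops = message.hops || [];
    message.hops.push(this.constructor.name);

    // pass a copy downstream so the original message stays untouched
    return { ...message };
  }
}
```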