Process separate files into separate streams
Scenario
Very large quantities of data must be processed from a set of files. It is useful to process each file into a separate stream, so that if we are processing 10 files and there is a failure in the 9th file, we do not have to re-process the first 8 files.
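The steps below achieve this behaviour through the file collector and pipe configuration. As a language-agnostic illustration of the underlying idea only (this is not the product's API; the directory layout, the '*.csv' pattern and process_file are hypothetical), the following Python sketch treats each file as an independent unit of work, so a failure in one file does not undo the files already completed:

    import pathlib

    def process_file(path):
        # Hypothetical per-file work (parse, transform, load); replace as needed.
        print(f"processing {path.name}")

    def process_directory(input_dir, done_dir):
        # Treat every file as its own unit of work. Completed files are moved
        # to done_dir, so a failure part-way through the batch never forces
        # already-completed files to be re-processed on the next run.
        input_dir = pathlib.Path(input_dir)
        done_dir = pathlib.Path(done_dir)
        done_dir.mkdir(parents=True, exist_ok=True)

        for path in sorted(input_dir.glob("*.csv")):
            process_file(path)                 # may raise on a bad file
            path.rename(done_dir / path.name)  # mark this file as done

    # A failure on the 9th file leaves files 1-8 in done_dir, so re-running
    # process_directory() resumes at the failed file rather than at the start.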
...
- Create a stream connected to a file collector
- Ensure the file collector will collect multiple files when it is activated (i.e. ensure there are multiple files that match the file collector criteria).
- Create a pipe connecting the stream to itself.
- Set the pipe attribute 'type' to 'push'.
- Set the pipe attributes 'From Date Offset' and 'To Date Offset' to 1.
See Also
...