
Database Exporter

A Database Exporter allows data held in a Stream to be written to a Datasource.

The toolbar has the standard icons. For information about the sections Parent Details, Analysis Models, Description and Audit Summary, see Common Properties. For information about other property tabs, see Property Tabs.

The following fields are configured for database exporters:

Name
Name of the Database Exporter.
Enabled
Tick when the configuration is complete and the Database Exporter is ready to be used.
Datasource
The Datasource that this Database Exporter will write to.
Statement Expression
The query that will be used to write to the target. See Database Exporter Query String for details.
PL/SQL
If ticked, a PL/SQL anonymous block can be used to write to the target Datasource.
Inputs
A list of input pipes for this Database Exporter.
Input Multiplier

This expression works in the same way as the Input Multiplier Expressions on Streams and File Exporters. A complete export is carried out for each value in the list of values returned by the Input Multiplier Expression. If the Input Multiplier Expression evaluates to an empty list then no export will be carried out. If no Input Multiplier Expression is set, a single export will be carried out.
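
For illustration only, suppose the Input Multiplier Expression returned a list of three region codes (the square-bracket list syntax and the values shown here are assumptions, not taken from this page; check the PhixFlow expression reference for the exact form):

['EU', 'US', 'APAC']

The exporter would then carry out three complete exports, one for each value in the list; an empty list would mean no export at all.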

Analysis Models
A list of Analysis Models containing this Database Exporter.
Advanced
Fields to Update in Source Stream
Record Export Time
If required, select an attribute from the input Stream from the drop-down list. If an attribute is selected, when the exporter writes a record to the target Datasource the selected attribute in the input record will be updated with the export time.
If a record in the input Stream is not exported, this attribute will be left blank.
If a record is exported but the export is subsequently rolled back, the attribute is updated and then set back to blank.
Warning: if a record fails to export, this attribute will still be populated, even if the export transaction in the target Datasource is rolled back.
Record Export Status
If required, select an attribute from the input Stream from the drop-down list. If an attribute is selected, when the exporter writes a record to the target Datasource the selected attribute in the input record will be updated with the export status (success or fail).
If a record in the input Stream is not exported, this attribute will be left blank.
If a record is exported but the export is subsequently rolled back, the attribute is updated and then set back to blank.
Warning: if a record fails to export, this attribute will still be populated, even if the export transaction in the target Datasource is rolled back.
Record Response
If required, select an attribute from the input Stream from the drop-down list. If an attribute is selected, when the exporter writes a record to the target Datasource the selected attribute in the input record will be updated with any export error messages.
If a record in the input Stream is not exported, this attribute will be left blank.
If a record is exported but the export is subsequently rolled back, the attribute is updated and then set back to blank.
Warning: if a record fails to export, this attribute will still be populated, even if the export transaction in the target Datasource is rolled back.
Exporter Performance Settings
Parallel Exporters

If required, you can specify that the exporter should use a number of parallel exporters. Each buffer of records read from the input pipe will be divided across the parallel exporters. The one restriction is that if Group By fields are specified on the input Pipe to the exporter, all records with the same Group By key will be exported by the same exporter. (This helps to reduce contention in the target Datasource when writing to tables or calling APIs.)

If no value is specified here then only a single exporter will be used.

The Commit Size, also set on this tab, is per exporter, i.e. each exporter will commit separately whenever it has exported the number of records specified as the commit size.
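
For example, with Parallel Exporters set to 4 and Commit Size set to 1,000, a buffer of 8,000 input records would be divided across the four exporters (roughly 2,000 each, subject to any Group By keys on the input Pipe), and each exporter would issue its own commit after every 1,000 records it writes.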

Errors Before Rollback

If required, you can set the maximum number of errors found while exporting, across all exporters, before the exporter will roll back all uncommitted transactions from all exporters.
If no value or zero is specified, the exporter will roll back after one error.
If 1 is specified, a single error will cause a rollback.

Up to this number of errors, any errors reported from the exporter will appear as warnings in the log entries. The export will not be considered failed unless the number of errors is above this threshold.

Commit Size
Tells the exporter to commit records in batches of the specified size. If no value is specified, all records will be committed as a single batch.
Errors Before Stopping
If required, you can set the maximum number of errors found, while exporting across all exporters, before the exporter will stop attempting to export records to the Datasource.
This is useful where you have set Errors Before Rollback (also configured on this tab) but wish to keep trying to export records, to get a view of how successful the remaining export would have been. Any records that the exporter attempts to export after the Errors Before Rollback limit has been passed will always be rolled back, even if they are successful.
If no value is entered, the exporter will stop when the Errors Before Rollback limit is reached.
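For example, with Errors Before Rollback set to 10 and Errors Before Stopping set to 50: the first 10 errors are reported as warnings; once an 11th error occurs the export is considered failed and all uncommitted transactions from all exporters are rolled back; the exporters then continue attempting the remaining records (any further work is rolled back even where it succeeds) until the Errors Before Stopping limit of 50 is reached, at which point they stop.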
Timeout (secs)

Tells the exporter to time out each export operation after the specified number of seconds. If an export operation does time out, the operation will be abandoned and an error will be reported. Each record that times out is counted as a single error when assessing whether the entire export should be stopped and/or rolled back; see Errors Before Rollback and Errors Before Stopping on this tab.

If no value is specified, the duration of an export operation is unlimited.

Datasource Instance Expression
The target datasource for this exporter may list multiple Datasource Instances. Each datasource instance is identified by a unique string. This expression should evaluate to a string which allows the Database Exporter to determine the instance to use. If no value is entered then the exporter will assume that there is only one instance and will use that one by default. If there is more than one instance and no expression is provided here then an error will be thrown during analysis.
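As a hedged illustration only, the expression could reuse the if() form shown later on this page; the attribute name IsTestRun and the instance identifiers 'TEST' and 'LIVE' are hypothetical and would need to match the instance identifiers configured on the target Datasource:
if(IsTestRun, 'TEST', 'LIVE')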
Description
Description
Description of the database exporter.

Database Exporter Query String

Query String Description

Database Exporters can be set up to update a database using either a plain SQL query or PL/SQL. If you want to use PL/SQL, tick the Uses PL/SQL box.

Plain SQL query

Enter an insert or update query just as you would to edit the database directly. The only important difference when creating queries in PhixFlow is that expressions must be enclosed in curly braces, i.e. they must start with a '{' and end with a '}'. This includes referencing an attribute name from the input Stream, which is just a simple expression.

Note that when exporting a Stream attribute of type 'TrueFalse', the attribute must be mapped correctly to the underlying database column it is being exported into. An 'if' style expression should be used in the query string to transform the TrueFalse value into a value suitable for that column. For example:
Mapping to a single character database column: {if(aTrueFalseStreamAttribute,'T','F')}
Mapping to a single numeric database column: {if(aTrueFalseStreamAttribute,1,0)}

The other difference between entering a query in PhixFlow and submitting a query to the database directly is that you must not include the standard SQL statement terminator, e.g. a semi-colon, at the end of your query.

insert into INTL_CODES values({Operator}, {Code})

If we have an input Stream containing a list of international codes, with attributes Operator and Code, this query will insert all data from the input Stream into a database table INTL_CODES.

insert into INTL_CODES values({Operator}, {Code}, {toDate(StartDate)})

The table INTL_CODES also includes a date, START_DATE, which we will populate using the attribute StartDate in the input Stream. In the Stream, this date is stored as a string in the format YYYYMMDD. In the query above, we have converted this string to a date using a PhixFlow expression.

insert into INTL_CODES values({Operator}, {Code}, to_date({StartDate}, 'yyyymmdd'))

This query performs the same inserts as the previous example, but in this case we are converting the StartDate string value to a date using a database function.
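
Update statements follow the same pattern. A sketch, reusing the table and attribute names from the examples above (and, as before, with no trailing statement terminator):

update INTL_CODES set START_DATE = {toDate(StartDate)}
where operator = {Operator} and code = {Code}

Each record read from the input Stream updates the matching row in INTL_CODES.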

PL/SQL

As with a plain SQL query, you can write PL/SQL just as you would to run it against the database directly, but with any PhixFlow expressions enclosed in curly braces, i.e. starting with a '{' and ending with a '}'.

declare v_count number;
begin
	select count (1)
	into v_count
	from INTL_CODES
	where operator = {Operator} and code = {Code};
 
	if (v_count = 0)
	then
		insert into INTL_CODES
		values( {Operator}, {Code}, {toDate(StartDate)} );
	else
		update INTL_CODES set START_DATE = {toDate(StartDate)}
		where operator = {Operator} and code = {Code};
	end if;
end;

In this example we are loading international codes into a database, as in the plain SQL examples. Here, we want to insert a new record if none already exists in the table for the Operator/Code combination; otherwise we update the existing record's START_DATE with the value read from the input Stream.
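
On Oracle, the same insert-or-update logic can often be expressed as a single plain SQL statement rather than a PL/SQL block. The following is a hedged sketch only: it assumes the column names OPERATOR, CODE and START_DATE used in the examples above, and because it is entered as a plain SQL query it has no trailing semi-colon:

merge into INTL_CODES t
using (select {Operator} as operator, {Code} as code from dual) s
on (t.operator = s.operator and t.code = s.code)
when matched then
	update set t.START_DATE = {toDate(StartDate)}
when not matched then
	insert (operator, code, START_DATE)
	values (s.operator, s.code, {toDate(StartDate)})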

Please let us know if we could improve this page: feedback@phixflow.com