A database exporter allows data held in a stream to be written to a datasource.

Database Exporter Properties



Basic Settings

Name: Enter the name of the database exporter.

Enabled: Tick when the configuration is complete and the database exporter is ready to be used.

Datasource: Select a datasource (database) to which this database exporter will write.

Statement Expression: Enter an expression to generate the query that writes data to the target datasource. See Statement Expression Query String, below, for details.

PL/SQL: Tick to use a PL/SQL anonymous block to write to the target datasource.

Inputs

A list of input pipes for this Database Exporter.

This section has a toolbar with the Show and Refresh buttons; see standard buttons.

The grid contains a list of input pipes that connect to this datasource.

Input Multiplier

Input Multiplier: Enter an expression that multiplies the export: a complete export is carried out for each value in the list of values returned by this expression. This works in the same way as the Input Multiplier expressions on Streams and File Exporters. If the expression evaluates to an empty list, no export is carried out; if no expression is set, a single export is carried out.
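To make the multiplier behaviour concrete, here is a minimal Python sketch (this is not PhixFlow code; `run_multiplied_export` and `export_fn` are hypothetical names for illustration):

```python
def run_multiplied_export(multiplier_values, export_fn):
    """Run one complete export per multiplier value.

    multiplier_values: the list produced by the Input Multiplier
    expression, or None when no expression is set.
    export_fn: a callable performing a single complete export.
    Returns the number of exports carried out.
    """
    if multiplier_values is None:   # no expression set
        export_fn(None)             # exactly one export
        return 1
    count = 0
    for value in multiplier_values:  # one complete export per value
        export_fn(value)
        count += 1
    return count                     # empty list -> zero exports

# Example: three region codes drive three complete exports.
runs = []
assert run_multiplied_export(["UK", "US", "DE"], runs.append) == 3
assert run_multiplied_export([], runs.append) == 0    # empty list: no export
assert run_multiplied_export(None, runs.append) == 1  # no expression: one export
```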

Advanced

Fields to Update in Source Stream

Record Export Time: Optionally, select an attribute from the input stream. When the exporter writes a record to the target datasource, the selected attribute in the input record is updated with the export time.

If a record in the input stream:

  • is not exported, this attribute is left blank.
  • is exported but the export is subsequently rolled back, this attribute is set back to blank.

Warning: if a record fails to export, the selected attribute will still be updated. This happens even if the export transaction in the target datasource is rolled back.

Record Export Status: Optionally, select an attribute from the input stream. When the exporter writes a record to the target datasource, the selected attribute in the input record is updated with the export status: either success or fail.

Record Response: Optionally, select an attribute from the input stream. When the exporter writes a record to the target datasource, the selected attribute in the input record is updated with any export error messages.

Exporter Performance Settings
Parallel Exporters: Optionally, specify the number of parallel exporters to use.

Each buffer of records read from the input pipe is divided across the parallel exporters, with one restriction: if Group By fields are specified on the input pipe to the exporter, all records with the same Group By key are exported by the same exporter. (This helps reduce contention in the target datasource when writing to tables or calling APIs.)

If no value is specified, only a single exporter is used.

The Commit Size (see below) applies per exporter: each exporter commits separately whenever it has exported the number of records specified as the commit size.
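The division of a buffer across parallel exporters can be sketched in Python. The hashing scheme below is an illustrative assumption, not PhixFlow's actual implementation; it only shows the guarantee described above, that records sharing a Group By key always land on the same exporter:

```python
from collections import defaultdict

def assign_to_exporters(records, num_exporters, group_by_key=None):
    """Divide a buffer of records across parallel exporters.

    When a group_by_key function is supplied, every record with the
    same key is routed to the same exporter slot; otherwise records
    are simply spread round-robin.
    """
    buckets = defaultdict(list)
    for i, record in enumerate(records):
        if group_by_key is not None:
            slot = hash(group_by_key(record)) % num_exporters
        else:
            slot = i % num_exporters
        buckets[slot].append(record)
    return buckets

records = [{"op": "A"}, {"op": "B"}, {"op": "A"}, {"op": "C"}]
buckets = assign_to_exporters(records, 2, group_by_key=lambda r: r["op"])

# All records with operator "A" share a single exporter slot.
slots_for_a = {s for s, recs in buckets.items() for r in recs if r["op"] == "A"}
assert len(slots_for_a) == 1
```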

Errors Before Rollback: Optionally, set the maximum number of errors found while exporting, across all exporters, before the exporter rolls back all uncommitted transactions from all exporters.
If no value or zero is specified, the exporter will roll back after the first error.
If 1 is specified, a single error will cause a rollback.

Up to this number of errors, any errors reported from the exporter appear as warnings in the log entries. The export is not considered failed unless the number of errors exceeds this threshold.

Commit Size: Enter the maximum batch size. The exporter commits records in batches of the specified size. If no value is specified, all records are committed as a single batch.
Errors Before Stopping: Optionally, set the maximum number of errors found while exporting, across all exporters, before the exporter stops attempting to export records to the datasource.
This is useful where you have set Errors Before Rollback (also configured on this tab) but wish to continue trying to export records, to get a view of how successful the remaining export would have been. Any records that the exporter attempts to export after the Errors Before Rollback limit has been passed will always be rolled back, even if they are successful.
If no value is entered, the exporter stops when the Errors Before Rollback limit is reached.
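A hedged Python sketch of how the two error thresholds interact, based only on the descriptions above (the function and its default behaviour are illustrative assumptions, not PhixFlow internals):

```python
def export_with_thresholds(records, export_one,
                           errors_before_rollback=0,
                           errors_before_stopping=None):
    """Sketch of Errors Before Rollback vs Errors Before Stopping.

    Errors up to errors_before_rollback are tolerated as warnings;
    once the count exceeds it, the export is failed and rolled back.
    Exporting may continue (to gauge how the rest would have fared)
    until errors_before_stopping errors have occurred.  With no
    stopping limit, we stop as soon as the rollback limit is breached.
    """
    if errors_before_stopping is None:
        stop_after = errors_before_rollback + 1
    else:
        stop_after = errors_before_stopping
    errors, rolled_back = 0, False
    for record in records:
        try:
            export_one(record)
        except ValueError:
            errors += 1
            if errors > errors_before_rollback:
                rolled_back = True   # all uncommitted work is rolled back
            if errors >= stop_after:
                break                # stop attempting further records
    failed = errors > errors_before_rollback
    return failed, rolled_back, errors

def flaky(record):                   # hypothetical export operation
    if record == "bad":
        raise ValueError("export failed")

# Third error breaches a limit of 2: export fails, rolls back, stops.
result = export_with_thresholds(["ok", "bad", "ok", "bad", "bad", "ok"],
                                flaky, errors_before_rollback=2)
assert result == (True, True, 3)

# A single error within the tolerance is only a warning.
assert export_with_thresholds(["bad", "ok"], flaky,
                              errors_before_rollback=2) == (False, False, 1)
```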
Timeout (secs)

Specify a time in seconds. The exporter times out each export operation after the specified number of seconds. If an export operation does time out, the operation is abandoned and an error is reported. Each timed-out record counts as a single error when assessing whether the entire export should be stopped and/or rolled back; see Errors Before Rollback, above.

If no value is specified, the duration of an export operation is unlimited.
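The timeout behaviour can be sketched with Python's standard `concurrent.futures`; this is an illustrative model only, and `slow_export` is a hypothetical export function:

```python
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout
import time

def export_with_timeout(records, export_one, timeout_secs=None):
    """Abandon any export operation that exceeds timeout_secs.

    Each timed-out record counts as one error, as described above;
    timeout_secs=None means the operation's duration is unlimited.
    """
    errors = 0
    # Spare workers, so an abandoned (still-running) operation does
    # not block the next record's export attempt.
    with ThreadPoolExecutor(max_workers=len(records) or 1) as pool:
        for record in records:
            future = pool.submit(export_one, record)
            try:
                future.result(timeout=timeout_secs)
            except FutureTimeout:
                errors += 1   # abandoned and reported as an error
    return errors

def slow_export(record):      # hypothetical export operation
    if record == "slow":
        time.sleep(0.5)       # simulates a hung database write

assert export_with_timeout(["fast", "slow", "fast"],
                           slow_export, timeout_secs=0.1) == 1
```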

Datasource Instance Expression: If the target datasource has more than one instance, enter an expression that evaluates to a string identifying the instance this exporter should write to. Each datasource instance is identified by a unique string. If no value is entered, the exporter assumes there is only one instance and uses it by default. If there is more than one instance and no expression is provided here, an error is thrown during analysis.
Log Traffic

Unticked means logging is turned off for this Database Exporter. Connections will still be logged if Log Database Exporter Statements is set in System Logging:

  • Log Database Exporter Statements: when ticked, PhixFlow always logs details of connections made by database exporters, whatever is set here.

Ticked means the details of the request and response are logged for this Database Exporter. Details will not be logged if Allow Logging is not ticked in System Logging.

Altering the logging setting requires the Modify System Logging Configuration privilege.


Statement Expression Query String

Query String Description

Database Exporters can be set up to update a database using either a plain SQL query or PL/SQL. If you want to use PL/SQL, tick the PL/SQL box.

Plain SQL Query

Enter an insert or update query just as you would to edit the database directly. The only important difference when creating queries in PhixFlow is that expressions must be enclosed in curly braces, i.e. start with '{' and end with '}'. This includes referencing an attribute name from the input Stream, which is itself a simple expression.

Note that when exporting a stream attribute of type TrueFalse, this attribute must be mapped correctly to the corresponding underlying database column it is being exported into. An 'if' style expression should be used in the query string to transform the TrueFalse value into the appropriate database column value:

  • Mapping to a single character database column: {if(aTrueFalseStreamAttribute,'T','F')}
  • Mapping to a single numeric database column: {if(aTrueFalseStreamAttribute,1,0)}

The other difference between entering a query in PhixFlow and submitting a query to the database directly is that you must not include the standard SQL statement terminator, e.g. a semi-colon, at the end of your query.

insert into INTL_CODES values({Operator}, {Code})

If we have an input Stream containing a list of international codes, with attributes Operator and Code, this query will insert all data from the input Stream into a database table INTL_CODES.
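Conceptually, for each record the exporter evaluates every curly-brace expression against the record's attributes and splices the result into the query. A rough Python sketch of that substitution (illustrative only: it handles just simple attribute references, and the naive quoting here stands in for whatever parameter handling the real exporter performs):

```python
import re

def render_query(template, record):
    """Replace each {AttributeName} with the record's attribute value.

    Only simple attribute references are handled; real PhixFlow
    expressions like {toDate(StartDate)} can be arbitrary, and a real
    exporter should use bind parameters rather than string splicing.
    """
    def substitute(match):
        value = record[match.group(1)]
        if isinstance(value, str):
            return "'" + value.replace("'", "''") + "'"  # naive quoting
        return str(value)
    return re.sub(r"\{(\w+)\}", substitute, template)

query = render_query("insert into INTL_CODES values({Operator}, {Code})",
                     {"Operator": "Vodafone", "Code": 44})
assert query == "insert into INTL_CODES values('Vodafone', 44)"
```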

insert into INTL_CODES values({Operator}, {Code}, {toDate(StartDate)})

The table INTL_CODES also includes a date, START_DATE, which we will populate using the attribute StartDate in the input Stream. In the Stream, this date is stored as a string in the format YYYYMMDD. In the query above, we have converted this string to a date using a PhixFlow expression.

insert into INTL_CODES values({Operator}, {Code}, to_date({StartDate}, 'yyyymmdd'))

This query performs the same inserts as the previous example, but in this case we are converting the StartDate string value to a date using a database function.

PL/SQL

As for a plain SQL query, you can write PL/SQL just as you would to run against the database directly - but putting any PhixFlow expressions in curly braces i.e. starting with a '{' and ending with '}'.

declare v_count number;
begin
	select count (1)
	into v_count
	from INTL_CODES
	where operator = {Operator} and code = {Code};
 
	if (v_count = 0)
	then
		insert into INTL_CODES
		values( {Operator}, {Code}, {toDate(StartDate)} );
	else
		update INTL_CODES set START_DATE = {toDate(StartDate)}
		where operator = {Operator} and code = {Code};
	end if;
end;

In this example we are loading international codes into a database, as in the plain SQL examples. Here, we insert a new record if none already exists in the table for the Operator/Code combination; otherwise we update the existing record's START_DATE with the value read from the input stream.