Database Exporter
This page is for data modellers who want to export data to a different database.
Overview
A database exporter allows data held in a table to be written to a database outside your PhixFlow instance. To do this, you must have a datasource configured to communicate with the database.
To add a new database exporter to an analysis model:
- Go to the model's toolbar → Create group.
- Click Database to expand the menu.
- Drag a Database Exporter onto the analysis model.
To add an existing database exporter to an analysis model, in the model diagram toolbar:
- Go to the model toolbar → List group.
- Click Database to expand the menu.
- Click Database Exporter to open the list of available database exporters.
- Drag a database exporter into the analysis model.
Property Pane Toolbar
For information about the toolbar options, see the Common Properties page, Toolbars and Controls section.
Properties Tab
Parent Details
If this item is within or belongs to another, its parent name is shown here. See the Common Properties page, Parent Details section for more details.
Basic Settings
Field | Description |
---|---|
Name | Enter the name of the database exporter. |
Enabled | Tick when the configuration is complete and the database exporter is ready to be used. |
Datasource | Select a datasource (database) to which this database exporter will write. |
Statement Expression | Enter an expression to generate the query that writes data to the target datasource; see Statement Expression Query String, below, for details. To select from a list, click Show the List of Datasources. |
PL/SQL | Tick to use a PL/SQL anonymous block to write to the target datasource. |
Inputs
A list of input pipes for this Database Exporter.
This section has a toolbar with Show a Detailed List and Refresh; see standard buttons.
The grid contains a list of input pipes that connect to this datasource.
Input Multiplier
Field | Description |
---|---|
Input Multiplier | Enter an expression that multiplies the data. This works in the same way as the Input Multiplier Expressions on tables and File Exporters. A complete export is carried out for each value in the list of values returned by this expression. If the expression evaluates to an empty list, then no export will be carried out. If no expression is set, a single export will be carried out. |
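For illustration only, an input multiplier expression might return a list of values, with one complete export carried out per value. The expression below is a hypothetical sketch, not verified PhixFlow syntax; consult the Input Multiplier documentation for tables and File Exporters for the exact form:

```
["UK", "US", "FR"]
```

If this hypothetical expression returned the three values shown, three complete exports would be carried out, one for each value. An empty list would mean no export runs at all.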
Analysis Models
If this item is used by an analysis model, its name is listed here. See the Common Properties page, Analysis Model section for more details.
Advanced
Field | Description | Notes |
---|---|---|
Fields to Update in Source Table | ||
Record Export Time | Optionally, select an attribute from the input table. When the exporter writes a record to the target datasource, the selected attribute in the input record will be updated with the export time. | If a record fails to export, the selected attribute will still be updated. This happens even if the export transaction in the target datasource is rolled back. |
Record Export Status | Optionally, select an attribute from the input table. When the exporter writes a record to the target datasource, the selected attribute in the input record will be updated with the export status: either success or fail. | |
Record Response | Optionally, select an attribute from the input table. When the exporter writes a record to the target datasource, the selected attribute in the input record will be updated with any export error messages. | |
Exporter Performance Settings | ||
Parallel Exporters | Optionally, specify the number of parallel exporters to use. Each buffer of records read from the input pipe is divided across the parallel exporters, with one restriction: if Group By fields are specified on the input pipe, all records with the same Group By key are exported by the same exporter. (This helps to reduce contention in the target datasource when writing to tables or calling APIs.) If no value is specified, a single exporter is used. The Commit Size (see below) applies per exporter: each exporter commits separately whenever it has exported the number of records specified as the commit size. |
Errors Before Rollback | Optionally set the maximum number of errors found, while exporting across all exporters, before the exporter will rollback all uncommitted transactions from all exporters. Up to this number of errors, any errors reported from the exporter will appear as warnings in the log entries. The export will not be considered failed unless the number of errors is above this threshold. | |
Commit Size | Enter the maximum batch size. The exporter commits records in batches of the specified size. If no value is specified, all records are committed as a single batch. |
Errors Before Stopping | Optionally set the maximum number of errors found, while exporting across all exporters, before the exporter will stop attempting to export records to the Datasource. This is useful where you have set Errors Before Rollback (also configured on this tab), but wish to continue to try to export records to get view of how successful the remaining export would have been. Any records that the exporter attempts to export after the Errors Before Rollback limit has been passed will always be rolled back, even if they are successful. If no value is entered, the exporter will stop when the Errors Before Rollback limit is reached. | |
Timeout (secs) | Specify a time limit in seconds. The exporter will time out each export operation after the specified number of seconds. If an export operation does time out, the operation is abandoned and an error is reported. Each timed-out record counts as a single error when assessing whether the entire export should be stopped and/or rolled back; see Errors Before Rollback, above. If no value is specified, the duration of an export operation is unlimited. |
Datasource Instance Expression | If the target datasource has more than one instance, enter an expression that determines which instance this exporter should use. Each datasource instance is identified by a unique string, and the expression should evaluate to one of those strings. If no value is entered, the exporter assumes there is only one instance and uses it by default. If there is more than one instance and no expression is provided here, an error is thrown during analysis. |
Log Traffic | Tick to log details of the communication between PhixFlow and the target datasource; untick to prevent logging. Ticking this box has no effect when the system configuration option Allow Logging is not ticked. Also note that when the system logging option Log Database Exporter Statements is ticked, PhixFlow always logs details of connections made by database exporters, whatever is set here; see System Logging Configuration. To change that system logging option, you must have the Modify System Logging Configuration privilege. |
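The interaction between Commit Size and Errors Before Rollback can be illustrated with a generic sketch. This is plain Python and SQLite, not PhixFlow code; the function name, table, and the use of a primary-key violation to trigger errors are all invented for the illustration:

```python
import sqlite3

def export_in_batches(records, commit_size=3, errors_before_rollback=2):
    """Illustrative only: commit records in batches of commit_size, and
    once the error count exceeds errors_before_rollback, roll back the
    current uncommitted batch and stop."""
    conn = sqlite3.connect(":memory:")
    # DDL commits implicitly; duplicate codes will raise IntegrityError.
    conn.execute("create table INTL_CODES (operator text, code integer primary key)")
    errors = 0
    pending = 0
    for operator, code in records:
        try:
            conn.execute("insert into INTL_CODES values (?, ?)", (operator, code))
            pending += 1
        except sqlite3.Error:
            errors += 1
            if errors > errors_before_rollback:
                conn.rollback()          # discard the uncommitted batch
                break
        if pending == commit_size:       # a full batch: commit it
            conn.commit()
            pending = 0
    else:
        conn.commit()                    # commit any final partial batch
    exported = conn.execute("select count(*) from INTL_CODES").fetchone()[0]
    conn.close()
    return exported, errors
```

With the defaults, a single failing record among five still leaves the successful records committed; with Errors Before Rollback set to 0, the first error rolls back everything not yet committed.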
Description
We recommend that you always enter a description to explain the purpose of this item.
Audit Tab
Audit Summary
See the Common Properties page, Audit Summary section.
Statement Expression Query String
Query String Description
Database Exporters can be set up to update a database using either a plain SQL query or PL/SQL. If you want to use PL/SQL, tick the PL/SQL box.
Plain SQL Query
Enter an insert or update query just as you would to edit the database directly. The only important difference when creating queries in PhixFlow is that expressions must be enclosed in curly braces i.e. start with a '{' and end with '}'. This includes referencing an attribute name from the input table, which is just a simple expression.
Note that when exporting an attribute of type TrueFalse, the attribute must be mapped correctly onto the underlying database column it is being exported into. Use an 'if' style expression in the query string to transform the TrueFalse value into a value suitable for the column:
- Mapping to a single-character database column: {if(aTrueFalseAttribute,'T','F')}
- Mapping to a numeric database column: {if(aTrueFalseAttribute,1,0)}
The other difference between entering a query in PhixFlow and submitting a query to the database directly is that you must not include the standard SQL statement terminator, e.g. a semi-colon, at the end of your query.
insert into INTL_CODES values({Operator}, {Code})
If we have an input table containing a list of international codes, with attributes Operator and Code, this query will insert all data from the input table into a database table INTL_CODES.
insert into INTL_CODES values({Operator}, {Code}, {toDate(StartDate)})
The table INTL_CODES also includes a date, START_DATE, which we will populate using the attribute StartDate in the input table. In the table, this date is stored as a string in the format YYYYMMDD. In the query above, we have converted this string to a date using a PhixFlow expression.
insert into INTL_CODES values({Operator}, {Code}, to_date({StartDate}, 'yyyymmdd'))
This query performs the same inserts as the previous example, but in this case we are converting the StartDate string value to a date using a database function.
PL/SQL
As for a plain SQL query, you can write PL/SQL just as you would to run against the database directly - but putting any PhixFlow expressions in curly braces i.e. starting with a '{' and ending with '}'.
declare
  v_count number;
begin
  select count(1) into v_count
    from INTL_CODES
   where operator = {Operator}
     and code = {Code};
  if (v_count = 0) then
    insert into INTL_CODES values( {Operator}, {Code}, {toDate(StartDate)} );
  else
    update INTL_CODES
       set START_DATE = {toDate(StartDate)}
     where operator = {Operator}
       and code = {Code};
  end if;
end;
In this example we are loading international codes into a database, as in the plain SQL examples. Here, we insert a new record if no record already exists for the Operator/Code combination; otherwise we update the existing record, setting START_DATE to the value read from the input table.
Learn More
For links to all pages in this topic, see Analysis Models for Batch Processing Data.
Terminology changes in progress
As part of the redesign of PhixFlow, we are changing the following terms:
dashboard → screen
stream → table
stream attributes → attributes
stream item → record
stream set → recordset
stream view → view
stream item action → record-action
stream action → table-action
driver class → database driver