New BPMN Workflow and Action Flow Definitions
Terms A to M
stream attribute: see attribute
stream view attribute: see attribute
stream action attribute: see attribute
Terms M to Z
Defines a set of operations on the values of a stream item. A stream action can be applied from the stream view properties.
Used to group actions into drop-down lists, or to display a single action as a button. In both cases, Action Buttons are displayed at the top of views.
Aggregate Attributes define the aggregated properties that are available when data is read from an aggregating stream.
Allow users to logically represent a business process / control flow using configured PhixFlow objects, e.g. streams, file collectors etc.
The Analysis Engine is initiated whenever the generation of a Stream Set is requested. Users can do this by configuring Analysis Tasks in a Task Plan.
A task to archive data from streams with archiving configured on them.
An object that contains an ordered list of other objects.
PhixFlow provides an extensive list of Attribute Functions which may be used within all expression strings.
Details of an individual audited action performed on a PhixFlow object.
A summary of all audited actions performed on PhixFlow objects. Can be viewed in the System Console.
PhixFlow objects that have auditing switched on to allow for any changes in the system to be tracked.
Candidate sets are a fundamental concept of function calculation in PhixFlow.
Every time a function calculation is carried out, all the required input data is brought together and organised into sets of data - one set for each Key group.
The Key groups are worked out using the Pipe Grouping Attributes defined on the input Pipes for each Stream.
The database that stores the PhixFlow schema and all processed data.
A Chart displays data in table form, or in one of a number of graph styles.
Refers to the comparator types that can be used in a filter, e.g. greater than, starts with.
A context menu appears above each element when the user hovers over it in a model view. This menu shows all of the actions that can be performed on that element.
A CSV (comma-separated values) file is a plain text file where each line in the file consists of a number of fields separated by commas ','. If fields may contain spaces, commas or double quotes, they should be enclosed in double quotes '"'. A double quote within a field is represented by 2 double quotes.
A CSV file is always encoded using a specific character set. On Windows this is typically Cp1252, and on Unix/Linux US7ASCII, though the following are sometimes used: UTF-8, UTF-16, ISO-8859-1.
A CSV file typically has an extension '.csv'.
A CSV file can be created directly in a plain text editor (e.g. Wordpad/Notepad), in Excel (by saving as a CSV), or by some other application.
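The quoting rules above can be demonstrated with Python's standard `csv` module (an illustration only; PhixFlow's own CSV handling is configured on the File Collector):

```python
import csv
import io

# A field containing a comma must be quoted; an embedded double quote
# is written as two double quotes, per the convention described above.
rows = [["id", "name", "comment"],
        ["1", "Smith, John", 'said "hello"']]

buf = io.StringIO()
csv.writer(buf).writerows(rows)
text = buf.getvalue()
print(text)
# The second row is written as: 1,"Smith, John","said ""hello"""

# Reading the text back recovers the original fields exactly.
parsed = list(csv.reader(io.StringIO(text)))
print(parsed[1])  # ['1', 'Smith, John', 'said "hello"']
```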
The most common reason for configuring a model in this way is to allow a number of files to be processed in a single analysis session but to have the results from each file saved in a separate stream set. In this way, whenever a file fails to process the user can simply rollback the stream sets associated with that file rather than rolling back all of the stream sets associated with all of the files for the whole analysis session.
To achieve this, a model is configured with a push pipe from the final stream in the model pointing back to itself. This push pipe is then given positive offsets. The file collector at the start of the model is then configured to process a single file at a time. When this model is run the file collector will read in one file, process it to create a stream set (or a chain of stream sets if there are several streams in the model) and will flag the file processed, the stream sets generated and any associated output files with the cycle number 1. When the last stream in the model has finished producing its stream set it will send a notification to itself down the push pipe. This notification (because of the positive offsets) will be taken as a signal to create another stream set which will effectively cause the whole model to run again. This second run of the model will be deemed to be cycle 2 and all files processed, stream sets produced or files output will be flagged with this cycle number.
When later viewing the log files the user can see which items were processed in which cycle and can choose how many cycles to roll back. In the case where the last file processed was unsuccessful the user can therefore choose to only roll back the last cycle.
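The cycle mechanism described above can be sketched in plain Python (this is an illustration of the idea, not PhixFlow code): each cycle processes one file, tags its output with the cycle number, and a failed file is undone by rolling back only the last cycle.

```python
# Each cycle reads one file and tags the resulting stream set with the
# cycle number, mimicking the push-pipe loop described above.
files = ["jan.csv", "feb.csv", "mar.csv"]
stream_sets = []  # each entry: (cycle_number, source_file)

for cycle, name in enumerate(files, start=1):
    # The "push pipe" notification from the previous cycle triggers this one.
    stream_sets.append((cycle, name))

def rollback(last_n_cycles):
    """Discard the stream sets produced by the last n cycles only."""
    cutoff = max(c for c, _ in stream_sets) - last_n_cycles
    return [(c, f) for c, f in stream_sets if c <= cutoff]

# If mar.csv failed to process, roll back only cycle 3:
remaining = rollback(1)
print(remaining)  # [(1, 'jan.csv'), (2, 'feb.csv')]
```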
A dashboard allows the user to gather into a single view a set of charts. This grouping of related reports and statistics provides the user with preconfigured overviews of selected areas, making all the data that is required to view and monitor the health of the system both accessible and easy to read.
A Data Grid is a particular type of view which displays the data in a table format.
Used to write information from PhixFlow to external databases.
Defines a connection to an external database.
Allows for multiple connections for individual database instances to be configured for a specific 9711226488.
A Directed Pipe is used as a secondary input to a Merge Stream.
Primary Pipes to a Merge are implemented by reading all input records from all Primary Pipes in a single operation, ordered by the Group-By fields. The primary pipes determine the key values of the Candidate Sets that will be generated.
A Directed pipe is implemented after the Primary Pipes have run, by reading records matching each Candidate Set grouping key in turn. This can result in a significant performance improvement in certain circumstances.
A Pipe can be marked as a Directed Pipe on the Pipe Details / Advanced tab.
An expression returns a value, usually by manipulating or transforming input data, although some expressions can be very simple: e.g. the expression "hello" always returns the string "hello" and requires no input.
The data columns that a File Collector expects to be present when importing a file.
Used to write information from PhixFlow to external files.
Filters are made up of a set of clauses; each clause in turn contains a number of conditions. These conditions must be satisfied for data to be passed through a pipe or returned in a filtered list.
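A minimal sketch of that clause/condition structure, assuming (this combination logic is an assumption, not confirmed by the source) that conditions within a clause are ANDed and the clauses themselves are ORed:

```python
def matches(record, clauses):
    """clauses: list of clauses; each clause is a list of (field, op, value).
    Assumed semantics: all conditions in a clause must hold (AND);
    any one clause matching passes the record (OR)."""
    ops = {
        "equals": lambda a, b: a == b,
        "greater than": lambda a, b: a > b,
        "starts with": lambda a, b: str(a).startswith(b),
    }
    return any(all(ops[op](record[f], v) for f, op, v in clause)
               for clause in clauses)

record = {"name": "Alpha", "amount": 120}
clauses = [[("name", "starts with", "Al"),
            ("amount", "greater than", 100)]]
print(matches(record, clauses))  # True
```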
If you move your mouse pointer to the bottom of the screen, the PhixFlow Form Bar appears. It contains links to all forms, both open and minimised.
A file system that is reachable via the FTP protocol.
Groups the pipe data into candidate sets, and forms the key for each candidate set. However, if the data comes from a database or file collector, the order-by fields must be specified in the collector SQL.
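The need for ordered input can be illustrated with Python's `itertools.groupby`, which, like grouping on a pipe fed by a collector, only merges adjacent records with the same key (an analogy, not PhixFlow's implementation):

```python
from itertools import groupby

# Records arriving down a pipe; the grouping (key) attribute is "account".
records = [
    {"account": "A", "amount": 10},
    {"account": "A", "amount": 5},
    {"account": "B", "amount": 7},
]

# groupby only groups *adjacent* records with the same key, which is why
# collector data must already be ordered (e.g. via ORDER BY in the
# collector SQL) before candidate sets are formed.
key = lambda r: r["account"]
candidate_sets = {k: list(g) for k, g in groupby(sorted(records, key=key), key)}
print({k: len(v) for k, v in candidate_sets.items()})  # {'A': 2, 'B': 1}
```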
Hypertext Transfer Protocol: the language of the internet. In the context of PhixFlow, it means using standard internet communication as a means of importing and exporting data.
A web site, web service or some other target that allows XML, HTML or text data to be collected from it, or exported to it, over HTTP. Data is collected from an HTTP Datasource using an HTTP Collector, and exported to an HTTP Datasource using an HTTP Exporter.
An HTTP Datasource Instance must define at least one ….
An HTTP Datasource Instance may define multiple … to support … expressions on ….
An HTTP Datasource Instance provides a default username, password and URL to an HTTP Collector or HTTP Exporter.
Exports data to an HTTP Datasource.
Pre-defined variables for use in PhixFlow Expressions.
The display language of the user.
A record of a task or action completed in PhixFlow, including any messages output and details about stream sets created.
Within each log list, log messages are output by the Analysis Engine as it performs analysis.
A macro is a user-defined function that can be called from other expressions.
A macro can be called with parameters. These parameters are available within the body of the macro as $arg.
The individual objects that make up an analysis model, e.g. streams, file collectors etc. These elements are added via drag and drop controls and context menus.
Where a list is displaying more data than can be viewed in the display grid, paging allows the user to scroll through the whole list, e.g. log messages, Stream Set data.
A period is set on a Stream. This can be either a regular period, or variable. There are four possible settings:
- Daily: generate or collect data every day.
- Monthly: generate or collect data every month.
- Variable: generate or collect data from the most recent run of the stream up to the current date.
- Transactional: allows multiple users to run independent analysis tasks at the same time.
A pipe is a connector that links two elements in a PhixFlow model.
Privileges are used to define what actions can be performed in PhixFlow; see Privileges.
A record set is a list of records returned, usually by a pipe, where each record consists of named attributes, each with a value.
Represents an action that once completed will cause the enclosing view to be refreshed.
Roles represent the type of access required to do specific tasks (e.g. Designer, Operator etc). Roles are assigned to User Groups (or vice versa); see Roles.
Deletes a specified set of stream sets from a stream and updates the tidemarks for that stream.
A Rollback Task can be configured to roll back one or more streams. Users can do this by configuring Rollback Tasks in a Task Plan.
A Sequence is used to generate a numeric sequence of values.
The Sequence lets you define the properties of the sequence e.g. the starting value.
The nextValue function returns the next value in the sequence each time it is called.
The curValue function returns the most recent value returned by nextValue.
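The `nextValue`/`curValue` behaviour described above can be sketched in plain Python (a sketch of the described semantics, not PhixFlow's actual API):

```python
class Sequence:
    """Generates a numeric sequence; nextValue advances it, curValue
    repeats the most recent value returned by nextValue."""

    def __init__(self, start=1, step=1):
        self._next = start      # value next_value will return next
        self._step = step
        self._current = None    # last value handed out

    def next_value(self):
        self._current = self._next
        self._next += self._step
        return self._current

    def cur_value(self):
        return self._current

seq = Sequence(start=100)
print(seq.next_value())  # 100
print(seq.next_value())  # 101
print(seq.cur_value())   # 101  (does not advance the sequence)
```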
SQL query used to retrieve data from, or SQL or PL/SQL statement used to update data in, a Datasource.
Defines a set of updates to the values of a stream item. A stream action can be applied from the stream view properties.
Defines the basic Stream operation. The type of stream may be aggregate, calculate, calculate by set, cartesian or merge.
To do
transaction/transactional updates
analysis model
dashboard
grid - PhixFlow word for a table. You can show stream data in a grid (looks like an Excel spreadsheet). Property tabs can also have grids to represent property information, for example for the list of users in a user group
stream attribute → attribute
stream view attribute → attribute
stream view → view
property tabs
item - anything that appears in the repository
modelling object - things that appear in analysis models
layout component - anything that can make up a layout
layout - a GUI screen in your application
dashboard = layout: the starting point for a layout
application interface, screen, form, layout screen - too generic "when you look at your computer screen"
publish - The process of writing changes to stream data in a model to the stream tables in the database