The following is a list of all key terms within PhixFlow and links to relevant help pages.
Term | Description |
---|---|
Action Button | Used to group Stream Actions into drop-down lists, or to display a single Stream Action as a button. In both cases Action Buttons are displayed at the top of a Stream View. |
Aggregate Attribute | Aggregate Attributes define the aggregated properties that are available when data is read from an aggregating Pipe. |
Alarm | Alarms are generally used to alert users to problems that need to be investigated but can also be used to indicate (positively) that there are no problems. |
Alarm Generation Rule | Specifies when an alarm will be generated by an Alarm Generator. |
Alarm Generator | Alarm Generators contain the configuration that defines the checks that PhixFlow carries out on collected and analysed data. They define the conditions which cause an Alarm to be raised. |
Alarm Priority | Priority code that can be assigned to alarms to prioritise which are tackled first. |
Alarm Priority Rule | Rule used to determine the Alarm Priority that will be assigned to a generated alarm. |
Alarm Status | Status code that can be assigned to an alarm to allow it to be tracked through the alarm management lifecycle as and when its status changes. |
Alarm Status Rule | Rule used to determine the Alarm Status that will be assigned to a generated alarm. |
Analysis Model | Allows users to logically represent a business process / control flow using configured PhixFlow objects, e.g. streams, file collectors etc. |
Analysis Task | The Analysis Engine is initiated whenever the generation of a Stream Set or alarm is requested. Users can do this by configuring Analysis Tasks in a Task Plan. |
Arc | In most Workflow models, Arcs simply pass Tokens from a Transition to an output Place; dashed lines represent Arcs with Guards. Output Arcs can implement Guards: filter conditions which stop Tokens being added to Places if they evaluate to false. |
Archive Task | Archives Stream Sets from Streams that have archiving configured on them. |
Array | An object that contains an ordered list of other objects. |
Attribute Function | PhixFlow provides an extensive list of Attribute Functions which may be used within all expression strings. |
Audit Detail | Details of an individual audited action performed on a PhixFlow object. |
Audit Summary | A summary of all audited actions performed on PhixFlow objects. |
Audited Item | PhixFlow objects that have auditing switched on to allow for any changes in the system to be tracked. |
Automatic Transition Node | These Transitions do not have any associated tasks/processes. They execute immediately and typically are used to split a workflow into two or more strands. |
Candidate Set | Candidate sets are a fundamental concept of function calculation in PhixFlow. Every time a function calculation is carried out, all the required input data is brought together and organised into sets of data - one set for each Key group. The Key groups are worked out using the Pipe Grouping Attributes defined on the input Pipes for each Stream. |
PhixFlow Database | The database that stores the PhixFlow schema and all processed data. |
Chart | A Chart displays data in table form, or in one of a number of graph styles. |
Closure Reason | When alarms are set to a status of Closed, a closure reason can be specified to identify the reason why alarms are closed. |
Comparator | Refers to the comparator types that can be used in a filter, e.g. greater than, starts with. |
Collector | A generic term for a File Collector, a Database Collector, or an Http Collector. |
Context Menu | A context menu appears above each Model Element when the user hovers over it in a model view. This menu shows all of the actions that can be performed on that element. |
Control | Primarily a "Requirements Specification" describing a process, either wholly or partially implemented in PhixFlow, to provide revenue assurance and detect potential problems in systems and operations. |
Control Framework | A structured classification of controls that should be put in place to monitor data throughout the enterprise. |
Control Group | These are primarily used as a grouping mechanism. Controls can be grouped according to structures within the organisation, by functional area or some other criteria. |
CSV File | A CSV (comma-separated values) file is a plain text file where each line consists of a number of fields separated by commas ','. Fields that may contain spaces, commas or double quotes should be enclosed in double quotes '"'; a double quote within a quoted field is represented by 2 double quotes. A CSV file is always encoded using a specific character set: on Windows this is typically Cp1252, and on Unix/Linux US7ASCII, though UTF-8, UTF-16 and ISO-8859-1 are also sometimes used. A CSV file typically has the extension '.csv', and can be created directly in a plain text editor (e.g. WordPad/Notepad), in Excel (by saving as a CSV), or by some other application. |
Cycle | Whenever an analysis model is run it carries out a number of tasks, e.g. reading files, generating stream sets, raising alarms, exporting files. A model can be configured so that in a single analysis session it actually runs several times. Each run is termed a 'cycle', and all items processed or generated during that cycle (the files processed, stream sets created, alarms raised, files exported etc.) are flagged with the cycle number. The cycle number is visible in the Log File for that analysis run, on each of the relevant tabs showing files, alarms and stream sets processed.
The most common reason for configuring a model in this way is to process a number of files in a single analysis session while saving the results from each file in a separate stream set. If one file fails to process, the user can then roll back just the stream sets associated with that file, rather than all the stream sets for the whole analysis session.
To achieve this, the final stream in the model is given a push pipe pointing back to itself, with positive offsets, and the file collector at the start of the model is configured to process a single file at a time. When the model runs, the file collector reads one file, processes it to create a stream set (or a chain of stream sets if there are several streams in the model), and flags the file processed, the stream sets generated and any associated alarms or output files with cycle number 1. When the last stream in the model has finished producing its stream set, it sends a notification to itself down the push pipe. Because of the positive offsets, this notification is taken as a signal to create another stream set, which effectively causes the whole model to run again.
This second run is cycle 2, and all files processed, stream sets produced, alarms raised or files output are flagged with that cycle number. When later viewing the log files, the user can see which items were processed in which cycle and choose how many cycles to roll back. If the last file processed was unsuccessful, the user can therefore choose to roll back only the last cycle. |
Dashboard | A dashboard allows the user to gather into a single view a set of charts and filtered lists of alarms and tasks. This grouping of related reports and statistics provides the user with preconfigured overviews of selected areas, making all the data that is required to view and monitor the health of the system both accessible and easy to read. |
Data Grid | A Data Grid is a particular type of Chart which displays the data in a table format. |
Database Collector | Used to gather information from an external database, defined as a PhixFlow Datasource. It holds the SQL that will be sent to the datasource to retrieve data. |
Database Driver | The driver used to access the source database. |
Database Exporter | Used to write information from PhixFlow to external databases. |
Datasource | Defines a connection to an external database. |
Datasource Instance | Allows multiple connections to individual database instances to be configured for a specific Datasource. |
Directed Pipe | A Directed Pipe is used as a secondary input to a Merge Stream. Primary Pipes to a Merge are implemented by reading all input records from all Primary Pipes in a single operation, ordered by the Group-By fields; the primary pipes determine the key values of the Candidate Sets that will be generated. A Directed Pipe is implemented after the Primary Pipes have run, by reading records matching each Candidate Set grouping key in turn. This can result in a significant performance improvement in certain circumstances. A Pipe can be marked as a Directed Pipe on the Pipe Details / Advanced tab. |
Drill Down | A user can drill down into an alarm to view the recordset that caused the alarm to be generated. Users can drill down through successively more detailed reports from the original data item which raised the alarm through to the source data at the root of the problem. |
Email Task | An Email Task is generated when an Email Transition is triggered. It allows for specified data to be automatically emailed out to a list of recipients. |
Email Transition | An Email Transition generates an Email task to be processed. The details of the email (subject, recipients, message body, attachment) are configured using the Email Transition. |
Expression | An expression returns a value, usually by manipulating or transforming input data, although some expressions can be very simple: e.g. the expression "hello" always returns the string "hello" and requires no input. |
External Task | An External Task is generated when an External Transition is triggered, placing a task on a Process Queue for external processing. |
External Transition Node | An External Transition generates an External task to be processed. The details of the external task (process queue, data expression) are configured using the External Transition. |
File Collector | Describes the structure, content, naming patterns and location of files of data to be imported into PhixFlow. |
File Collector Attribute | The data columns that a File Collector expects to be present when importing a file. |
File Exporter | Used to write information from PhixFlow to external files. |
Filter | Filters are made up of a set of clauses; each clause in turn contains a number of conditions. These conditions must be satisfied for data to be passed through a Pipe or returned in a filtered list. |
Form Bar | If you move your mouse pointer to the bottom of the screen the PhixFlow Form Bar will appear. It contains links to all forms, both open and minimised. |
FTP Site | A file system that is reachable via the FTP protocol. |
Group By Attribute | Groups the pipe data into candidate sets, and forms the key for each candidate set. However, if the data comes from a database or file collector, order by fields need to be specified in the collector SQL. |
HTTP | Hypertext Transfer Protocol - the language of the internet. In the context of PhixFlow, it means using standard internet communication as a means of importing data into and exporting data from a Control. |
HTTP Collector | Collects data from an HTTP Datasource. |
HTTP Datasource | A web site, web service or some other target that allows XML, HTML or text data to be collected from it, or exported to it, over HTTP. Data is collected from an HTTP Datasource using an HTTP Collector, and exported to an HTTP Datasource using an HTTP Exporter. An HTTP Datasource must define at least one HTTP Datasource Instance, and may define multiple XML Namespaces to support XPath expressions on HTTP Collectors. |
HTTP Datasource Instance | An HTTP Datasource Instance provides a default username, password and URL to an HTTP Collector or HTTP Exporter. |
HTTP Exporter | Exports data to an HTTP Datasource. |
Internal variables | Pre-defined variables for use in PhixFlow Expressions. |
Language | The display language of the user. |
Log File | Records the output of an analysis run, including the log messages produced and, on the relevant tabs, the files, alarms and stream sets processed. |
Log Message | Within each log list, log messages are output by the Analysis Engine as it performs analysis. |
Macro | A macro is a user-defined function that can be called from other Expressions. A macro can be called with parameters. These parameters are available within the body of the macro as $arg. |
Measure Type | Measure Types are used to evaluate a type of score that can be assigned to an alarm or control. |
Measurement | An instance of a value of a Measure Type at a given point in time. |
Model Element | The individual objects that make up an Analysis Model e.g. streams, file collectors etc. These elements are added via drag and drop controls and context menus. |
Paging | Where a list is displaying more data than can be viewed in the display grid, paging allows the user to scroll through the whole list, e.g. alarm lists, log messages, Stream Set data. |
Period | A Period can be either regular or variable. There are three possible settings:
|
Pipe | A pipe is a connector that links two elements in a PhixFlow model. |
Place Node | All Workflow models have a Start Place. The PhixFlow Analysis Engine places a Token here when a new alarm is raised and so initiates the workflow. |
Privilege | Privileges are used to define what actions can be performed in PhixFlow. |
Process Queue | Process Queues are referenced by External Workflow Transitions. They represent a queue of jobs to be done by a specific process external to PhixFlow. That process will pick up jobs from the Process Queue. |
Recordset | A recordset is a list of records, usually returned by a pipe, where each record consists of named attributes, each with a value. |
Refresh Action | Represents an action that once completed will cause the enclosing view to be refreshed. |
Report Task | Report Tasks can be configured to distribute key information to a list of recipients. Users can do this by configuring Report Tasks in a Task Plan. |
Role | Roles represent the type of access required to do specific tasks (e.g. Designer, Operator etc.). Roles are assigned to User Groups (or vice versa). |
Rollback | Deletes a specified set of Stream Sets from a Stream and updates the tidemarks for that Stream. |
Rollback Task | A Rollback Task can be configured to rollback one or more streams. Users can do this by configuring Rollback Tasks in a Task Plan. |
Shortcut | Links to objects from the Control Framework. |
Sequence | A Sequence is used to generate a numeric sequence of values. The Sequence lets you define the properties of the sequence e.g. the starting value. The nextValue function returns the next value in the sequence each time it is called. The curValue function returns the most recent value returned by nextValue. |
SQL | An SQL query used to retrieve data from, or an SQL or PL/SQL statement used to update data in, a Datasource. |
Stream | Represents a structured store of data within PhixFlow, receiving data from one or more components, processing it, then storing it. |
Stream Action | Defines a set of updates to the values of a Stream Item. A Stream Action can be applied from a Stream View or from a Stream Item Details form. |
Stream Action Attribute | Each Stream Action Attribute defines the new value to be assigned to a single field of a Stream Item. The new value is defined as an Expression, which can refer to the original values of the Stream Item. When a Stream Action is applied, each attribute's expression is evaluated in turn and the result is saved back to the Stream Item. |
Stream Attribute | Defines the type, precision and expression determining how each individual attribute is calculated based on the inputs to the Stream. |
Stream Function | Defines the basic Stream operation. Stream Functions form the basis of transformations in PhixFlow, getting from raw data to the elements of a control framework: controls, reports and alarms. |
Stream Item | Individual record in a Stream Set. |
Stream Item Details | A Stream Item Details form can be opened from a Grid Stream View by opening the view, selecting a Stream Item then selecting Show Stream Item Details from the pull-down menu. |
Stream Tables | Physical tables in the PhixFlow schema that store the Stream data. |
Stream View | Used to customise the display of data in one or more Stream Sets. |
Stream View Attribute | Stream Attributes used in a Stream View. Each attribute has a role that defines how the data will be displayed. |
Stream Set | A set of data items for a given period. |
System Configuration | Where all general system wide properties are configured in PhixFlow. |
System Task | A System Task runs standard system-wide housekeeping activities. It must be run as part of a Task Plan. The standard housekeeping activities are:
The parameters controlling how long to keep Log Messages / Managed Files before deleting them, and whether first to archive Log Messages to file, are set in System Configuration. |
Task | A Task can be set up to carry out an action or set of actions in PhixFlow. Tasks can be one of:
|
Task Plan | Contains a set of Tasks to run. |
Timer Task | A Timer Task is generated when a Timer Transition is triggered. |
Timer Transition Node | A Timer Transition generates a timer task to be processed. The details of how the timer task will be generated are configured in the Timer Transition. |
Token | Tokens store the state of the workflow. A transition of a workflow may fire whenever there is a token at the start of all input arcs; when it fires, it consumes these tokens, and places tokens at the end of all output arcs. |
User | Users represent the individuals working in the organisation that will be logging onto PhixFlow and performing tasks. Users are assigned to User Groups. |
User Group | Users belong to User Groups that represent the area of the business they work in (e.g. Customer Service, Finance, Billing Support etc.) User Groups are assigned to Roles (or vice versa). |
User Notification Rule | Specifies the User that will be notified of generated alarms or workflow tasks based on a set of rules. |
User Task | A User Task is generated when a User Transition is triggered, sending a task to a specific user for manual processing. |
User Transition Node | A User Transition generates a user task to be processed. The details of how the user task will be generated are configured in the User Transition. |
Workflow Assignment Rule | When configuring an Alarm Generator, the user can specify the Workflow Model that generated alarms will be passed to using a set of rules. |
Workflow Model | Workflow Models are the graphical representation of manual and automated tasks that should be completed to resolve an alarm. |
Workflow Node | Any node on a workflow model, i.e. a Place or a Transition. |
Workflow Status | When a Workflow Task is generated, a status can be assigned to it. This allows the task to be tracked through the task management lifecycle as and when its status changes. |
XML Namespace | When parsing an XML response, the XPath on an HTTP Collector determines the node in the XML response that will cause an output record to be generated. In general, XML node names are relative to namespaces, and are referred to by namespace aliases. An XPath can use any XML Namespace alias defined by the HTTP Datasource linked to the HTTP Collector. |
XPath | An XPath expression on an HTTP Collector determines the node in the XML response that will cause an output record to be generated. An XPath is a '/'-separated list of node names. If any of the node names in the XPath use non-default namespaces, these may be defined in XML Namespaces. |
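The quoting rules described under *CSV File* above (fields wrapped in double quotes, embedded quotes doubled) can be illustrated with a minimal sketch using Python's standard `csv` module; the sample data here is invented for illustration and is not PhixFlow-specific.

```python
import csv
import io

# A two-line CSV: a header row, then a record whose second field
# contains an embedded comma and doubled double quotes.
sample = 'id,comment\n1,"Says ""hello"", then leaves"\n'

rows = list(csv.reader(io.StringIO(sample)))

# The doubled quote is read back as a single quote, and the
# embedded comma does not split the field.
print(rows[1])  # ['1', 'Says "hello", then leaves']
```

This matches the behaviour of most CSV consumers: quoting is only required when a field contains a delimiter, quote or whitespace, but is always safe to apply.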
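The interaction between *XPath* and *XML Namespace* entries above can be sketched with Python's standard `xml.etree.ElementTree`. The alias-to-URI mapping here plays the role of the XML Namespace definitions on an HTTP Datasource; the document, alias and node names are invented for illustration.

```python
import xml.etree.ElementTree as ET

# A sample XML response using a namespaced node, 'ord:order'.
doc = """<resp xmlns:ord="http://example.com/orders">
  <ord:order><ord:id>1</ord:id></ord:order>
  <ord:order><ord:id>2</ord:id></ord:order>
</resp>"""

# Alias -> namespace URI, analogous to an XML Namespace definition.
ns = {"ord": "http://example.com/orders"}

root = ET.fromstring(doc)
# The '/'-separated XPath "ord:order" selects each node that would
# produce one output record; "ord:id" reads a field from it.
ids = [order.find("ord:id", ns).text for order in root.findall("ord:order", ns)]
print(ids)  # ['1', '2']
```

Each matched node yields one record, which mirrors how an XPath on an HTTP Collector determines the nodes that generate output records.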