Primeur Online Docs
Data Mover 1.20

Workflow templates for input contracts 🚀

Added Trigger type: On Message, with the system_onMessage_listener system template.


System templates

The following properties are common to all templates. They are optional and can be skipped if not needed.

Common properties:

  • Error handling The Enable workflow error handling option can be set to ON to use Primeur system workflows to manage errors. When the toggle switch is enabled, the Select error handling template drop-down list appears with two templates: one that logs all variables (ehwft_logAllVariables) and one that notifies errors. You can find details about these templates in the dedicated section.

  • systemEnableLogDebugToFile Set this toggle button to ON to activate logs.

Trigger type: On Demand

The On Demand trigger does not require any mandatory variable. On Demand triggered workflow templates are started by a direct REST API invocation. The API accepts a map of variables that is passed to the instance as process variables. Any variable explicitly declared on the template to be requested when creating the Contract is added to the set of variables coming from the REST invocation. If the same variable name is set both when creating the Contract and in the map passed via REST, the REST value overrides the Contract value.
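The precedence rule can be sketched as a plain map merge; `merge_variables` below is a hypothetical helper for illustration, not a product API:

```python
# Minimal sketch (not product code) of the variable-precedence rule:
# Contract-level variables are merged with the map passed in the REST
# invocation, and the REST value wins on a name clash.
def merge_variables(contract_vars: dict, rest_vars: dict) -> dict:
    merged = dict(contract_vars)   # start from the Contract values
    merged.update(rest_vars)       # REST-supplied values override them
    return merged

print(merge_variables({"env": "prod", "retries": 3}, {"env": "test"}))
# -> {'env': 'test', 'retries': 3}
```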

1. SpExec operation

system_onDemand_exec

Goal: System template that will perform a remote spExec operation when onDemand is triggered.

Variables:

  • systemEnableLogDebugToFile Activate logs. In the input contract, set the toggle button to ON to activate logs.

  • systemErrorHandlingWorkflowTemplateName Enable workflow error handling. In the input contract, when the toggle button is set to ON, the Select error handling template drop-down list appears with two templates: one that logs all variables (ehwft_logAllVariables) and one that notifies errors. You can find details about these templates in the dedicated section.

  • Cluster STeng cluster where the spExec operation is performed.

  • executable Executable command to be run on the remote STeng shell.

  • arguments Arguments that will be passed to the selected executable command.

2. Put file

system_onDemand_putFile

Goal: On Demand system template that performs a "put" from a folder to a virtual path.

Variables:

  • systemEnableLogDebugToFile Activate logs. In the input contract, set the toggle button to ON to activate logs.

  • Cluster Cluster where the operation will be executed.

  • fileName Name of the file that will be uploaded to the specified virtual path.

  • Source file path Path of the source file.

  • Actor, VFS, VFS Path Virtual Path where the file will be placed.

  • File resource Resource profile to apply during the file Put operation.

3. Pull file and optionally archive and/or delete it

system_onDemand_pull-archiveOpt-removeOpt

Goal: On-demand triggered System Template that pulls a specified file from a remote directory to the selected destination Virtual Path. Optionally, the pulled file can be archived and/or deleted at the source.

Operation:

  1. Data Mover uses a Client Connection to connect to a remote system.

  2. Data Mover pulls a specific file based on filename and path.

  3. Data Mover optionally removes the original file.

  4. Data Mover optionally archives the pulled file to a selected location, optionally adding a timestamp to the filename following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>.

When the archive option is selected, the original file is deleted irrespective of the selected remove option.
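Assuming the pattern above, the archive rename can be sketched in Python as follows (a hypothetical helper, not product code):

```python
from datetime import datetime
from pathlib import PurePosixPath

def archive_name(filename: str, now: datetime) -> str:
    """Illustrative sketch of the archive rename pattern
    <original_filename>_yyyyMMddHHmmssSSS.<original_extension>."""
    p = PurePosixPath(filename)
    # yyyyMMddHHmmss plus three millisecond digits (SSS)
    stamp = now.strftime("%Y%m%d%H%M%S") + f"{now.microsecond // 1000:03d}"
    return f"{p.stem}_{stamp}{p.suffix}"

print(archive_name("invoice.csv", datetime(2024, 5, 1, 13, 45, 30, 123000)))
# -> invoice_20240501134530123.csv
```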

Variables:

  • systemEnableLogDebugToFile Activate logs. In the input contract, set the toggle button to ON to activate logs.

  • Cluster Cluster where the operation will be executed.

  • remotePathFile Remote path where the file to be pulled is located.

  • Select remote server Client Connection to be used to pull the remote file.

  • fileName Name of the file to be pulled.

  • File resource Resource profile to apply during the pull operation.

  • Actor, VFS, VFS Path Actor, VFS and Virtual Path where the file will be placed.

  • archiveFolder Archiving folder where the pulled file will be moved.

  • archiveRenameWithTimestamp (Required only if the archive folder is set) If enabled, the data file will be archived on the selected folder and renamed with a timestamp (yyyyMMddHHmmssSSS) following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>

  • removePulledFile (Required only if the archive folder is NOT set) If enabled, the pulled data file will be removed at the source after being pulled successfully (source files are always deleted when the archive option is selected)

4. FEL v2

system_onDemand_FEL_v2

Goal: On Demand triggered system template designed to be fed by the File Event Listener (FEL) feature; it is also eligible for DataFlows. It is invoked by a running FEL instance and acquires the file that triggered the FEL, according to the FEL configuration. At contract level, it asks only for the destination Virtual Path and an optional Resource Profile; all other parameters are managed by the FEL instance itself.

Variables:

  • systemEnableLogDebugToFile Activate logs. In the input contract, set the toggle button to ON to activate logs.

  • Actor, VFS, VFS Path Actor, VFS and Virtual Path where the file will be placed.

  • File resource Resource profile to apply to the file while it is being written on the destination Virtual Path.

🚀 Trigger type: onMessage

1. Aggregating JMS messages into files

system_onMessage_listener

Goal: System template that aggregates JMS messages into files. It is automatically selected when choosing the onMessage trigger type in an Input contract.

Unlike other contract actions, the message-to-file action (M2FAction) is not associated with a workflow template and does not execute workflows. The system_onMessage_listener workflow action is selected automatically when choosing the onMessage trigger type.

Variables:

  • TRANSFER_PROFILE_JMS Select the JMS client connection that will be used to connect to the queue, listen to it, and receive incoming JMS messages as they arrive. In the input contract, this is the Select remote server field.

  • CLUSTER Select the Cluster where the file will be pushed.

  • PARALLELISM The Parallelism parameter in the message to file action of the Input contract defines whether a listener should be activated on each STENG peer.

    • If Parallelism is enabled, a listener is active on each STENG peer, and the STENG parallelism number indicates the parallelism of each Message Listener. Messages are consumed in parallel and their order is not preserved. Additionally, since thresholds might be reached independently by each listener, messages could be aggregated into files in a non-deterministic manner.

  • JMS_DESTINATION The name of the JMS source queue that the listener is listening to.

  • VFS Actor, VFS and virtual path where the file will be put.

  • FILE_RESOURCES (Optional) Resource profile that will be applied when pushing the file. In the input contract, this variable corresponds to the File Resource field.

  • VIRTUAL_BOX Virtual box associated with the file. In the input contract, this variable is in the Select VirtualBox field. A new virtual box can be created by clicking the Select VirtualBox field.
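As a rough illustration of threshold-based aggregation, and of why listeners that reach thresholds independently can group messages non-deterministically, consider this toy sketch (a simple count threshold is an assumption; the actual aggregation criteria are configured in the message-to-file action):

```python
# Toy sketch (not product code): aggregate incoming messages into "files"
# once a count threshold is reached.
def aggregate(messages, max_per_file=3):
    files, current = [], []
    for msg in messages:
        current.append(msg)
        if len(current) >= max_per_file:   # threshold reached: emit a file
            files.append("\n".join(current))
            current = []
    if current:                            # flush the remainder
        files.append("\n".join(current))
    return files

print(aggregate(["m1", "m2", "m3", "m4"]))
# -> ['m1\nm2\nm3', 'm4']
```

With two parallel listeners each consuming a share of the same queue, each listener reaches its own threshold at different times, so the same message stream can be split into different files on different runs.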

Trigger type: Time

A Time-triggered workflow template requires a cron expression. The instance starts when the time defined in the cron expression is matched. No variables are added to the flow instance.
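For reference, cron expressions in the common five-field crontab syntax look like the following (purely illustrative; the exact cron dialect accepted by Data One may differ, see the "... configure a Cron Expression" page of this documentation):

```
*/15 * * * *     # every 15 minutes
0 2 * * *        # every day at 02:00
0 8 * * 1        # every Monday at 08:00
```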


1. Pull file and optionally archive, delete it and/or specify a flag file

system_time_flagFile-pull-archiveOpt-removeOpt

Goal: Time-triggered System Template that pulls a specific file from a remote directory to the selected destination Virtual Path. Optionally, a flag file can be specified so that the data file is pulled only when the presence of the flag file on the specified folder has been verified. The pulled data file can optionally be archived and/or deleted at the source.

Operation:

  1. Data Mover uses a Client Connection to connect to a remote system to check the presence of a flag file.

  2. Data Mover uses the same Client Connection to connect to the remote system and pull a specific file based on path and filename.

  3. As an option, the flag file can be removed once the file is successfully pulled.

  4. Optionally, Data Mover removes the original file.

  5. Optionally, Data Mover archives the pulled file to a selected location.

  6. Optionally, Data Mover adds a timestamp to the filename following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>.

When the archive option is selected, the original file is deleted irrespective of the selected remove option.
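The flag-file gating described in the operation above can be sketched as follows (a minimal sketch with assumed names; `pull_file` stands in for the actual pull operation):

```python
import os

def pull_if_flag_present(flag_path: str, pull_file, remove_flag: bool = False) -> bool:
    """Sketch of flag-file gating: the data file is pulled only when the
    flag file exists; the flag can optionally be removed after the pull."""
    if not os.path.exists(flag_path):
        return False            # flag absent: nothing is pulled this cycle
    pull_file()                 # flag present: pull the data file
    if remove_flag:
        os.remove(flag_path)    # optional cleanup after a successful pull
    return True
```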

Variables:

  • systemWorkflowInstanceSingletonMode The Single Instance option is available only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default for time-triggered workflows), any change applied to a contract automatically generates a different correlation key; the key is passed to the Job Manager, which starts a new instance only when the previous one has terminated.

  • Cluster Cluster where LS and Pull operations will be executed

  • remotePathFile Remote folder where the data file to be pulled is located

  • Select remote server Transfer profile (or Client Connection) to be used to pull the remote file

  • fileName Name of the file to be pulled

  • Actor, VFS, VFS path Virtual Path where the file will be placed

  • File resource Resource profile to apply during the file pull operation

  • archiveFolder If set, the pulled data file will be remotely archived in this folder and deleted from the source after being pulled successfully

  • archiveRenameWithTimestamp (Required only if the archive folder is set) If enabled, the data file will be archived on the selected folder and renamed with a timestamp (yyyyMMddHHmmssSSS) following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>

  • removePulledFile (Required only if the archive folder is NOT set) If enabled, the pulled data file will be removed at the source after being pulled successfully (source files are always deleted when the archive option is selected)

  • flagFileName If set, the data file is pulled only when a flag file with this name is present in the specified folder.

  • remotePathFlagFile (Required only if Flag file name is set) Remote folder where Flag file presence will be checked

  • removeFlagFile (Required only if the Flag file name is set) If enabled, the flag file will be removed after a successful data file pull.
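The single-instance behavior of systemWorkflowInstanceSingletonMode described above can be illustrated with a toy correlation-key derivation; the real key computation is internal to Data One, so this is only a sketch of the idea that any contract change yields a different key:

```python
import hashlib

def correlation_key(contract_config: dict) -> str:
    """Toy sketch: derive a stable key from the contract configuration.
    The Job Manager starts a new instance for a key only when no instance
    with the same key is still running (behavior described in the docs)."""
    blob = repr(sorted(contract_config.items())).encode()
    return hashlib.sha256(blob).hexdigest()[:16]
```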

2. Pull file on a selected Virtual Path

system_time_remoteLs-loop-pull

Goal: At regular intervals, this template polls a remote directory and sends each file found to a selected destination Virtual Path. Processed files can be filtered by:

  1. filename

  2. extension

Notes: Directories are skipped. Only the files in the specified remote directory are processed. Data Watcher will show one flow with all the files found by LS and pulled.

Variables:

  • systemWorkflowInstanceSingletonMode The Single Instance option is available only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default for time-triggered workflows), any change applied to a contract automatically generates a different correlation key; the key is passed to the Job Manager, which starts a new instance only when the previous one has terminated.

  • Cluster Cluster where LS and Pull operations will be executed

  • Select remote server Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)

  • spLs_remotePath Remote path that Ls remote operation will scan looking for files to pull

  • File resource Resource profile to apply during the file Pull operation

  • Actor, VFS, VFS path Actor, VFS and Virtual Path where the file will be placed

  • spLs_fileNameFilter Filter by file name on files from remote LS. Note: the filter works in “filename contains word” mode.

  • spLs_fileExtensionFilter Filter by file extension on files from remote LS. Note: the filter works in “extension contains word” mode.
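The "contains word" filter semantics can be sketched like this (assumption: plain substring matching on the file name and on the extension):

```python
def matches(filename: str, name_filter: str = "", ext_filter: str = "") -> bool:
    """Sketch of the 'contains word' filters: a file passes when its stem
    contains name_filter and its extension contains ext_filter."""
    stem, dot, ext = filename.rpartition(".")
    if not dot:                 # no dot: the whole name is the stem
        stem, ext = filename, ""
    if name_filter and name_filter not in stem:
        return False
    if ext_filter and ext_filter not in ext:
        return False
    return True

print(matches("daily_report.csv", name_filter="report"))  # -> True
print(matches("daily_report.csv", ext_filter="txt"))      # -> False
```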

3. Put file

system_time_putFile

Goal: Time-triggered system template that performs a "put" from a folder to a virtual path.

Variables:

  • systemWorkflowInstanceSingletonMode The Single Instance option is available only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default for time-triggered workflows), any change applied to a contract automatically generates a different correlation key; the key is passed to the Job Manager, which starts a new instance only when the previous one has terminated.

  • Cluster Cluster where the Put operation will be executed

  • fileName Name of the file that will be put

  • source Source file path

  • Actor, VFS, VFS Path Actor, VFS and Virtual Path where the file will be placed

  • File Resource Resource profile to apply during the file Put operation

4. Pull file from a remote directory and optionally archive and/or delete it

system_time_remoteLs-loop-pull-archiveOpt-removeOpt

Goal: At regular intervals, this template polls a remote directory and pulls each file found to the selected destination Virtual Path. After a file is successfully pulled, if a remote archive folder was specified, that file is moved from its original position to the archive folder and the process ends. If no archive folder was specified and the boolean slider "REMOVE" was set at contract level, the file is deleted from the remote server. Processed files can be filtered by filename or extension.

Notes: Directories are skipped, only files in the specified remote directory are processed.
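The post-pull decision described in the goal can be sketched like this (illustrative only; archiving always implies removal at the source):

```python
def post_pull_action(archive_folder=None, remove=False):
    """Sketch of the post-pull decision: archiving takes precedence, and
    the remove flag only matters when no archive folder is set."""
    if archive_folder:
        return "move-to-archive"   # moved away, hence removed from source
    if remove:
        return "delete-at-source"
    return "leave-in-place"

print(post_pull_action(archive_folder="/archive"))  # -> move-to-archive
print(post_pull_action(remove=True))                # -> delete-at-source
print(post_pull_action())                           # -> leave-in-place
```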

Variables:

  • systemWorkflowInstanceSingletonMode The Single Instance option is available only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default for time-triggered workflows), any change applied to a contract automatically generates a different correlation key; the key is passed to the Job Manager, which starts a new instance only when the previous one has terminated.

  • Cluster Cluster where the operation will be executed

  • Select remote server Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)

  • spLs_remotePath Remote path that Ls remote operation will scan looking for files to pull

  • File resource Resource profile to apply during the file Pull operation

  • Actor, VFS, VFS Path Actor, VFS and Virtual Path where the file will be placed

  • spLs_fileNameFilter Filter by file name on files from remote LS. Note: the filter works in “filename contains word” mode.

  • spLs_fileExtensionFilter Filter by file extension on files from remote LS. Note: the filter works in “extension contains word” mode.

  • archive_folder If set, the file pulled successfully will be remotely archived in this folder

  • spRm_removePulledFile If set, the pulled file will be removed after being pulled successfully, if not archived with the archive_folder variable.

5. Pull file from a remote directory and optionally archive, delete it and/or specify a flag file

system_time_flagFile-remoteLs-loop-pull-archiveOpt-removeOpt

Goal: At regular intervals, this template polls a remote directory. Each file found in the file directory matching optional filters on file name and extension will be pulled to the selected destination Virtual Path. Optionally, a flag file can be specified so that files are pulled only after verifying that the flag file exists in the specified folder. Pulled files can optionally be archived and/or deleted at the source.

Operation:

  1. Data Mover connects to a remote system via a Client Connection to check the presence of a flag file.

  2. Data Mover uses the same Client Connection to connect to a remote system, reads contents of a remote folder and pulls files based on filename and extension filters (embedded folders are ignored).

  3. As an option, the flag file can be removed once the file is pulled successfully.

  4. Data Mover optionally removes original files.

  5. Data Mover optionally archives pulled files to a selected location.

  6. Data Mover optionally adds a timestamp to the filename following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>.

When the archive option is selected, original files are deleted irrespective of the selected remove option.

Variables:

  • systemWorkflowInstanceSingletonMode The Single Instance option is available only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default for time-triggered workflows), any change applied to a contract automatically generates a different correlation key; the key is passed to the Job Manager, which starts a new instance only when the previous one has terminated.

  • Cluster Cluster where the operation will be executed

  • Select remote server Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)

  • spLs_remotePath Remote path that Ls remote operation will scan looking for files to pull

  • spLs_fileNameFilter Filter by file name on files from remote LS. Note: the filter works in “filename contains word” mode.

  • spLs_fileExtensionFilter Filter by file extension on files from remote LS. Note: the filter works in “extension contains word” mode.

  • Actor, VFS, VFS Path Actor, VFS and Virtual Path where the file will be placed

  • File resource Resource profile to apply during the file Pull operation

  • archive_folder If set, the file pulled successfully will be remotely archived in this folder

  • archiveRenameWithTimestamp Applicable only if the archive folder is set. When enabled, the file will be archived in the selected folder and renamed with a timestamp (yyyyMMddHHmmssSSS) following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>.

  • removePulledFile Applicable only if the archive folder is not set. If enabled, the pulled file is removed at the source after being successfully pulled (source files are always deleted when the archive option is selected)

  • flagFileName If set, before trying to pull files, the workflow will check the presence of a flag file with this filename and extension in the selected folder.

  • remotePathFlagFile Applicable only if the Flag file name is set. Remote folder where the presence of the Flag file will be verified.

  • removeFlagFile Applicable only if the Flag file name is set. If enabled, the Flag file will be removed after successfully pulling the file.

DataFlow templates

These are templates that can be used with DataFlows.

Common variables:

  • systemEnableLogDebugToFile Set this toggle button to ON to activate logs.

  • systemWorkflowInstanceSingletonMode The Single Instance option is available only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default for time-triggered workflows), any change applied to a contract automatically generates a different correlation key; the key is passed to the Job Manager, which starts a new instance only when the previous one has terminated.

Trigger type: Time

A Time-triggered workflow template requires a cron expression. The instance starts when the time defined in the cron expression is matched. No variables are added to the flow instance.

1. Pull file from a remote directory and optionally archive and/or delete it

df_time_remoteLs-loop-pull-archiveOpt-removeOpt

Goal: This template can be used with DataFlows. At regular intervals, it polls a remote directory and pulls each file to the selected Virtual Path. After a file is successfully pulled, if a remote archive folder is specified, the file is moved from its original position to the archive folder and the process ends. If no archive folder is specified and the boolean slider "REMOVE" is set at contract level, the file is deleted from the remote server. Processed files can be filtered by filename or extension.

Notes: Directories are skipped, only files in the specified remote directory are processed. Data Watcher will show one flow for every file found by LS and pulled.

Variables:

  • Cluster Cluster where LS and Pull operations will be executed.

  • Select remote server Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)

  • spLs_remotePath Remote path that Ls remote operation will scan looking for files to pull

  • File resource Resource profile to apply during the file pull operation

  • Actor, VFS, VFS path Virtual Path where the file will be placed

  • spLs_fileNameFilter Filter by file name on files from remote LS. Note: the filter works in “filename contains word” mode.

  • spLs_fileExtensionFilter Filter by file extension on files from remote LS. Note: the filter works in “extension contains word” mode.

  • archive_folder If set, the file pulled successfully will be remotely archived in this folder

  • spRm_removePulledFile If set, the pulled file will be removed after being pulled successfully, if not archived with the archive_folder variable.

2. Pull file from a remote directory

df_time_remoteLs-loop-pull

Goal: This template can be used with DataFlows. At regular intervals, it polls (scans) a remote directory and sends each file found to a selected destination Virtual Path. Processed files can be filtered by:

  1. filename

  2. extension

Notes: Directories are skipped. Only the files in the specified remote directory are processed. Data Watcher will show one flow for every file found by LS and pulled.

Variables:

  • Cluster: Cluster where the LS and Pull operations will be executed.

  • Select remote server: Remote server where the polled directory is located. Each file found in the directory is pulled to the selected destination Virtual Path.

  • spLs_remotePath: Remote path that the remote LS operation will scan, looking for files to pull.

  • File resource: Resource profile to apply during the file pull operation.

  • Actor, VFS, VFS path: Virtual Path where the file will be placed.

  • spLs_fileNameFilter: Filter by file name on the files returned by the remote LS. Note: the filter works in “filename contains word” mode.

  • spLs_fileExtensionFilter: Filter by file extension on the files returned by the remote LS. Note: the filter works in “extension contains word” mode.

systemErrorHandlingWorkflowTemplateName Enable workflow error handling. In the input contract, setting the toggle button to ON makes the Select error handling template drop-down list appear with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft_logAllVariables template and the error notification template in the Workflow Error Templates section.

This system template automatically creates a Message listener. When defining a message-to-file action (M2FAction) within an input contract with the OnMessage trigger, a Message listener is automatically created. A JMS client connection must exist to connect to and listen on a specific queue and to receive incoming JMS messages as they arrive. Upon receiving messages, the Message listener applies the selected Message Processing Rule to aggregate messages into files. Aggregated files are created in the specified Virtual Path. If multiple actions are configured in the Input contract, a Message listener is activated for each action.

The JMS connector's documentation can be found in the JMS Connector section.

If Parallelism is disabled, only one listener is active on a single STENG peer, and messages are consumed and aggregated according to the Basic thresholds parameter set in the Message Processing Rule, while preserving their order. If you need to be sure which messages make up a file (for example, the first three), you should disable parallelism. With parallelism disabled, a single listener queues the messages one at a time and creates the file on Data One when the configured threshold of the aggregation policy is reached. This is the only way to be sure that the file was created with the first n messages queued.
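With parallelism disabled, the single-listener behaviour amounts to draining the queue in arrival order and cutting a file each time the threshold is reached. The sketch below illustrates that ordering guarantee with a simple message-count threshold; the function name and count-based cut are illustrative assumptions, since Data One's actual thresholds are whatever the Message Processing Rule configures:

```python
from queue import Queue

def aggregate_in_order(queue, threshold):
    """Drain a message queue with a single consumer, grouping every
    `threshold` messages into one "file" while preserving arrival order.
    Illustrative only: real aggregation policies may also cut on size
    or time thresholds, not just message count."""
    files, current = [], []
    while not queue.empty():
        current.append(queue.get())   # one message at a time -> order is kept
        if len(current) == threshold: # threshold reached: emit the file
            files.append(current)
            current = []
    if current:                       # leftover messages below the threshold
        files.append(current)
    return files

q = Queue()
for msg in ["m1", "m2", "m3", "m4", "m5"]:
    q.put(msg)
print(aggregate_in_order(q, threshold=3))
# [['m1', 'm2', 'm3'], ['m4', 'm5']]
```

With multiple parallel listeners, each would drain the queue independently, so no single consumer sees the first n messages in order, which is why parallelism must be disabled for this guarantee.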

MSG_PROCESSING_RULE_M2F Select the aggregation Message Processing Rule to be applied. It must be configured in Setup → Message processing rule.

cronExpression You can either define a Cron expression with the GENERATE button or insert a Cron expression manually. When you click the GENERATE button, a new window will appear. In the Generate window, you can define a Cron expression by seconds, minutes, hours, days, and months. Note that it is possible to configure only one option per tab. When an option is selected in one of the tabs, selecting another option in the same tab will reset all previous settings. A detailed description of the window is available in the How to... configure a CRON expression page. Note that if no option is defined for one or more tabs, the default option will automatically be set for your Cron expression.

Error handling The Enable workflow error handling option can be set to ON to use PRIMEUR system workflows to manage errors. Enabling the toggle switch makes the Select error handling template drop-down list appear with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft_logAllVariables template and the error notification template in the Workflow Error Templates section.

cronExpression You can either define a Cron expression with the GENERATE button or insert a Cron expression manually. When you click the GENERATE button, a new window will appear. In the Generate window, you can define a Cron expression by seconds, minutes, hours, days, and months. Note that it is possible to configure only one option per tab. When an option is selected in one of the tabs, selecting another option in the same tab will reset all previous settings. A detailed description of the window is available in the How to... configure a CRON expression page. Note that if no option is defined for one or more tabs, the default option will automatically be set for your Cron expression.
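A manually inserted expression follows the same field order the Generate window exposes (seconds, minutes, hours, days, months). As an illustration, assuming a Quartz-like six-field layout (the sample expression and helper below are hypothetical, not part of the product):

```python
# Split a Quartz-style cron expression into its named fields.
# The expression is only an example: fire at second 0, every 15 minutes,
# between 08:00 and 18:59, every day of every month.
FIELD_NAMES = ["seconds", "minutes", "hours", "day_of_month", "month", "day_of_week"]

def cron_fields(expr):
    parts = expr.split()
    if len(parts) != len(FIELD_NAMES):
        raise ValueError("expected 6 space-separated fields")
    return dict(zip(FIELD_NAMES, parts))

print(cron_fields("0 0/15 8-18 * * ?"))
```

Leaving a tab at its default in the Generate window corresponds to a wildcard (`*`) in the matching field.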
