Input templates
System templates
The following properties are common to all templates. They are optional and can be skipped if not needed.
Common properties:
- Error handling: The Enable workflow error handling option can be set to ON to use PRIMEUR system workflows to manage errors. When the toggle switch is enabled, the Select error handling template drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about these templates (including ehwft_logAllVariables) in the Workflow Error Templates section.
- systemEnableLogDebugToFile: Set this toggle button to ON to activate logs.
Trigger type: On Demand
The On Demand trigger does not require any mandatory variable.
On Demand triggered workflow templates are started by a direct API invocation via REST. The API accepts a map of variables that are passed to the instance as process variables. Any variable explicitly declared on the template to be requested when creating the Contract is added to the set of variables coming from the REST invocation. If the same variable name is defined both at Contract creation and in the map passed via REST, the REST value takes precedence over the Contract value.
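The precedence rule can be sketched as a simple map merge. This is a minimal illustration of the documented behaviour; the function name and variable names are hypothetical, not part of the Data One API:

```python
def merge_process_variables(contract_vars: dict, rest_vars: dict) -> dict:
    """Build the process-variable set for a new On Demand instance.

    Contract-declared variables are added first; the variables from the
    REST invocation's map are applied last, so on a name clash the REST
    value wins over the Contract one.
    """
    merged = dict(contract_vars)   # start from Contract-level values
    merged.update(rest_vars)       # REST values override on conflict
    return merged

# Example: 'fileName' is defined in both places -> the REST value wins.
contract_vars = {"fileName": "default.txt", "Cluster": "steng-01"}
rest_vars = {"fileName": "report.csv"}
print(merge_process_variables(contract_vars, rest_vars))
# {'fileName': 'report.csv', 'Cluster': 'steng-01'}
```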
1. SpExec operation
system_onDemand_exec
Goal: System template that performs a remote spExec operation when triggered on demand.
Variables:
- Cluster: Steng cluster where the spExec operation is performed
- executable: Executable command to be run on the remote steng shell
- arguments: Arguments that will be passed to the selected executable command
2. Put file
system_onDemand_putFile
Goal: System template On Demand triggered that will operate a "put" from a folder to a virtual path.
Variables:
- Cluster: Cluster where the operation will be executed
- fileName: Name of the file that will be uploaded to the specified Virtual Path
- Source file path: Path of the source file
- Actor, VFS, VFS Path: Virtual Path where the file will be placed
- File resource: Resource profile to apply during the file put operation
3. Pull file and optionally archive and/or delete it
system_onDemand_pull-archiveOpt-removeOpt
Goal: On-demand triggered System Template that pulls a specified file from a remote directory to the selected destination Virtual Path. Optionally, the pulled file can be archived and/or deleted at the source.
Operation:
- Data One uses a Client Connection to connect to a remote system.
- Data One pulls a specific file based on filename and path.
- Data One optionally removes the original file.
- Data One optionally archives the pulled file to a selected location, optionally adding a timestamp to the filename following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>.
When the archive option is selected, the original file is deleted irrespective of the selected remove option.
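The archive rename pattern can be illustrated with a short sketch. This is an assumption-based illustration of the documented naming pattern, not Data One code; the helper name is hypothetical:

```python
from datetime import datetime

def archived_name(filename: str, now: datetime) -> str:
    """Build the archive name following the documented pattern:
    <original_filename>_yyyyMMddHHmmssSSS.<original_extension>"""
    stem, dot, ext = filename.rpartition(".")
    if not dot:                        # no extension: timestamp only
        stem, ext = filename, ""
    # yyyyMMddHHmmssSSS: date-time down to milliseconds
    ts = now.strftime("%Y%m%d%H%M%S") + f"{now.microsecond // 1000:03d}"
    return f"{stem}_{ts}.{ext}" if ext else f"{stem}_{ts}"

print(archived_name("invoice.xml", datetime(2024, 5, 7, 9, 30, 15, 123000)))
# invoice_20240507093015123.xml
```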
Variables:
- Cluster: Cluster where the operation will be executed
- remotePathFile: Remote path where the file to be pulled is located
- Select remote server: Transfer profile (or Client Connection) to be used to pull the remote file
- fileName: Name of the file to be pulled
- File resource: Resource profile to apply during the pull operation
- Actor, VFS, VFS Path: Actor, VFS and Virtual Path where the file will be placed
- archiveFolder: Archiving folder where the pulled file will be moved
- archiveRenameWithTimestamp: (Required only if the archive folder is set) If enabled, the data file will be archived in the selected folder and renamed with a timestamp (yyyyMMddHHmmssSSS) following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>
- removePulledFile: (Required only if the archive folder is NOT set) If enabled, the pulled data file will be removed at the source after being pulled successfully (source files are always deleted when the archive option is selected)
4. FEL v2
system_onDemand_FEL_v2
Goal: On Demand triggered System Template designed to be fed by the new FEL feature, and eligible for DataFlows too. It is invoked by a running FEL instance and takes care of acquiring a file after FEL has been triggered, according to the FEL configuration. At contract level it asks only for the destination Virtual Path and an optional Resource Profile; all other parameters are managed by the FEL instance itself.
Variables:
- Actor, VFS, VFS Path: Actor, VFS and Virtual Path where the file will be placed
- File resource: Resource profile to apply to the file while it is being written on the destination Virtual Path
Trigger type: Time
A Time triggered workflow template requires a cron-expression.
The instance starts when the time defined in the cron-expression is matched. No variables are added to the flow instance.
Common variables:
- cronExpression
You can either define a Cron expression with the GENERATE button or insert a Cron expression manually. When you click the GENERATE button a new window will appear. In the Generate window you can define a Cron expression by seconds, minutes, hours, days and months. Note that it is possible to configure one option per tab only. When an option is selected in one of the tabs, selecting one of the other options in the same tab will reset all previous settings. A detailed description of the window is available in the How to... configure a CRON expression page.
Note that if no option is defined for one or more tabs, the default option will automatically be set for your Cron expression.
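A few illustrative cron expressions follow, assuming Quartz-style fields (seconds, minutes, hours, day-of-month, month, day-of-week); the exact syntax accepted by Data One is described in the How to... configure a CRON expression page:

```python
# Illustrative Quartz-style cron expressions (assumed syntax; check the
# "How to... configure a CRON expression" page for the exact rules).
examples = {
    "0 0/15 * * * ?": "every 15 minutes, at second 0",
    "0 0 2 * * ?":    "every day at 02:00:00",
    "0 30 8 ? * MON": "every Monday at 08:30:00",
}

for expr, meaning in examples.items():
    fields = expr.split()
    assert len(fields) == 6, expr   # six fields: sec min hour dom month dow
    print(f"{expr!r:20} -> {meaning}")
```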
1. Pull file and optionally archive, delete it and/or specify a flag file
system_time_flagFile-pull-archiveOpt-removeOpt
Goal: Time triggered System Template that pulls a specific file from a remote directory to the selected destination Virtual Path. Optionally, a flag file can be specified so that the data file is pulled only when the presence of the flag file on the specified folder has been verified. The pulled data file can optionally be archived and/or deleted at the source.
Operation:
- Data One uses a Client Connection to connect to a remote system to check the presence of a flag file.
- Data One uses the same Client Connection to connect to the remote system and pull a specific file based on path and filename.
- As an option, the flag file can be removed once the file is successfully pulled.
- Optionally, Data One removes the original file.
- Optionally, Data One archives the pulled file to a selected location.
- Optionally, Data One adds a timestamp to the filename following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>.
When the archive option is selected, the original file is deleted irrespective of the selected remove option.
Variables:
- fileName: Name of the data file to be pulled
- systemWorkflowInstanceSingletonMode: The Single Instance option is present only in time-triggered workflows and its goal is to prevent time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically triggers a different correlation key, which is passed to the Job Manager; the Job Manager starts a new instance only when the previous one has terminated. Note that time-triggered workflows have the Single Instance toggle set to true by default.
- Cluster: Cluster where the LS and pull operations will be executed
- remotePathFile: Remote folder where the data file to be pulled is located
- Select remote server: Transfer profile (or Client Connection) to be used to pull the remote file
- Actor, VFS, VFS path: Virtual Path where the file will be placed
- File resource: Resource profile to apply during the file pull operation
- archiveFolder: If set, the pulled data file will be remotely archived in this folder and deleted from the source after being pulled successfully
- archiveRenameWithTimestamp: (Required only if the archive folder is set) If enabled, the data file will be archived in the selected folder and renamed with a timestamp (yyyyMMddHHmmssSSS) following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>
- removePulledFile: (Required only if the archive folder is NOT set) If enabled, the pulled data file will be removed at the source after being pulled successfully (source files are always deleted when the archive option is selected)
- flagFileName: If set, the data file is pulled only when a flag file with this name is present in the specified folder
- remotePathFlagFile: (Required only if the flag file name is set) Remote folder where the flag file presence will be checked
- removeFlagFile: (Required only if the flag file name is set) If enabled, the flag file will be removed after a successful data file pull
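The flag-file gating described above can be sketched as follows. This is a minimal conceptual illustration; the listing, pull, and remove callables are hypothetical placeholders for the real Client Connection operations:

```python
def gated_pull(list_folder, pull_file, data_file, data_folder,
               flag_file=None, flag_folder=None, remove_flag=False,
               remove_file=lambda folder, name: None):
    """Pull data_file only if the optional flag file is present.

    list_folder(folder) returns the filenames in a remote folder;
    pull_file(folder, name) performs the transfer. Both are placeholders.
    Returns True if the pull was performed, False if the flag was missing.
    """
    if flag_file is not None:
        if flag_file not in list_folder(flag_folder):
            return False                       # flag missing: skip this run
    pull_file(data_folder, data_file)
    if flag_file is not None and remove_flag:
        remove_file(flag_folder, flag_file)    # tidy up the flag file
    return True

# Example: the flag is present, so the data file gets pulled.
pulled = []
ls = lambda folder: {"flags": ["ready.flag"], "data": ["report.csv"]}[folder]
ok = gated_pull(ls, lambda f, n: pulled.append(n), "report.csv", "data",
                flag_file="ready.flag", flag_folder="flags")
print(ok, pulled)   # True ['report.csv']
```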
2. Pull file on a selected Virtual Path
system_time_remoteLs-loop-pull
Goal: At regular intervals, this template polls a remote directory and sends each file found to a selected destination Virtual Path.
Processed files can be filtered by:
- filename
- extension
Notes: Directories are skipped. Only the files in the specified remote directory are processed.
Data Watcher will show one flow with all the files found by LS and pulled.
Variables:
- systemWorkflowInstanceSingletonMode: The Single Instance option is present only in time-triggered workflows and its goal is to prevent time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically triggers a different correlation key, which is passed to the Job Manager; the Job Manager starts a new instance only when the previous one has terminated. Note that time-triggered workflows have the Single Instance toggle set to true by default.
- Cluster: Cluster where the LS and pull operations will be executed
- Select remote server: Remote server where the polled directory is located (each file found in the directory is pulled to the selected destination Virtual Path)
- spLs_remotePath: Remote path that the remote LS operation will scan looking for files to pull
- File resource: Resource profile to apply during the file pull operation
- Actor, VFS, VFS path: Actor, VFS and Virtual Path where the file will be placed
- spLs_fileNameFilter: Filter by file name on files returned by the remote LS. Note: the filter works in “filename contains word” mode
- spLs_fileExtensionFilter: Filter by file extension on files returned by the remote LS. Note: the filter works in “extension contains word” mode
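The “contains word” semantics of spLs_fileNameFilter and spLs_fileExtensionFilter can be sketched like this, assuming simple substring matching against the base name and the extension (an illustration of the documented behaviour, not Data One code):

```python
def filter_ls_results(entries, name_filter=None, ext_filter=None):
    """Keep files whose name/extension CONTAIN the given filter strings."""
    kept = []
    for entry in entries:
        stem, dot, ext = entry.rpartition(".")
        if not dot:                    # no dot: the whole entry is the name
            stem, ext = entry, ""
        if name_filter and name_filter not in stem:
            continue
        if ext_filter and ext_filter not in ext:
            continue
        kept.append(entry)
    return kept

files = ["orders_2024.csv", "orders.bak", "readme.txt"]
print(filter_ls_results(files, name_filter="orders"))
# ['orders_2024.csv', 'orders.bak']
print(filter_ls_results(files, ext_filter="csv"))
# ['orders_2024.csv']
```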
3. Put file
system_time_putFile
Goal: Time triggered system template that will operate a "put" from a folder to a virtual path.
Variables:
- systemWorkflowInstanceSingletonMode: The Single Instance option is present only in time-triggered workflows and its goal is to prevent time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically triggers a different correlation key, which is passed to the Job Manager; the Job Manager starts a new instance only when the previous one has terminated. Note that time-triggered workflows have the Single Instance toggle set to true by default.
- Cluster: Cluster where the put operation will be executed
- fileName: Name of the file that will be put
- source: Source file path
- Actor, VFS, VFS Path: Actor, VFS and Virtual Path where the file will be placed
- File Resource: Resource profile to apply during the file put operation
4. Pull file from a remote directory and optionally archive and/or delete it
system_time_remoteLs-loop-pull-archiveOpt-removeOpt
Goal: At regular intervals, this template polls a remote directory and pulls each file found to the selected destination Virtual Path.
After a file is SUCCESSFULLY pulled, if a remote archive folder was specified, the file is MOVED from its original position to the archive folder and the process ends.
If a remote archive folder was NOT specified and the REMOVE option was set at contract level, the file is remotely DELETED after a successful pull.
Processed files can be filtered by filename or extension.
Notes: Directories are skipped, only files in the specified remote directory are processed.
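The post-pull handling described above reduces to a small decision, sketched here with placeholder move/delete callables (not actual Data One operations); note that archiving wins over the remove option:

```python
def after_successful_pull(filename, archive_folder=None, remove=False,
                          move_remote=lambda n, d: f"moved {n} -> {d}",
                          delete_remote=lambda n: f"deleted {n}"):
    """Archive takes precedence: when an archive folder is set, the file
    is MOVED there, so the original disappears regardless of 'remove'."""
    if archive_folder:
        return move_remote(filename, archive_folder)
    if remove:
        return delete_remote(filename)
    return f"left {filename} in place"

print(after_successful_pull("report.csv", archive_folder="/archive"))
# moved report.csv -> /archive
print(after_successful_pull("report.csv", remove=True))
# deleted report.csv
```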
Variables:
- systemWorkflowInstanceSingletonMode: The Single Instance option is present only in time-triggered workflows and its goal is to prevent time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically triggers a different correlation key, which is passed to the Job Manager; the Job Manager starts a new instance only when the previous one has terminated. Note that time-triggered workflows have the Single Instance toggle set to true by default.
- Cluster: Cluster where the operation will be executed
- Select remote server: Remote server where the polled directory is located (each file found in the directory is pulled to the selected destination Virtual Path)
- spLs_remotePath: Remote path that the remote LS operation will scan looking for files to pull
- File resource: Resource profile to apply during the file pull operation
- Actor, VFS, VFS Path: Actor, VFS and Virtual Path where the file will be placed
- spLs_fileNameFilter: Filter by file name on files returned by the remote LS. Note: the filter works in “filename contains word” mode
- spLs_fileExtensionFilter: Filter by file extension on files returned by the remote LS. Note: the filter works in “extension contains word” mode
- archive_folder: If set, files pulled successfully will be remotely archived in this folder
- spRm_removePulledFile: If set, pulled files not archived via the archive_folder variable will be removed after being pulled successfully
5. Pull file from a remote directory and optionally archive, delete it and/or specify a flag file
system_time_flagFile-remoteLs-loop-pull-archiveOpt-removeOpt
Goal: At regular intervals, this template polls a remote directory. Each file found in the remote directory matching the optional filters on file name and extension will be pulled to the selected destination Virtual Path.
Optionally, a flag file can be specified so that files are pulled only after verifying that the flag file exists in the specified folder. Pulled files can optionally be archived and/or deleted at the source.
Operation:
- Data One connects to a remote system via a Client Connection to check the presence of a flag file.
- Data One uses the same Client Connection to connect to the remote system, reads the contents of a remote folder and pulls files based on filename and extension filters (embedded folders are ignored).
- As an option, the flag file can be removed once the file is pulled successfully.
- Data One optionally removes original files.
- Data One optionally archives pulled files to a selected location.
- Data One optionally adds a timestamp to the filename following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>.
When the archive option is selected, original files are deleted irrespective of the selected remove option.
Variables:
- systemWorkflowInstanceSingletonMode: The Single Instance option is present only in time-triggered workflows and its goal is to prevent time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically triggers a different correlation key, which is passed to the Job Manager; the Job Manager starts a new instance only when the previous one has terminated. Note that time-triggered workflows have the Single Instance toggle set to true by default.
- Cluster: Cluster where the operation will be executed
- Select remote server: Remote server where the polled directory is located (each file found in the directory is pulled to the selected destination Virtual Path)
- spLs_remotePath: Remote path that the remote LS operation will scan looking for files to pull
- spLs_fileNameFilter: Filter by file name on files returned by the remote LS. Note: the filter works in “filename contains word” mode
- spLs_fileExtensionFilter: Filter by file extension on files returned by the remote LS. Note: the filter works in “extension contains word” mode
- Actor, VFS, VFS Path: Actor, VFS and Virtual Path where the file will be placed
- File resource: Resource profile to apply during the file pull operation
- archive_folder: If set, files pulled successfully will be remotely archived in this folder
- archiveRenameWithTimestamp: (Applicable only if the archive folder is set) If enabled, the file will be archived in the selected folder and renamed with a timestamp following this pattern: <original_filename>_yyyyMMddHHmmssSSS.<original_extension>
- removePulledFile: (Applicable only if the archive folder is NOT set) If enabled, the pulled file is removed at the source after being successfully pulled (source files are always deleted when the archive option is selected)
- flagFileName: Flag file name. If set, before trying to pull files, the workflow checks the presence of a flag file with this filename and extension in the selected folder
- remotePathFlagFile: Flag file folder. (Applicable only if the flag file name is set) Remote folder where the presence of the flag file will be verified
- removeFlagFile: (Applicable only if the flag file name is set) If enabled, the flag file will be removed after the files have been successfully pulled
DataFlow templates
These are templates that can be used with DataFlows.
Common variables:
- Error handling: The Enable workflow error handling option can be set to ON to use PRIMEUR system workflows to manage errors. When the toggle switch is enabled, the Select error handling template drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about these templates (including ehwft_logAllVariables) in the Workflow Error Templates section.
- systemEnableLogDebugToFile: Set this toggle button to ON to activate logs.
- systemWorkflowInstanceSingletonMode: The Single Instance option is present only in time-triggered workflows and its goal is to prevent time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically triggers a different correlation key, which is passed to the Job Manager; the Job Manager starts a new instance only when the previous one has terminated. Note that time-triggered workflows have the Single Instance toggle set to true by default.
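The Single Instance behaviour can be sketched as a guard keyed on a correlation key. This is a conceptual illustration only; the key derivation and the Job Manager interface are simplified assumptions, not the Data One implementation:

```python
import hashlib

running = {}   # correlation key -> is an instance still running?

def correlation_key(contract_config: dict) -> str:
    """Any change to the contract configuration yields a different key."""
    canonical = repr(sorted(contract_config.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

def try_start_instance(contract_config: dict) -> bool:
    """Start a new time-triggered instance only if no instance with the
    same correlation key is still running (Single Instance = true)."""
    key = correlation_key(contract_config)
    if running.get(key):
        return False          # previous instance not terminated: skip tick
    running[key] = True
    return True

cfg = {"cronExpression": "0 0/15 * * * ?", "Cluster": "steng-01"}
print(try_start_instance(cfg))   # True  (first tick starts an instance)
print(try_start_instance(cfg))   # False (overlap prevented)
```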
Trigger type: Time
A Time triggered workflow template requires a cron-expression.
The instance starts when the time defined in the cron-expression is matched. No variables are added to the flow instance.
Common variables:
- cronExpression
You can either define a Cron expression with the GENERATE button or insert a Cron expression manually. When you click the GENERATE button a new window will appear. In the Generate window you can define a Cron expression by seconds, minutes, hours, days and months. Note that it is possible to configure one option per tab only. When an option is selected in one of the tabs, selecting one of the other options in the same tab will reset all previous settings. A detailed description of the window is available in the How to... configure a CRON expression page.
Note that if no option is defined for one or more tabs, the default option will automatically be set for your Cron expression.
1. Pull file from a remote directory and optionally archive and/or delete it
df_time_remoteLs-loop-pull-archiveOpt-removeOpt
Goal: This template can be used with DataFlows. At regular intervals, it polls a remote directory and pulls each file to the selected Virtual Path.
Once a file is SUCCESSFULLY pulled, if a remote archive folder is specified, the file is MOVED from its original position to the archive folder and the process ends.
If a remote archive folder is NOT specified and the REMOVE option is set at contract level, the file is DELETED from the remote server after a successful pull.
Processed files can be filtered by filename or extension.
Notes: Directories are skipped, only files in the specified remote directory are processed. Data Watcher will show one flow for every file found by LS and pulled.
Variables:
- Cluster: Cluster where the LS and pull operations will be executed
- Select remote server: Remote server where the polled directory is located (each file found in the directory is pulled to the selected destination Virtual Path)
- spLs_remotePath: Remote path that the remote LS operation will scan looking for files to pull
- File resource: Resource profile to apply during the file pull operation
- Actor, VFS, VFS path: Virtual Path where the file will be placed
- spLs_fileNameFilter: Filter by file name on files returned by the remote LS. Note: the filter works in “filename contains word” mode
- spLs_fileExtensionFilter: Filter by file extension on files returned by the remote LS. Note: the filter works in “extension contains word” mode
- archive_folder: If set, files pulled successfully will be remotely archived in this folder
- spRm_removePulledFile: If set, pulled files not archived via the archive_folder variable will be removed after being pulled successfully
2. Pull file from a remote directory
df_time_remoteLs-loop-pull
Goal: This template can be used with DataFlows. At regular intervals, it polls (scans) a remote directory and sends each file found to a selected destination Virtual Path.
Processed files can be filtered by:
- filename
- extension
Notes: Directories are skipped. Only the files in the specified remote directory are processed.
DataWatcher will show one flow for every file found by LS and pulled.
Variables:
- Cluster: Cluster where the LS and pull operations will be executed
- Select remote server: Remote server where the polled directory is located (each file found in the directory is pulled to the selected destination Virtual Path)
- spLs_remotePath: Remote path that the remote LS operation will scan looking for files to pull
- File resource: Resource profile to apply during the file pull operation
- Actor, VFS, VFS path: Virtual Path where the file will be placed
- spLs_fileNameFilter: Filter by file name on files returned by the remote LS. Note: the filter works in “filename contains word” mode
- spLs_fileExtensionFilter: Filter by file extension on files returned by the remote LS. Note: the filter works in “extension contains word” mode