# Workflow templates for input contracts

The following properties are common to all templates. All of them are optional and can be skipped if not needed.

Common properties:

* **Error handling**\
  The **Enable workflow error handling** option can be set to ON to use PRIMEUR system workflows to manage errors. When the toggle switch is enabled, the **Select error handling template** drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft\_logAllVariables template and the error notification template in the [Error Handling Workflow Templates](/data-mover-1.21/workflow-templates/error-handling-workflow-templates.md) section.
* **systemEnableLogDebugToFile**\
  Set this toggle button to ON to activate logs.

## **Trigger type: On Demand**

The On Demand trigger does not require any mandatory variable.

Workflow templates triggered On-Demand are initiated through a direct REST API call. This API accepts a map of variables that are passed to the workflow instance as process variables.

Any variable declared in the template as required when creating the contract is added to the set of variables provided via the REST invocation. If a variable name appears both in the contract and in the map passed via REST, the value from the REST invocation takes precedence.
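The merge-and-precedence rule can be sketched as a plain map merge (an illustrative sketch only; the variable names below are hypothetical and the actual payload shape is product-specific):

```python
# Variables declared in the contract, merged with those passed in the
# REST invocation. When a name appears in both, the REST value wins.
def merge_process_variables(contract_vars: dict, rest_vars: dict) -> dict:
    # dict unpacking: later entries override earlier ones
    return {**contract_vars, **rest_vars}

contract_vars = {"Cluster": "cluster-A", "fileName": "report.csv"}
rest_vars = {"fileName": "report_2024.csv"}  # overrides the contract value

merged = merge_process_variables(contract_vars, rest_vars)
# merged == {"Cluster": "cluster-A", "fileName": "report_2024.csv"}
```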

### **system\_onDemand\_exec**

**Goal:** System template with trigger [OnDemand](/data-mover-1.21/workflow-templates/triggers/ondemand.md) that performs a remote spExec operation.

**Variables:**

* **systemEnableLogDebugToFile**\
  In the input contract, set this toggle button to ON to activate logs.
* **systemErrorHandlingWorkflowTemplateName**\
  Enables workflow error handling. In the input contract, when the toggle button is set to ON, the **Select error handling template** drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft\_logAllVariables template and the error notification template in the [Error Handling Workflow Templates](/data-mover-1.21/workflow-templates/error-handling-workflow-templates.md) section.
* **Cluster**\
  STeng cluster where the spExec operation is performed.
* **executable**\
  Executable command to be run on the remote STeng shell.
* **arguments** \
  Arguments that will be passed to the selected executable command.

### **system\_onDemand\_putFile**

**Goal:** System template with trigger [OnDemand](/data-mover-1.21/workflow-templates/triggers/ondemand.md) that uploads a file to a virtual path.

**Variables:**

* **systemEnableLogDebugToFile**\
  In the input contract, set this toggle button to ON to activate logs.
* **systemErrorHandlingWorkflowTemplateName**\
  Enables workflow error handling. In the input contract, when the toggle button is set to ON, the **Select error handling template** drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft\_logAllVariables template and the error notification template in the [Error Handling Workflow Templates](/data-mover-1.21/workflow-templates/error-handling-workflow-templates.md) section.
* **Cluster**\
  Cluster where the operation will be executed.
* **fileName**\
  Name of the file that will be uploaded to the specified virtual path.
* **Source file path** \
  Path of the source file to be uploaded.
* **Actor, VFS, VFS Path**\
  Virtual Path where the file will be placed.
* **File resource**\
  Resource profile to apply during the file Put operation.

### **system\_onDemand\_pull-archiveOpt-removeOpt**

**Goal**: System template with trigger [OnDemand](/data-mover-1.21/workflow-templates/triggers/ondemand.md) that pulls a specified file from a remote directory to the selected destination Virtual Path. Optionally, the pulled file can be archived and/or deleted at the source.

**Operations**:

1. Data Mover uses a Client Connection to connect to a remote system.
2. Data Mover pulls a specific file based on filename and path.
3. Data Mover optionally removes the original file.
4. Data Mover optionally archives the pulled file to a selected location, optionally adding a timestamp to the filename following this pattern: \<original\_filename>\_yyyyMMddHHmmssSSS.\<original\_extension>.

When the archive option is selected, the original file is deleted irrespective of the selected remove option.
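The archive-renaming pattern above can be reproduced as follows (a minimal sketch; Data Mover applies the pattern internally, this only illustrates the resulting name):

```python
from datetime import datetime
from pathlib import PurePosixPath

def archive_name(filename: str, now: datetime) -> str:
    """Build <original_filename>_yyyyMMddHHmmssSSS.<original_extension>."""
    p = PurePosixPath(filename)
    # yyyyMMddHHmmss plus the first three digits of the microseconds (SSS)
    stamp = now.strftime("%Y%m%d%H%M%S") + f"{now.microsecond // 1000:03d}"
    return f"{p.stem}_{stamp}{p.suffix}"

ts = datetime(2024, 5, 17, 9, 30, 5, 123000)
print(archive_name("invoice.xml", ts))  # invoice_20240517093005123.xml
```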

**Variables**:

* **systemEnableLogDebugToFile**\
  In the input contract, set this toggle button to ON to activate logs.
* **systemErrorHandlingWorkflowTemplateName**\
  Enables workflow error handling. In the input contract, when the toggle button is set to ON, the **Select error handling template** drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft\_logAllVariables template and the error notification template in the [Error Handling Workflow Templates](/data-mover-1.21/workflow-templates/error-handling-workflow-templates.md) section.
* **Cluster**\
  Cluster where the operation will be executed.
* **remotePathFile**\
  Remote path where the file to be pulled is located.
* **Select remote server**\
  Client Connection to be used to pull the remote file.
* **fileName**\
  Name of the file to be pulled.
* **File resource**\
  Resource profile to apply during the pull operation.
* **Actor, VFS, VFS Path**\
  Actor, VFS and Virtual Path where the file will be placed.
* **archiveFolder** \
  Archiving folder where the pulled file will be moved.
* **archiveRenameWithTimestamp** \
  (Required only if the archive folder is set) If enabled, the data file will be archived on the selected folder and renamed with a timestamp (yyyyMMddHHmmssSSS) following this pattern: \<original\_filename>\_yyyyMMddHHmmssSSS.\<original\_extension>
* **removePulledFile** (Required only if the archive folder is NOT set) If enabled, the pulled data file will be removed at the source after being pulled successfully (source files are always deleted when the archive option is selected)

**Limitation**: This workflow template cannot be used with the following protocols: POP3, IMAP, SMTP, JMS, PR4/S, PR5/S.

### **system\_onDemand\_FEL\_v2**

**Goal**: System template with trigger [OnDemand](/data-mover-1.21/workflow-templates/triggers/ondemand.md) invoked by a running [FEL](/data-mover-1.21/file-event-listener/what-is-the-file-event-listener.md) instance that acquires a file. It requires only the destination virtual path where the file will be placed and an optional Resource Profile. All other parameters are managed by the FEL instance itself.

**Variables**:

* **systemEnableLogDebugToFile**\
  In the input contract, set this toggle button to ON to activate logs.
* **systemErrorHandlingWorkflowTemplateName**\
  Enables workflow error handling. In the input contract, when the toggle button is set to ON, the **Select error handling template** drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft\_logAllVariables template and the error notification template in the [Error Handling Workflow Templates](/data-mover-1.21/workflow-templates/error-handling-workflow-templates.md) section.
* **Actor, VFS, VFS Path**\
  Actor, VFS and Virtual Path where the file will be placed.
* **File resource**\
  Resource profile to apply to the file while it is being written on the destination Virtual Path.

## **Trigger type: onMessage**

### **system\_onMessage\_listener**

**Goal:** System template automatically selected when choosing the onMessage trigger type in an Input contract that aggregates JMS messages into files.

{% hint style="warning" %}
Unlike other contract actions, the message-to-file action (M2FAction) is **not associated with a workflow template** and does not execute workflows. The system\_onMessage\_listener workflow action is selected automatically when choosing the onMessage trigger type.
{% endhint %}

This system template automatically creates a [Message listener](/data-mover-1.21/transfer-protocols-and-connectors/jms-connector/aggregate-jms-messages-into-files/message-listener.md). When defining a message-to-file action (M2FAction) within an input contract with trigger OnMessage, a Message listener is automatically created. A [JMS client connection](/data-mover-1.21/transfer-protocols-and-connectors/client-connections/client-connection-jms.md) must exist to connect to and listen on a specific queue and to receive incoming JMS messages as they arrive. Upon receiving messages, the Message listener applies the selected [Message Processing Rule](/data-mover-1.21/transfer-protocols-and-connectors/jms-connector/message-processing-rules.md) to aggregate messages into files. Aggregated files are created in the specified Virtual Path. If multiple actions are configured in the Input contract, a Message listener is activated for each action.

{% hint style="info" %}
The JMS connector's documentation can be found in the [JMS Connector](/data-mover-1.21/transfer-protocols-and-connectors/jms-connector.md) section.
{% endhint %}

**Variables:**

* **TRANSFER\_PROFILE\_JMS**\
  Select the JMS client connection that will be used to connect to the queue, listen to it, and receive incoming JMS messages as they arrive. In the input contract, this is the **Select remote server** field.
* **CLUSTER**\
  Select the Cluster where the file will be pushed.
* **PARALLELISM**\
  The Parallelism parameter in the message-to-file action of the Input contract defines whether a listener should be activated on each STENG peer.
  * **If Parallelism is enabled**, a listener is active on each STENG peer and **STENG parallelism number** indicates the parallelism of each Message Listener. Messages are consumed in parallel and their order is not preserved. Additionally, since thresholds might be reached independently by each listener, messages could be aggregated into files in a non-deterministic manner.
  * **If Parallelism is disabled**, only one listener is active in one single STENG peer and messages are consumed and aggregated according to the **Basic thresholds** parameter set in the [Message Processing Rule](/data-mover-1.21/transfer-protocols-and-connectors/jms-connector/message-processing-rules.md) while preserving their order.\
    If you need to be sure which messages make up a file (for example, the first three), you should disable parallelism. By disabling it, there is only one listener that queues the messages one at a time and creates the file on Data One when the configured threshold of the aggregation policy is reached. This is the only way to be sure that the file was created with the first n messages queued.
* **JMS\_DESTINATION**\
  The name of the JMS source queue that the listener is listening to.
* **JMS\_DESTINATION\_TYPE**\
  Enter QUEUE or JNDI.
* **MSG\_PROCESSING\_RULE\_M2F**\
  Select the aggregation Message Processing Rule to be applied. It must be configured in **Setup** → [Message processing rule](/data-mover-1.21/transfer-protocols-and-connectors/jms-connector/message-processing-rules.md).
* **VFS**\
  Actor, VFS and virtual path where the file will be put.
* **FILE\_RESOURCES** (Optional) \
  Resource profile that will be applied when pushing the file. In the input contract, this variable corresponds to the **File Resource** field.
* **VIRTUAL\_BOX**\
  Virtual box associated with the file. In the input contract, this variable is in the **Select VirtualBox** field. A new virtual box can be created by clicking the **Select VirtualBox** field.
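The single-listener aggregation described for the **PARALLELISM** variable can be sketched as follows (an illustrative sketch assuming a simple message-count threshold; actual Message Processing Rules support other threshold types as well):

```python
def aggregate_messages(messages, count_threshold):
    """Group consecutive messages into 'files' of count_threshold each.

    With a single listener (Parallelism disabled) message order is
    preserved, so the first file always contains the first n messages.
    """
    files, current = [], []
    for msg in messages:
        current.append(msg)
        if len(current) >= count_threshold:
            files.append(current)
            current = []
    if current:  # leftover messages below the threshold
        files.append(current)
    return files

print(aggregate_messages(["m1", "m2", "m3", "m4", "m5"], 2))
# [['m1', 'm2'], ['m3', 'm4'], ['m5']]
```

With Parallelism enabled, several listeners would each run this loop independently, which is why file composition becomes non-deterministic.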

## **Trigger type: Time**

A Time-triggered workflow template requires a cron expression.\
The instance starts when the time defined in the cron expression is matched. No variables are added to the flow instance.

Common variables:

* **cronExpression**\
  You can either define a Cron expression with the **GENERATE** button or insert a Cron expression manually. When you click the **GENERATE** button a new window will appear. In the **Generate** window, you can define a Cron expression by seconds, minutes, hours, days, and months. Note that it is possible to configure one option per tab only. When an option is selected in one of the tabs, selecting one of the other options in the same tab will reset all previous settings. A detailed description of the window is available in the [How to... configure a CRON expression](/data-mover-1.21/how-to.../...-configure-a-cron-expression.md) page.\
  **Note** that if no option is defined for one or more tabs, the default option will automatically be set for your Cron expression.
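For reference, a few sample expressions are shown below (assuming a Quartz-style, seconds-first syntax, consistent with the seconds/minutes/hours/days/months tabs of the Generate window; use the **GENERATE** button to produce the exact format expected by the product):

```
0 0/15 * * * ?     every 15 minutes
0 0 2 * * ?        every day at 02:00
0 30 8 ? * MON     every Monday at 08:30
```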

### **system\_time\_flagFile-pull-archiveOpt-removeOpt**

**Goal**: Time-triggered System Template that pulls a specific file from a remote directory to the selected destination Virtual Path. Optionally, a flag file can be specified so that the data file is pulled only when the presence of the flag file on the specified folder has been verified. The pulled data file can optionally be archived and/or deleted at the source.

**Operations**:

1. Data Mover uses a Client Connection to connect to a remote system to check the presence of a flag file.
2. Data Mover uses the same Client Connection to connect to the remote system and pull a specific file based on path and filename.
3. As an option, the flag file can be removed once the file is successfully pulled.
4. Optionally, Data Mover removes the original file.
5. Optionally, Data Mover archives the pulled file to a selected location.
6. Optionally, Data Mover adds a timestamp to the filename following this pattern: \<original\_filename>\_yyyyMMddHHmmssSSS.\<original\_extension>.

When the archive option is selected, the original file is deleted irrespective of the selected remove option.
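Steps 1–3 can be sketched with a local-filesystem stand-in (an illustrative sketch; the real template performs these operations over the selected Client Connection):

```python
import shutil
from pathlib import Path
from typing import Optional

def pull_with_flag(data_file: Path, flag_file: Optional[Path],
                   dest_dir: Path, remove_flag: bool = False) -> bool:
    """Pull data_file into dest_dir only once the flag file is present."""
    if flag_file is not None and not flag_file.exists():
        return False  # flag not there yet: nothing is pulled on this run
    shutil.copy2(data_file, dest_dir / data_file.name)
    if flag_file is not None and remove_flag:
        flag_file.unlink()  # optional step 3: remove the flag after the pull
    return True
```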

**Variables**:

* **systemWorkflowInstanceSingletonMode**\
  The **Single Instance** option is present only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically generates a different correlation key, which is passed to the Job Manager; a new instance is started only when the previous one has terminated.\
  **Note** that time-triggered workflows have the **Single Instance** toggle set to true by default.
* **Cluster**\
  Cluster where LS and Pull operations will be executed
* **remotePathFile**\
  Remote folder where the data file to be pulled is located
* **Select remote server**\
  Transfer profile (or Client Connection) to be used to pull the remote file
* **fileName**\
  Name of the file to be pulled
* **Actor, VFS, VFS path**\
  Virtual Path where the file will be placed
* **File resource**\
  Resource profile to apply during the file pull operation
* **archiveFolder** \
  If set, the pulled data file will be remotely archived in this folder and deleted from the source after being pulled successfully
* **archiveRenameWithTimestamp** (Required only if the archive folder is set) If enabled, the data file will be archived on the selected folder and renamed with a timestamp (yyyyMMddHHmmssSSS) following this pattern: \<original\_filename>\_yyyyMMddHHmmssSSS.\<original\_extension>
* **removePulledFile** (Required only if the archive folder is NOT set) If enabled, the pulled data file will be removed at the source after being pulled successfully (source files are always deleted when the archive option is selected)
* **flagFileName**\
  Optional. If set, the workflow checks for a flag file with this name in the specified folder and pulls the data file only when the flag file is present.
* **remotePathFlagFile**\
  (Required only if the flag file name is set) Remote folder where the presence of the flag file will be checked.
* **removeFlagFile**\
  (Required only if the flag file name is set) If enabled, the flag file will be removed after a successful data file pull.
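The correlation-key behavior described for **systemWorkflowInstanceSingletonMode** can be illustrated as follows (a hypothetical sketch: the actual key derivation is internal to the product, this only shows that any contract change maps to a new key):

```python
import hashlib
import json

def correlation_key(contract_config: dict) -> str:
    """Derive a stable key from the contract settings: any change yields a new key."""
    canonical = json.dumps(contract_config, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

cfg = {"fileName": "data.csv", "remotePathFile": "/inbound"}
unchanged = correlation_key(cfg)
changed = correlation_key({**cfg, "remotePathFile": "/inbound/new"})
# unchanged != changed: the modified contract maps to a new singleton series
```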

**Limitation**: This workflow template cannot be used with the following protocols: POP3, IMAP, SMTP, JMS, PR4/S, PR5/S.

### **system\_time\_remoteLs-loop-pull**

**Goal**: At regular intervals, this template polls a remote directory and sends each file found to a selected destination Virtual Path.\
Processed files can be filtered by:

1. filename
2. extension

**Notes**: Directories are skipped. Only the files in the specified remote directory are processed.\
Data Watcher will show one flow with all the files found by LS and pulled.

**Variables**:

* **systemWorkflowInstanceSingletonMode**\
  The **Single Instance** option is present only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically generates a different correlation key, which is passed to the Job Manager; a new instance is started only when the previous one has terminated.\
  **Note** that time-triggered workflows have the **Single Instance** toggle set to true by default.
* **Cluster**\
  Cluster where LS and Pull operations will be executed
* **Select remote server**\
  Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)
* **spLs\_remotePath**\
  Remote path that the remote Ls operation will scan looking for files to pull
* **File resource**\
  Resource profile to apply during the file Pull operation
* **Actor, VFS, VFS path**\
  Actor, VFS and Virtual Path where the file will be placed
* **spLs\_fileNameFilter**\
  Filter by file name on files from remote LS\
  **Note**: the filter works in “filename contains word” mode
* **spLs\_fileExtensionFilter**\
  Filter by file extension on files from remote Ls\
  **Note**: the filter works in “extension contains word” mode
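The “contains word” behavior of the two filters can be sketched as follows (an illustrative sketch; whether the name filter matches the base name or the full filename is an assumption here):

```python
from pathlib import PurePosixPath

def matches(filename: str, name_filter: str = "", ext_filter: str = "") -> bool:
    """spLs-style filters: substring match on the name and on the extension."""
    p = PurePosixPath(filename)
    # empty filters match everything, mirroring optional contract fields
    return (name_filter in p.stem) and (ext_filter in p.suffix.lstrip("."))

names = ["orders_2024.csv", "orders.txt", "inventory.csv"]
print([n for n in names if matches(n, name_filter="orders", ext_filter="csv")])
# ['orders_2024.csv']
```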

**Limitation**: This workflow template cannot be used with the following protocols: POP3, IMAP, SMTP, JMS, PR4/S, PR5/S.

### **system\_time\_putFile**

**Goal**: Time-triggered system template that performs a Put operation from a folder to a Virtual Path.

**Variables**:

* **systemWorkflowInstanceSingletonMode**\
  The **Single Instance** option is present only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically generates a different correlation key, which is passed to the Job Manager; a new instance is started only when the previous one has terminated.\
  **Note** that time-triggered workflows have the **Single Instance** toggle set to true by default.
* **Cluster**\
  Cluster where the operation will be executed
* **fileName**\
  Name of the file that will be put to the Virtual Path
* **source**\
  Source file path
* **Actor, VFS, VFS Path**\
  Actor, VFS and Virtual Path where the file will be placed
* **File Resource**\
  Resource profile to apply during the file Put operation

### **system\_time\_remoteLs-loop-pull-archiveOpt-removeOpt**

**Goal**: At regular intervals, this template polls a remote directory and pulls each file found to the selected destination Virtual Path.\
After a file is SUCCESSFULLY pulled, if a remote folder for the archive was specified, that file will be MOVED from its original position to the archive folder. Then the process ends.\
After a file has been SUCCESSFULLY pulled, if a remote folder for the archive was NOT specified and the boolean slider "REMOVE" was set at the contract level, the file will be remotely DELETED.\
Processed files can be filtered by filename or extension.

**Notes**: Directories are skipped, only files in the specified remote directory are processed.
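The post-pull handling described in the Goal (archive takes precedence over remove) can be sketched with a local-filesystem stand-in (illustrative only; the template performs these steps on the remote server):

```python
import shutil
from pathlib import Path
from typing import Optional

def after_pull(pulled: Path, archive_folder: Optional[Path] = None,
               remove: bool = False) -> str:
    """Post-pull handling: MOVE to the archive folder if set, else optionally DELETE."""
    if archive_folder is not None:
        archive_folder.mkdir(parents=True, exist_ok=True)
        shutil.move(str(pulled), str(archive_folder / pulled.name))
        return "archived"
    if remove:
        pulled.unlink()
        return "removed"
    return "kept"
```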

**Variables**:

* **systemWorkflowInstanceSingletonMode**\
  The **Single Instance** option is present only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically generates a different correlation key, which is passed to the Job Manager; a new instance is started only when the previous one has terminated.\
  **Note** that time-triggered workflows have the **Single Instance** toggle set to true by default.
* **Cluster**\
  Cluster where the operation will be executed
* **Select remote server**\
  Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)
* **spLs\_remotePath**\
  Remote path that the remote Ls operation will scan looking for files to pull
* **File resource**\
  Resource profile to apply during the file Pull operation
* **Actor, VFS, VFS Path**\
  Actor, VFS and Virtual Path where the file will be placed
* **spLs\_fileNameFilter**\
  Filter by file name on files from remote LS\
  **Note**: the filter works in “filename contains word” mode
* **spLs\_fileExtensionFilter**\
  Filter by file extension on files from remote Ls\
  **Note**: the filter works in “extension contains word” mode
* **archive\_folder**\
  If set, the file pulled successfully will be remotely archived in this folder
* **spRm\_removePulledFile**\
  If set, the pulled file will be removed after being pulled successfully, if not archived with the archive\_folder variable.

**Limitation**: This workflow template cannot be used with the following protocols: POP3, IMAP, SMTP, JMS, PR4/S, PR5/S.

### **system\_time\_flagFile-remoteLs-loop-pull-archiveOpt-removeOpt**

**Goal**: At regular intervals, this template polls a remote directory. Each file found in the directory matching optional filters on file name and extension will be pulled to the selected destination Virtual Path.\
Optionally, a flag file can be specified so that files are pulled only after verifying that the flag file exists in the specified folder. Pulled files can optionally be archived and/or deleted at the source.

**Operation**:

1. Data Mover connects to a remote system via a Client Connection to check the presence of a flag file.
2. Data Mover uses the same Client Connection to connect to a remote system, reads contents of a remote folder and pulls files based on filename and extension filters (embedded folders are ignored).
3. As an option, the flag file can be removed once the file is pulled successfully.
4. Data Mover optionally removes original files.
5. Data Mover optionally archives pulled files to a selected location.
6. Data Mover optionally adds a timestamp to the filename following this pattern: \<original\_filename>\_yyyyMMddHHmmssSSS.\<original\_extension>.

When the archive option is selected, original files are deleted irrespective of the selected remove option.

**Variables**:

* **systemWorkflowInstanceSingletonMode**\
  The **Single Instance** option is present only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically generates a different correlation key, which is passed to the Job Manager; a new instance is started only when the previous one has terminated.\
  **Note** that time-triggered workflows have the **Single Instance** toggle set to true by default.
* **Cluster**\
  Cluster where the operation will be executed
* **Select remote server**\
  Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)
* **spLs\_remotePath**\
  Remote path that the remote Ls operation will scan looking for files to pull
* **spLs\_fileNameFilter** \
  Filter by file name on files from remote LS\
  **Note**: the filter works in “filename contains word” mode
* **spLs\_fileExtensionFilter** \
  Filter by file extension on files from remote Ls\
  **Note**: the filter works in “extension contains word” mode
* **Actor, VFS, VFS Path**\
  Actor, VFS and Virtual Path where the file will be placed
* **File resource**\
  Resource profile to apply during the file Pull operation
* **archive\_folder**\
  If set, the file pulled successfully will be remotely archived in this folder
* **archiveRenameWithTimestamp**\
  Applicable only if the archive folder is set. When enabled, the file will be archived in the selected folder and renamed with a timestamp following this pattern: \<original\_filename>\_yyyyMMddHHmmssSSS.\<original\_extension>.
* **removePulledFile** \
  Applicable only if the archive folder is not set. If enabled, the pulled file is removed at the source after being successfully pulled (source files are always deleted when the archive option is selected)
* **flagFileName**\
  Flag file name. If set, before trying to pull files, the workflow will check the presence of a flag file with this filename and extension in the selected folder.
* **remotePathFlagFile**\
  Flag file folder. Applicable only if the flag file name is set. Remote folder where the presence of the flag file will be verified.
* **removeFlagFile** \
  Applicable only if the Flag file name is set. If enabled, the Flag file will be removed after successfully pulling the file.

**Limitation**: This workflow template cannot be used with the following protocols: POP3, IMAP, SMTP, JMS, PR4/S, PR5/S.

## DataFlow templates

The following templates can be used with DataFlows.

Common variables:

* **Error handling**\
  The **Enable workflow error handling** option can be set to ON to use PRIMEUR system workflows to manage errors. When the toggle switch is enabled, the **Select error handling template** drop-down list appears with 2 templates: one to log all variables and one to notify errors. You can find details about the ehwft\_logAllVariables template and the error notification template in the [Error Handling Workflow Templates](/data-mover-1.21/workflow-templates/error-handling-workflow-templates.md) section.
* **systemEnableLogDebugToFile**\
  Set this toggle button to ON to activate logs.
* **systemWorkflowInstanceSingletonMode**\
  The **Single Instance** option is present only in time-triggered workflows and prevents time-triggered instances from overlapping. When this option is set to true (the default), any change applied to a contract automatically generates a different correlation key, which is passed to the Job Manager; a new instance is started only when the previous one has terminated.\
  **Note** that time-triggered workflows have the **Single Instance** toggle set to true by default.

## **Trigger type: Time**

A Time-triggered workflow template requires a cron expression.\
The instance starts when the time defined in the cron expression is matched. No variables are added to the flow instance.

Common variables:

* **cronExpression**\
  You can either define a Cron expression with the **GENERATE** button or insert a Cron expression manually. When you click the **GENERATE** button a new window will appear. In the **Generate** window you can define a Cron expression by seconds, minutes, hours, days and months. Note that it is possible to configure one option per tab only. When an option is selected in one of the tabs, selecting one of the other options in the same tab will reset all previous settings. A detailed description of the window is available in the [How to... configure a CRON expression](/data-mover-1.21/how-to.../...-configure-a-cron-expression.md) page.\
  **Note** that if no option is defined for one or more tabs, the default option will automatically be set for your Cron expression.

### **df\_time\_remoteLs-loop-pull-archiveOpt-removeOpt**

**Goal**: This template can be used with DataFlows. At regular intervals, it polls a remote directory and pulls each file to the selected Virtual Path.\
Once a file is SUCCESSFULLY pulled, if a remote folder for archive is specified, the file will be MOVED from its original position to the archive folder. Then the process ends.\
Once a file is SUCCESSFULLY pulled, if a remote folder for archive is NOT specified and the boolean slider "REMOVE" is set at contract level, the file will be DELETED from the remote server.\
Processed files can be filtered by filename or extension.

**Notes**: Directories are skipped, only files in the specified remote directory are processed. Data Watcher will show one flow for every file found by LS and pulled.

**Variables**:

* **Cluster**\
  Cluster where LS and Pull operations will be executed.
* **Select remote server**\
  Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)
* **spLs\_remotePath**\
  Remote path that the remote Ls operation will scan looking for files to pull
* **File resource**\
  Resource profile to apply during the file pull operation
* **Actor, VFS, VFS path**\
  Virtual Path where the file will be placed
* **spLs\_fileNameFilter** \
  Filter by file name on files from remote LS\
  **Note**: the filter works in “filename contains word” mode
* **spLs\_fileExtensionFilter**\
  Filter by file extension on files from remote Ls\
  **Note**: the filter works in “extension contains word” mode
* **archive\_folder**\
  If set, the file pulled successfully will be remotely archived in this folder
* **spRm\_removePulledFile** \
  If set, the pulled file will be removed after being pulled successfully, if not archived with the archive\_folder variable.

**Limitation**: This workflow template cannot be used with the following protocols: POP3, IMAP, SMTP, JMS, PR4/S, PR5/S.

### **df\_time\_remoteLs-loop-pull**

**Goal**: This template can be used with DataFlows. At regular intervals, it polls a remote directory and sends each file found to a selected destination Virtual Path.\
Processed files can be filtered by:

1. filename
2. extension

**Notes**: Directories are skipped. Only the files in the specified remote directory are processed.\
Data Watcher will show one flow for every file found by LS and pulled.

**Variables**:

* **Cluster**\
  Cluster where LS and Pull operations will be executed
* **Select remote server**\
  Remote server where the polled directory is located (It polls a remote directory and each file found in the directory is pulled to the selected destination Virtual Path)
* **spLs\_remotePath**\
  Remote path that the remote Ls operation will scan looking for files to pull
* **File resource**\
  Resource profile to apply during the file pull operation
* **Actor, VFS, VFS path**\
  Virtual Path where the file will be placed
* **spLs\_fileNameFilter** \
  Filter by file name on files from remote LS\
  **Note**: the filter works in “filename contains word” mode
* **spLs\_fileExtensionFilter**\
  Filter by file extension on files from remote Ls\
  **Note**: the filter works in “extension contains word” mode

**Limitation**: This workflow template cannot be used with the following protocols: POP3, IMAP, SMTP, JMS, PR4/S, PR5/S.&#x20;


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.primeur.com/data-mover-1.21/workflow-templates/system-workflow-templates/workflow-templates-for-input-contracts.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
