DataFlow Instance Context (DFIC) 🚀

What is DFIC

The DataFlow Instance Context (DFIC) is a descriptor associated with a dataflow instance: it is created together with the dataflow and is enriched as the dataflow proceeds through its lifecycle along the contract actions of a COA chain.

This section describes the structure and contents of the DFIC, how to read its attributes, and how to modify them.

DFIC structure

The DFIC contains two categories of information:

  1. System attributes: set by the system. They can be read but not changed during the life cycle of the dataflow.

  2. User attributes: a set of name/value pairs that is created empty in the DFIC. Users can enrich it by inserting data, and read it back, within the workflow templates used in the contract actions of a COA chain.

System attributes

When a new dataflow instance is created, the DFIC is initialized with only the Dataflow Instance Identifier (DFIID) and Log Correlation ID (LCID) attributes. As the dataflow proceeds through its lifecycle along the contract actions of a COA chain, other system attributes may be set automatically (e.g., file provenance information for pull and upload actions for SFTP, FTP, FTPS, HTTP and HTTPS).

DFIID: Dataflow Instance ID. Unique identifier of a Data Mover integration flow.
LCID: Log Correlation ID. Unique identifier for a Data Mover work session.
original_cluster: Data Mover STENG cluster name.
original_peer: Data Mover STENG peer.
original_role: Transfer role; either "CLIENT" or "SERVER", depending on the role of Data Mover in the transfer.
original_server: The server name.
original_client-connection: When the role is client, the name of the client connection.
original_protocol: Transfer protocol, in both client and server roles.
original_transferUser: Transfer user name, in both client and server roles.
original_connection-contract: When the role is server, the name of the Connection Contract used to upload the file.
original_contract: The name of the Contract, when the Data Mover role is client.
original_action: The name of the Contract Action, when the Data Mover role is client.
original_actor: The name of the Actor.
original_actor-filename: File name on the Actor's system, i.e. the name of the file in the source system. It might differ from original_filename, for example when a file "foo" is pulled from an Actor's SFTP server and written into Data Mover under a different name "bar".
original_actor-path: Remote path on the Actor's system.
original_filename: Name of the file.
original_size: Size of the file in bytes.
original_corrid: Optional logical label associated with the file for correlation or identification.
original_virtualpath: Original Virtual Path where the file has been stored. This is the Virtual Path in Data Mover where the file first landed, representing the initial point of contact with Data Mover.
original_vfs: Original Virtual File System where the file has been stored. This is the Virtual File System in Data Mover where the file first landed, representing the initial point of contact with Data Mover.

System attributes are preserved with the republish action.

User attributes

DFIC user attributes are contained in the usrAttrs attribute, which is internally structured as name/value pairs.

By default there are no user attributes: it is up to the user to set them, and subsequently read them, from within a workflow template underpinning a contract action in a COA chain, when required.

Since the DFIC lifecycle corresponds to that of the associated dataflow, setting one or more user attributes in one contract and reading them in another contract executed later for the same dataflow is an effective way of passing parameters between different contracts of the same COA chain.

For more information on how to read and write user attributes, please refer to the following chapter.

How to use DFIC

DFIC information can be read, and new attributes (user attributes) can be written, in Mediation workflows.

User attributes are user-defined attributes that enrich the DFIC; they are stored in a structure that enables users to write and read custom values.

Both user-defined and system attributes can be used within workflows to build decision-making logic for specific situations.

Get and Set actions are listed below.

Reading System Attributes

To read a system attribute, a specific function must be invoked:

dfiid: DFIC.getDataFlowInstanceId();
lcid: DFIC.getLCID();
modelId: DFIC.getDataflowModelId();
modelName: DFIC.getDataflowModelName();
modelVersion: DFIC.getDataflowModelVersion();
originalTransferUser: DFIC.getOriginalTransferUser();
originalPeer: DFIC.getOriginalPeer();
originalProtocol: DFIC.getOriginalProtocol();
originalServer: DFIC.getOriginalServer();
originalFilename: DFIC.getOriginalFilename();
originalSize: DFIC.getOriginalSize();
originalActorFilename: DFIC.getOriginalActorFilename();
originalCluster: DFIC.getOriginalCluster();
originalCorrId: DFIC.getOriginalCorrId();
originalVirtualpath: DFIC.getOriginalVirtualpath();
originalVfs: DFIC.getOriginalVfs();
originalRole: DFIC.getOriginalRole();
originalActor: DFIC.getOriginalActor();
originalClientConnection: DFIC.getOriginalClientConnection();
originalActorPath: DFIC.getOriginalActorPath();
originalContract: DFIC.getOriginalContract();
originalContractAction: DFIC.getOriginalContractAction();
originalConnectionContract: DFIC.getOriginalConnectionContract();
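
For example, these getters can be combined with workflow variables to drive decision-making logic in a script task. Below is a minimal sketch, assuming a JavaScript script task in which the DFIC object is available as in the functions above; the needsEncryption variable name is illustrative, not a Data Mover convention:

// Illustrative script task: flag transfers that arrived over plain FTP.
// DFIC.getOriginalProtocol() is the getter listed above; "needsEncryption"
// is a hypothetical workflow variable read by a later decision step.
var protocol = DFIC.getOriginalProtocol();
if (protocol == "FTP") {
    execution.setVariable("needsEncryption", "true");
} else {
    execution.setVariable("needsEncryption", "false");
}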

Reading and Writing User Attributes

To create a new user attribute or update an existing one, you must set a workflow variable with the prefix "DFIC_userAttrs_" using this workflow script task function:

execution.setVariable("DFIC_userAttrs_<user_attribute_name>", "<user_attribute_value>");

Please note that user attributes can only have a String data type.
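
For example, a non-string value must be converted to a String before being stored. Below is a minimal sketch, assuming a JavaScript script task; retry_count is an illustrative attribute name:

// User attributes are String-only: convert non-string values before storing them.
var retryCount = 3; // e.g. a number computed earlier in the workflow
execution.setVariable("DFIC_userAttrs_retry_count", String(retryCount));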

Conversely, when you need to read a previously set user attribute, you must use this script task function:

${DFIC.getUserAttr("<user_attribute_name>")}

Example

Let's suppose that you need to logically tag a dataflow with its owner department, early in the COA chain, and use this logical tag later in the chain.

This is how you would initially establish a dataflow_owner_department user attribute and set it to "ACCOUNTING":

execution.setVariable("DFIC_userAttrs_dataflow_owner_department", "ACCOUNTING");

and this is how you would retrieve its value when required:

${DFIC.getUserAttr("dataflow_owner_department")}
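
Putting the two steps together, a script task in a contract executed later in the same COA chain could branch on this tag. Below is a minimal sketch, assuming the DFIC object is available to script tasks as it is to expressions; the target_folder variable and its path are illustrative:

// Hypothetical downstream script task: route files owned by ACCOUNTING.
var department = DFIC.getUserAttr("dataflow_owner_department");
if (department == "ACCOUNTING") {
    execution.setVariable("target_folder", "/accounting/inbox"); // illustrative path
}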
