... use Data Shaper graphs in Data Mover contracts



Macro steps

Assuming that a Contract of any type and with any variables can be created, these are the main steps to use a Data Shaper graph in Data Mover:

  1. In Data Mover, the workflow with the Data Shaper Processor brick must exist. You can either use the system workflow templates (ready to use) or create ad-hoc workflows.

    • For details, go to Create your first workflow template.

    • If you create an ad-hoc workflow, these 2 properties must be set for the Data Shaper Processor brick to work correctly:

      • Cluster: you can either enter the Cluster name or create a specific variable (whose Type must be Cluster).

      • Payload: you can either enter the Payload name or create a specific variable (whose Type must be Datashaper payload).

    • At this level, the Graph that will be executed is not specified yet. It will be defined when creating the Contract, as an attribute of the Payload variable.

  2. In Data Shaper Designer or Data Shaper Server, define a Sandbox and one or more Graphs.

    • For details, go to Data Shaper graphs.

  3. In Data Mover, create a Contract with an Action that uses the Sandbox and the Graph.

    • For details, go to Create your first contract.

    • You can create Input, Output and Mediation Contracts performing virtually any action. Some examples:

      • an Input Contract to read from a database

      • an Output Contract to write to a database

      • a Mediation Contract to copy a binary file from an input VFS to an output VFS

      • a Contract to read a text file from an input VFS, add a timestamp and push the new file to an output VFS

      • a Mediation Contract to convert a .csv input file to a .xlsx output file

  4. When the Contract is ready, as soon as the action is triggered by the expected event, you can check the results in the Jobs section and in the Data Shaper Server Console. Moreover, logs can be checked in Monitoring → Logs by selecting the Data Shaper Engine module.

Examples

Let's now see 3 examples of Contracts using a Data Shaper graph.

Input Contract reading a table from a database and producing a .csv file saved in Data One

Prerequisites:

  • the root node must be a DBReader and the leaf node a DataOneVFSWriter

Do not transform and/or map BLOB (byte) fields!

  • the graph must be designed in Data Shaper - you can see an example of the graph in this picture:

  • the connection to the database is configured in the graph itself - it's a JDBC connection

The parameters configured in the graph will be proposed during the configuration of the contract (see below) and used by the graph engine.
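Purely as an illustration of the data movement involved, here is a minimal plain-Python sketch of what the DBReader → DataOneVFSWriter graph does conceptually. This is not Data Shaper code: the database connection is a local stand-in for the JDBC connection configured in the graph, and only the table and file names (MDLR_CONFIGURATION, test.csv) come from the example below.

```python
# NOT Data Shaper code: a conceptual, plain-Python equivalent of the
# DBReader -> DataOneVFSWriter graph. sqlite3 stands in for the JDBC
# connection configured in the graph; "example.db" is hypothetical.
import csv
import sqlite3

conn = sqlite3.connect("example.db")
cur = conn.execute("SELECT * FROM MDLR_CONFIGURATION")  # the "SQL query" option

with open("test.csv", "w", newline="", encoding="utf-8") as f:  # "filename output", "charset"
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row
    for row in cur:
        # Per the warning above, BLOB (byte) fields must not be transformed
        # or mapped; safest is to exclude them from the SELECT entirely.
        writer.writerow(row)

conn.close()
```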

In Input Contracts, the system_time_DS-pull and system_ondemand_DS-pull system workflow templates must be used to execute Data Shaper graphs that connect to and read from a database. So, let's design an Input Contract.

  1. In Data Mover, go to Design → Contracts.

    • In the New Contract window, assign a Name and a Description to the Contract and select Input in the Contract Type drop-down list.

  2. Click Continue.

  3. In the Actor drop-down list, select the Actor of your Contract and click Continue.

  4. In the Actions section, fill in these fields:

    • Action name: you can leave Input-1.

    • Workflows: since you are defining a Data Shaper Input contract, you must select a Data Shaper system template, either system_time_DS-pull or system_ondemand_DS-pull. Note that entering DS in the Search box lists the 2 Data Shaper system templates. In this example, we go for the system_ondemand_DS-pull template.

  5. Save your Contract.

  6. In the New Contract page, fill in these options:

    • Cluster: select the cluster where Data Shaper will execute the graph. Selecting the Cluster will enable the Sandboxes list.

    • DataShaper graph variables: select the Data Shaper Sandbox from the Sandboxes drop-down list. Sandboxes are created, configured and listed in Monitoring → Data Shaper. Then select the Data Shaper graph from the Graphs valid for INPUT Contract drop-down list. The additional fields that appear are the same shown in the Prerequisites: VFS and VFS path, filename output, charset, SQL query and line split. If needed, these options can be edited.

You can now add another Action or save your Contract.

As soon as the action is triggered via the REST API, the content of the MDLR_CONFIGURATION table will be read and a test.csv file will be sent to the DS folder of the part1-VFS.

Results can be checked in the Jobs section and in the Data Shaper Server Console. Moreover, logs can be checked in Monitoring → Logs by selecting the Data Shaper Engine module.
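Since the system_ondemand_DS-pull template is triggered via REST, a call along the following lines starts the action. This is only a hedged sketch: the endpoint path, host, credentials and payload are assumptions, not the documented API; see the HTTP MFT Rest API section for the actual interface.

```python
# Hypothetical sketch of triggering the on-demand action over REST.
# Every URL, path and credential below is an assumption; consult the
# HTTP MFT Rest API documentation for the real endpoint and auth scheme.
import requests

BASE_URL = "https://datamover.example.com"  # hypothetical host

resp = requests.post(
    f"{BASE_URL}/api/contracts/my-contract/actions/Input-1/trigger",  # hypothetical path
    auth=("api_user", "api_password"),  # hypothetical credentials
    timeout=30,
)
resp.raise_for_status()  # on success, the job shows up in the Jobs section
```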

Mediation Contract copying a binary file from an input VFS to an output VFS

Prerequisites:

  • the graph must be designed in Data Shaper - you can see an example of the graph in this picture:

  • these are the parameters configured:

  • the workflow must be defined in Data One.

Let's now design a Mediation Contract:

  1. In Data One, go to Design → Contracts.

  2. In the New Contract window, assign a Name and a Description to the Contract and select Mediation in the Contract Type drop-down list.

  3. Click Continue.

  4. In the Workflows drop-down list, select the Data Shaper workflow and click Save.

  5. In the Actions section, fill in these fields:

    • Source Virtual Path: enter Actor, VFS and VFS path

    • Cluster: select the cluster

    • Payload variables: select the Data Shaper Sandbox from the Sandboxes drop-down list. Sandboxes are created, configured and listed in Monitoring → Data Shaper. Then select the Data Shaper graph from the Graphs valid for MEDIATION Contract drop-down list. The additional fields that appear are the same shown in the Prerequisites: FILESET_ID, output virtual path and charset. If needed, these options can be edited.

  6. Save your Contract.

The Contract is now ready. As soon as a new file arrives, you can check the results in the Jobs section and in the Data Shaper Server Console. Moreover, logs can be checked in Monitoring → Logs by selecting the Data Shaper Engine module.
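For reference, the essence of this mediation, a byte-for-byte copy with no decoding or transformation, can be sketched in a few lines of plain Python. The paths are hypothetical stand-ins for the source and output virtual paths; the real work is of course done by the graph.

```python
# NOT Data Shaper code: a conceptual equivalent of the mediation graph,
# copying a file byte-for-byte. Opening in binary mode ("rb"/"wb") means
# no charset is applied, which is what keeps a binary file intact.
import shutil

SRC = "/vfs/input/incoming/file.bin"    # hypothetical source virtual path
DST = "/vfs/output/outgoing/file.bin"   # hypothetical output virtual path

with open(SRC, "rb") as src, open(DST, "wb") as dst:
    shutil.copyfileobj(src, dst)  # streams raw bytes, no transformation
```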

Output Contract reading a .csv file in Data One and saving it to a database table

Prerequisites:

  • the root node must be a DataOneVFSReader and the leaf node a DBWriter

  • the graph must be designed in Data Shaper - you can see an example of the graph in this picture:

  • the connection to the database is configured in the graph itself - it's a JDBC connection

The parameters configured in the graph will be proposed during the configuration of the contract (see below) and used by the graph engine.

In Output Contracts, the system_vBind_DS-push and system_newfile_DS-push system workflow templates must be used to execute Data Shaper graphs that connect to and write to a database. So, let's design an Output Contract.

  1. In Data Mover, go to Design → Contracts.

    • In the New Contract window, assign a Name and a Description to the Contract and select Output in the Contract Type drop-down list.

  2. Click Continue.

  3. In the Actor drop-down list, select the Actor of your Contract and click Continue.

  4. In the Actions section, fill in these fields:

    • Action name: you can leave Output-1.

    • Workflows: since you are defining a Data Shaper Output contract, you must select a Data Shaper system template, either system_vBind_DS-push or system_newfile_DS-push. Note that entering DS in the Search box lists the 2 Data Shaper system templates. In this example, we go for the system_newfile_DS-push template.

  5. Save your Contract.

  6. In the New Contract page, fill in these options:

    • Cluster: select the cluster where Data Shaper will execute the graph. Selecting the Cluster will enable the Sandboxes list.

    • DataShaper graph variables: select the Data Shaper Sandbox from the Sandboxes drop-down list. Sandboxes are created, configured and listed in Monitoring → Data Shaper. Then select the Data Shaper graph from the Graphs valid for OUTPUT Contract drop-down list. The additional fields that appear are the same shown in the Prerequisites: fileset_ID, charset, table of the database, line split and date pattern. If needed, these options can be edited.

You can now add another Action or save your Contract.

As soon as the file arrives, its content will be copied into the clone_mdlr_configuration table, using the date pattern indicated and # as the field separator.

Results can be checked in the Jobs section and in the Data Shaper Server Console. Moreover, logs can be checked in Monitoring → Logs by selecting the Data Shaper Engine module.
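As with the input example, a plain-Python sketch can make the writing side concrete. This is not Data Shaper code: the table name clone_mdlr_configuration and the # separator come from the example above, while the date pattern, the column layout and the database connection are assumptions.

```python
# NOT Data Shaper code: a conceptual equivalent of the DataOneVFSReader ->
# DBWriter graph. sqlite3 stands in for the JDBC connection configured in
# the graph; the column layout and date pattern below are assumptions.
import sqlite3
from datetime import datetime

DATE_PATTERN = "%Y-%m-%d %H:%M:%S"  # hypothetical "date pattern" option

conn = sqlite3.connect("example.db")
with open("test.csv", encoding="utf-8") as f:
    for line in f:
        fields = line.rstrip("\n").split("#")  # '#' as the "line split" character
        # Hypothetical layout: the first field is a timestamp written with
        # the configured date pattern; normalize it before inserting.
        fields[0] = datetime.strptime(fields[0], DATE_PATTERN).isoformat()
        marks = ", ".join("?" * len(fields))
        conn.execute(f"INSERT INTO clone_mdlr_configuration VALUES ({marks})", fields)

conn.commit()
conn.close()
```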
