# Row Denormaliser

### Description <a href="#description" id="description"></a>

The Row Denormaliser transform lets you de-normalize data by looking up key-value pairs, with the option to convert data types in the process.

Note: make sure to check the notes on this transform in the [Getting started with Beam](/data-shaper-1.21/knowing-the-data-shaper-designer/pipelines/getting-started-with-apache-beam.md) documentation.

| Hop Engine | <sup>✓</sup> |
| ---------- | ------------ |
| Spark      | ?            |
| Flink      | ?            |
| Dataflow   | ?            |

### Options

|                |                                                                                                                                                                                                                                                                                                                                                                      |
| -------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Transform name | Name of the transform. This name has to be unique in a single pipeline.                                                                                                                                                                                                                                                                                              |
| Key field      | The field that defines the key of the output row.                                                                                                                                                                                                                                                                                                                    |
| Group fields   | Specify the fields that make up the grouping here.                                                                                                                                                                                                                                                                                                                   |
| Target fields  | Select the fields to de-normalize by specifying the String value for the key field (see above). Options are provided to convert data types. Because key-value pairs are most often stored as strings, you will frequently need to convert values to Integer, Number, or Date. If key-value pair collisions occur (the key is not unique within the specified group), specify the aggregation method to use. |
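The type conversion mentioned above can be illustrated with a short sketch. This is a hypothetical Python analogue, not the transform's implementation; the field names and date format are assumptions for illustration:

```python
from datetime import datetime

# Key-value pairs usually arrive as strings; this mimics the conversion
# step offered in the Target fields options (Integer, Number, Date).
raw = {"age": "42", "score": "3.14", "birthdate": "1990-05-01"}

converted = {
    "age": int(raw["age"]),                                               # Integer
    "score": float(raw["score"]),                                         # Number
    "birthdate": datetime.strptime(raw["birthdate"], "%Y-%m-%d").date(),  # Date
}
print(converted)
```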

### Metadata Injection Support

You can use the supported Metadata Injection fields with the ETL Metadata Injection transform to pass metadata to your pipeline at runtime. All fields can be injected; the values accepted for the aggregation field are the following:

| Key                       | Value                               |
| ------------------------- | ----------------------------------- |
| TYPE\_AGGR\_NONE          | No Aggregation is done              |
| TYPE\_AGGR\_SUM           | Sum all values                      |
| TYPE\_AGGR\_AVERAGE       | Calculate the average               |
| TYPE\_AGGR\_MIN           | Take the minimum value of the group     |
| TYPE\_AGGR\_MAX           | Take the maximum value of the group     |
| TYPE\_AGGR\_COUNT\_ALL    | Count rows                              |
| TYPE\_AGGR\_CONCAT\_COMMA | Concatenate values separated by a comma |
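The effect of each aggregation method on colliding values can be sketched as follows. This is an illustrative Python analogue of the behaviors listed above, not the transform's implementation; in particular, keeping the last value for TYPE\_AGGR\_NONE is an assumption:

```python
# Several values observed for the same key within one group (a collision).
values = [3, 5, 5, 7]

aggregations = {
    "TYPE_AGGR_NONE": lambda vs: vs[-1],   # assumption: keep the last value seen
    "TYPE_AGGR_SUM": sum,
    "TYPE_AGGR_AVERAGE": lambda vs: sum(vs) / len(vs),
    "TYPE_AGGR_MIN": min,
    "TYPE_AGGR_MAX": max,
    "TYPE_AGGR_COUNT_ALL": len,
    "TYPE_AGGR_CONCAT_COMMA": lambda vs: ",".join(map(str, vs)),
}

for name, aggregate in aggregations.items():
    print(name, aggregate(values))
```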

### Example

#### Input data

The input data must be ordered by the grouping keys (**RecordID** in this example), use a [Sort rows](/data-shaper-1.21/knowing-the-data-shaper-designer/pipelines/transforms/sort.md) transform if needed:

| RecordID    | key       | value         |
| ----------- | --------- | ------------- |
| 345-12-0000 | FirstName | Mitchel       |
| 345-12-0000 | LastName  | Runolfsdottir |
| 345-12-0000 | City      | Jerryside     |
| 976-67-7113 | FirstName | Elden         |
| 976-67-7113 | LastName  | Welch         |
| 976-67-7113 | City      | Lake Jamaal   |
| 824-21-0000 | FirstName | Rory          |
| 824-21-0000 | LastName  | Ledner        |
| 824-21-0000 | City      | Scottieview   |

#### Denormalized data

Set **The key field** to "key" and add **RecordID** to **The fields that make up the grouping**. Fill in the **Target fields** table as follows:

| Target fieldname | Value fieldname | Key value | Type   |
| ---------------- | --------------- | --------- | ------ |
| FirstName        | value           | FirstName | String |
| LastName         | value           | LastName  | String |
| City             | value           | City      | String |

The result is:

| RecordID    | FirstName | LastName      | City        |
| ----------- | --------- | ------------- | ----------- |
| 345-12-0000 | Mitchel   | Runolfsdottir | Jerryside   |
| 976-67-7113 | Elden     | Welch         | Lake Jamaal |
| 824-21-0000 | Rory      | Ledner        | Scottieview |
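The pivot performed in this example can be sketched in plain Python. This is a minimal illustrative analogue of the transform's logic, using the example data above; it also shows why the input must be sorted, since adjacent rows are grouped together:

```python
from itertools import groupby

# The sorted key-value input rows from the example: (RecordID, key, value).
rows = [
    ("345-12-0000", "FirstName", "Mitchel"),
    ("345-12-0000", "LastName", "Runolfsdottir"),
    ("345-12-0000", "City", "Jerryside"),
    ("976-67-7113", "FirstName", "Elden"),
    ("976-67-7113", "LastName", "Welch"),
    ("976-67-7113", "City", "Lake Jamaal"),
    ("824-21-0000", "FirstName", "Rory"),
    ("824-21-0000", "LastName", "Ledner"),
    ("824-21-0000", "City", "Scottieview"),
]

# The key values configured in the Target fields table.
target_keys = ["FirstName", "LastName", "City"]

denormalized = []
# groupby only merges adjacent rows, which is why the input must be
# ordered by the grouping field (RecordID), just like the transform requires.
for record_id, group in groupby(rows, key=lambda r: r[0]):
    pairs = {key: value for _, key, value in group}
    denormalized.append({"RecordID": record_id,
                         **{k: pairs.get(k) for k in target_keys}})

for row in denormalized:
    print(row)
```

Each output row now carries one column per configured target field, matching the result table above.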

