Data Integration Connectors
**Connection**
Name of the target connection. Select a target connection or click **New Parameter** to define a new parameter for the target connection.
**Target Type**
Target type. Select one of the following types:
**Object**
Name of the target object.
**Create Target**
Creates a target. Enter a name for the target object and select the source fields that you want to use. By default, all source fields are used. You can select an existing target object or create a new target object at runtime.
You cannot parameterize the target at runtime.
**Operation**
Defines the type of operation to be performed on the target table. Select from the following list of operations:
When you use an upsert operation, you must configure the **Update Mode** in the target details as **Update else Insert**.
**Update Columns**
The fields to use as temporary primary key columns when you update, upsert, or delete data on the Databricks Delta target tables. When you select more than one update column, the mapping task uses the AND operator with the update columns to identify matching rows.
Applies to update, upsert, delete, and data driven operations.
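The AND-matching behavior described above can be pictured with a short Python sketch. This is an illustration with hypothetical data, not the connector's implementation; the connector generates the equivalent matching logic itself.

```python
# Sketch: with more than one update column, a target row matches an
# incoming row only when EVERY update column is equal (AND operator).
# Column names and rows below are hypothetical.

update_columns = ["id", "region"]  # temporary primary key columns

target_rows = [
    {"id": 1, "region": "east", "amount": 10},
    {"id": 1, "region": "west", "amount": 20},
]
incoming = {"id": 1, "region": "west", "amount": 99}

matches = [
    row for row in target_rows
    if all(row[col] == incoming[col] for col in update_columns)
]
print(matches)  # only the row where both id AND region match
```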
**Data Driven Condition**
Flags rows for an insert, update, delete, or reject operation based on the expressions that you define.
For example, the following IIF statement flags a row for reject if the ID field is null. Otherwise, it flags the row for update:
`IIF(ISNULL(ID), DD_REJECT, DD_UPDATE)`
Required if you select the data driven operation.
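The IIF expression above can be restated as a minimal Python sketch. The `DD_*` flag values here are illustrative stand-ins; the real expression runs in the mapping's expression language, not in Python.

```python
# Illustrative re-implementation of IIF(ISNULL(ID), DD_REJECT, DD_UPDATE).
# Flag constants are stand-ins for the DD_* operation flags.
DD_UPDATE, DD_REJECT = 1, 3

def flag_row(row):
    """Flag a row for reject when ID is null, otherwise for update."""
    return DD_REJECT if row.get("ID") is None else DD_UPDATE

print(flag_row({"ID": None}))  # flagged for reject
print(flag_row({"ID": 42}))    # flagged for update
```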
**Advanced properties**

**Target Database Name 1**
Overrides the database name provided in the connection and the database selected in the metadata browser for existing targets.
You cannot override the database name when you create a new target at runtime.
**Target Table Name 1**
Overrides the table name at runtime for existing targets.
**Update Override Query**
Overrides the default update query that the agent generates for the update operation. Specify the override query in this field.
Use the merge command for the update operation.
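For illustration, a merge-based update override might take the following shape. The table and column names below are hypothetical, and the exact placeholder syntax the connector expects is not shown here; consult the connector documentation for the required form.

```python
# Hypothetical shape of a merge-style update override.
# target_db.orders, staged_updates, order_id, and amount are made up.
update_override = """
MERGE INTO target_db.orders AS t
USING staged_updates AS s
ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET t.amount = s.amount
"""
print(update_override.strip().splitlines()[0])
```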
**Write Disposition**
Overwrites or adds data to the existing data in a table. You can select from the following options:
**Update Mode 1**
Defines how rows are updated in the target tables. Select from the following options:
**Staging Location**
Relative directory path to store the staging files.
When you use Unity Catalog, you must provide a pre-existing location on your cloud storage as the staging location.
**Pre SQL**
The pre-SQL command to run before the agent writes to Databricks Delta.
For example, if you want to assign a sequence object to a primary key field of the target table before you write data to the table, specify a pre-SQL statement.
**Post SQL**
The post-SQL command to run after the agent completes the write operation.
You can specify multiple post-SQL commands, each separated with a semicolon.
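Semicolon-separated post-SQL commands can be pictured with a simple split, as sketched below. This is only an illustration with made-up statements; the agent's actual parsing may handle edge cases (such as semicolons inside string literals) differently.

```python
# Sketch: multiple post-SQL commands separated by semicolons.
# Statements below are hypothetical.
post_sql = "DELETE FROM staging_tbl; VACUUM staging_tbl"

commands = [c.strip() for c in post_sql.split(";") if c.strip()]
print(commands)  # each command runs after the write completes
```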
**Job Timeout 1**
Maximum time in seconds for the Spark job to complete processing. If the job does not complete within the specified time, the Databricks cluster terminates the job and the mapping fails.
If you do not specify a job timeout, the mapping shows success or failure based on job completion.
**Job Status Poll Interval 1**
Poll interval in seconds at which the Secure Agent checks the status of job completion.
Default is 30 seconds.
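The interplay between the job timeout and the status poll interval can be sketched as follows. Function and status names are hypothetical; this is not the Secure Agent's actual code.

```python
import time

def wait_for_job(get_status, job_timeout=None, poll_interval=30):
    """Poll job status every poll_interval seconds until the job finishes.

    If job_timeout (seconds) is set and exceeded, treat the job as failed,
    mirroring the cluster terminating the job and the mapping failing.
    With no timeout, the result follows job completion.
    """
    start = time.monotonic()
    while True:
        status = get_status()
        if status in ("SUCCESS", "FAILED"):
            return status
        if job_timeout and time.monotonic() - start >= job_timeout:
            return "FAILED"
        time.sleep(poll_interval)

# Hypothetical job that reports RUNNING once, then SUCCESS.
statuses = iter(["RUNNING", "SUCCESS"])
print(wait_for_job(lambda: next(statuses), poll_interval=0))
```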
**DB REST API Timeout 1**
The maximum time in seconds for which the Secure Agent retries the REST API calls to Databricks when there is a network connection error or when the REST endpoint returns a 5xx HTTP error code.
Default is 10 minutes.
**DB REST API Retry Interval 1**
The time interval in seconds at which the Secure Agent retries the REST API call when there is a network connection error or when the REST endpoint returns a 5xx HTTP error code.
This value does not apply to the job status REST API. Use the Job Status Poll Interval value for the job status REST API.
Default is 30 seconds.
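The retry behavior governed by the two REST API properties above can be sketched as retry-on-5xx-or-connection-error with an overall deadline. Names and defaults below mirror the documented values (10 minutes, 30 seconds) but the code itself is a hypothetical illustration.

```python
import time

def call_with_retry(call, total_timeout=600, retry_interval=30):
    """Retry a REST call on network errors or 5xx responses.

    Gives up once total_timeout seconds (default 10 minutes) elapse;
    retries every retry_interval seconds (default 30) otherwise.
    """
    start = time.monotonic()
    while True:
        try:
            status, body = call()
        except ConnectionError:
            status = None  # network error: retry
        if status is not None and status < 500:
            return status, body
        if time.monotonic() - start >= total_timeout:
            raise TimeoutError("Databricks REST endpoint did not recover in time")
        time.sleep(retry_interval)

# Hypothetical endpoint that returns 503 once, then succeeds.
responses = iter([(503, ""), (200, "ok")])
print(call_with_retry(lambda: next(responses), retry_interval=0))
```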
**Forward Rejected Rows**
Determines whether the transformation passes rejected rows to the next transformation or drops rejected rows. By default, the agent forwards rejected rows to the next transformation.
1 Doesn't apply to mappings in advanced mode.