You can use an SQL transformation to push supported scalar functions to Databricks Delta.
When you configure pushdown optimization for a mapping, you can use scalar functions in a SQL transformation and run queries with the Databricks Delta target endpoint.
You can push down a function only by using a SELECT statement in the SQL query. The following snippet demonstrates the syntax of a simple SELECT SQL query:
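A minimal sketch of such a query, assuming a hypothetical input port named ITEM_PRICE (the `~port~` notation binds an input port of the SQL transformation to the function argument):

```sql
-- Simple SELECT statement with no FROM or WHERE clause;
-- ~ITEM_PRICE~ is a hypothetical input port resolved at run time.
SELECT SQRT(~ITEM_PRICE~)
```

The query returns one value per input row, so the transformation needs one output column configured for the function result.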
You can push down an SQL transformation with the following restrictions:
You can configure only an SQL query in the SQL transformation. You cannot enable a stored procedure when you push down to Databricks Delta.
The SQL query must be a simple SELECT statement without FROM and WHERE clauses. The SQL transformation supports functions only in a simple SELECT statement.
When you specify a SELECT query, you must also specify the output column names and the number of output columns to match the functions in the query. For example, when you specify the query select square(~AGE~), sqrt(~SNAME~), you must specify one output column for each of the two functions, one for AGE and one for SNAME; otherwise, the mapping fails.
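Restated as a snippet, the document's own two-function example requires exactly two output columns, one per function (AGE and SNAME are the example's input ports):

```sql
-- Two functions in the SELECT list, so the SQL transformation
-- must be configured with exactly two output columns.
SELECT SQUARE(~AGE~), SQRT(~SNAME~)
```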
If any SQL error occurs, the error is added to the SQLError field by default. However, when you run a mapping enabled for pushdown optimization, the SQLError field remains NULL.
The NumRowsAffected field records the number of rows affected while computing the output buffer. However, for the SQL transformation, NumRowsAffected is 0 because the query runs for all the records at the same time.
You cannot include special characters in the query, because the SQL transformation does not support special characters in the arguments.
You can use an SQL transformation only when the SELECT statement is present in the query property. You cannot configure an SQL transformation with a parameterized query, because dynamic parameter support is limited and the query fails with a DTM error.