Table of Contents

  1. Introduction to Databricks Delta Connector
  2. Connections for Databricks Delta
  3. Mappings and mapping tasks with Databricks Delta connector
  4. Databricks Delta pushdown optimization (SQL ELT)
  5. Data type reference

Databricks Delta Connector

Configure Spark parameters
Before you connect to the Databricks cluster, you must configure the Spark parameters for the cluster on AWS or on Azure, depending on where the cluster is hosted.

Configuration on AWS

Add the following Spark configuration parameters for the Databricks cluster and restart the cluster:
  • spark.hadoop.fs.s3a.access.key <value>
  • spark.hadoop.fs.s3a.secret.key <value>
  • spark.hadoop.fs.s3a.endpoint <value>
Ensure that the access key and secret key that you configure have access to the S3 buckets where you store the data for Databricks Delta tables.
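For reference, the text entered in the cluster's Spark config field might look like the following sketch. The endpoint shown is an example for the us-east-1 Region; substitute your own access key, secret key, and Region-specific S3 endpoint:

```
spark.hadoop.fs.s3a.access.key <access key>
spark.hadoop.fs.s3a.secret.key <secret key>
spark.hadoop.fs.s3a.endpoint s3.us-east-1.amazonaws.com
```

Each parameter and its value are separated by a space, with one parameter per line.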

Configuration on Azure

Add the following Spark configuration parameters for the Databricks cluster and restart the cluster:
  • fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net <value>
  • fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net <value>
  • fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net <value>
  • fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
  • fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net https://login.microsoftonline.com/<Tenant ID>/oauth2/token
Ensure that the client ID and client secret that you configure have access to the file systems where you store the data for Databricks Delta tables.
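For reference, the text entered in the cluster's Spark config field might look like the following sketch, assuming a hypothetical storage account named mystorageacct. The auth type value OAuth is the standard Hadoop ABFS value used with the ClientCredsTokenProvider; substitute your own application (client) ID, client secret, and tenant ID:

```
fs.azure.account.auth.type.mystorageacct.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.mystorageacct.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.mystorageacct.dfs.core.windows.net <client ID>
fs.azure.account.oauth2.client.secret.mystorageacct.dfs.core.windows.net <client secret>
fs.azure.account.oauth2.client.endpoint.mystorageacct.dfs.core.windows.net https://login.microsoftonline.com/<Tenant ID>/oauth2/token
```

The storage account name appears in every property key, so repeat the full set of parameters for each storage account that the cluster accesses.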
