

Mass Ingestion

Databricks Delta target properties

When you define a file ingestion task with a Databricks Delta target, you must enter target options on the Target tab of the task wizard.
You can transfer only Parquet files from an Amazon S3 V2 source or a Microsoft Azure Data Lake Store Gen2 source to a Databricks Delta target, and all the files must have the same metadata.
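Because all source files must share the same metadata, it can help to confirm schema consistency before you run the task. The following sketch is not part of Mass Ingestion; it uses the pyarrow library, and the file paths are hypothetical examples.

# Illustration only: check that a set of source Parquet files share the same
# schema before configuring them as the source of a file ingestion task.
# pyarrow is not part of Mass Ingestion; the paths below are hypothetical.
import pyarrow.parquet as pq

files = [
    "sales_2023_01.parquet",
    "sales_2023_02.parquet",
    "sales_2023_03.parquet",
]

reference_schema = pq.read_schema(files[0])
for path in files[1:]:
    schema = pq.read_schema(path)
    if schema.equals(reference_schema):
        print(f"{path}: schema matches")
    else:
        print(f"{path}: schema mismatch\n{schema}")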
The following table describes the target options:

Database
  Required. Name of the database in Databricks Delta Lake that contains the target table.
  Default is the database name specified in the Databricks Delta connection.

Add Parameters
  Create an expression to add the Database and Table Name values as parameters. For more information, see Add Parameters.

Table Name
  Required. Name of the table in Databricks Delta Lake.
  If you specify the name of a table that does not exist in the target database, the Secure Agent creates a new table with the specified name.

If Table Exists
  Determines the action that the Secure Agent takes if the table name matches the name of an existing table in the target database. Select one of the following options:
    • Overwrite
    • Append
  Default is Overwrite.
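Mass Ingestion applies the selected write behavior for you; the following PySpark sketch only illustrates the difference between the Overwrite and Append actions on a Delta table. The table name, paths, and DataFrames are hypothetical.

# Illustration only: Mass Ingestion performs the load itself. This sketch shows
# how an overwrite replaces the contents of a Delta table while an append adds
# to it. "sales_db.orders" and the source paths are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

first_batch = spark.read.parquet("s3://example-bucket/orders/batch1/")
second_batch = spark.read.parquet("s3://example-bucket/orders/batch2/")

# Overwrite: after this write, the table contains only the first batch.
first_batch.write.format("delta").mode("overwrite").saveAsTable("sales_db.orders")

# Append: the second batch is added to the rows already in the table.
second_batch.write.format("delta").mode("append").saveAsTable("sales_db.orders")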
If a job fails with the following error, see the cluster logs for more information:
"[ERROR] Job execution failed. State : JOB_FAILED ; State Message :"
