
Table of Contents


  1. Introducing Mass Ingestion
  2. Getting Started with Mass Ingestion
  3. Connectors and Connections
  4. Mass Ingestion Applications
  5. Mass Ingestion Databases
  6. Mass Ingestion Files
  7. Mass Ingestion Streaming
  8. Monitoring Mass Ingestion Jobs
  9. Asset Management
  10. Troubleshooting

Mass Ingestion

Snowflake Data Cloud target properties

When you define a file ingestion task with a Snowflake Data Cloud target, you must enter target options on the Target tab of the task wizard.
The following table describes the target options:
Warehouse
Overrides the warehouse name specified in the Snowflake Data Cloud connection.

Add Parameters
Create an expression to add the Warehouse, Database, Schema, and Target Table Name values as parameters. For more information, see Add Parameters.

Database
The name of the database in Snowflake Data Cloud.

Schema
The name of the schema in Snowflake Data Cloud.

Target Table Name
The name of the Snowflake Data Cloud target table. The target table name is case-sensitive.

Role
Overrides the Snowflake Data Cloud user role specified in the connection.

Pre SQL
SQL statement to run on the target before the write operation begins.

Post SQL
SQL statement to run on the target table after the write operation completes.

Truncate Target Table
Truncates the database target table before inserting new rows. Select one of the following options:
  • True. Truncates the target table before inserting rows.
  • False. Inserts new rows without truncating the target table.
Default is False.

File Format and Copy Options
The copy option and the file format to use when loading data to Snowflake Data Cloud.
The copy option specifies the action that the task performs when it encounters an error while loading data from a file. You can specify the following copy option to abort the COPY statement if any error is encountered:
ON_ERROR = ABORT_STATEMENT
When you load files, you can specify the file format and define rules for the data files. The task uses the specified file format and rules when it bulk loads data into Snowflake Data Cloud tables.
The following formats are supported:
  • CSV
  • JSON
  • Avro
  • ORC
  • Parquet

External Stage
Specifies the external stage directory to use for loading files into Snowflake Data Cloud tables. Ensure that the source folder path you specify matches the folder path provided in the URL of the external stage for the specific connection type in Snowflake Data Cloud.
Applicable when the source for file ingestion is Microsoft Azure Blob Storage or Amazon S3. The external stage is mandatory when you use the Microsoft Azure Blob Storage V3 connection type, but optional for Amazon S3 V2. If you do not specify an external stage for Amazon S3 V2, Snowflake Data Cloud creates an external stage by default.

File Compression
Determines whether files are compressed before they are transferred to the target directory. The following options are available:
  • None. Files are not compressed.
  • GZIP. Files are compressed using GZIP compression.
Applicable for all sources that support the file ingestion task except Microsoft Azure Blob Storage V3 and Amazon S3 V2.
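
As a rough sketch of how several of these properties come together at load time, a Snowflake COPY statement driven by the File Format and Copy Options and External Stage settings might look like the following. The stage, schema, table, and folder names are hypothetical placeholders, not values defined by the task wizard:

```sql
-- Hypothetical example: bulk load CSV files from an external stage
-- into a target table, aborting the load on the first error.
COPY INTO my_schema.my_target_table
  FROM @my_external_stage/source_folder/
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
  ON_ERROR = ABORT_STATEMENT;
```

Here ON_ERROR = ABORT_STATEMENT corresponds to the copy option described above, and the stage path after the @ must resolve to the same folder path as the source, as required by the External Stage property.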
