
Table of Contents

  1. Introducing Mass Ingestion
  2. Getting Started with Mass Ingestion
  3. Connectors and Connections
  4. Mass Ingestion Applications
  5. Mass Ingestion Databases
  6. Mass Ingestion Files
  7. Mass Ingestion Streaming
  8. Monitoring Mass Ingestion Jobs
  9. Asset Management
  10. Troubleshooting

Mass Ingestion

Mass Ingestion Databases architecture

On each system from which you plan to use Mass Ingestion Databases, you must install the Secure Agent. After you start the Secure Agent for the first time, the Mass Ingestion Databases agent and packages are installed locally. You can then use Mass Ingestion to configure database ingestion tasks and to run and monitor database ingestion jobs.
The following image shows the general architecture of Mass Ingestion Databases:
Image: Database Ingestion components in the cloud and on premises, in relation to sources and targets.
From the web-based interface in Informatica Intelligent Cloud Services, you can create and manage ingestion tasks and run and monitor ingestion jobs.
The following interactions occur:
  1. When you download the Secure Agent to your on-premises system, the Database Ingestion DBMI packages are also downloaded, provided that you have a license for Database Ingestion. You can then configure the DBMI agent service.
  2. From the Informatica Intelligent Cloud Services web-based interface, you define database ingestion tasks.
  3. When you deploy a database ingestion task, a corresponding executable job is created on the Secure Agent system.
  4. When you run a database ingestion job, the ingestion task metadata is pushed down to the Secure Agent. The ingestion job uses this information to process data.
    • For an initial load operation, the ingestion job extracts data at a specific point in time from the source tables and fields. The job uses the database API of the relational source to retrieve the data.
    • For an incremental load operation, the ingestion job captures changes, such as inserts, updates, and deletes, for the source tables and fields from the source database logs. Change data capture runs continuously until the job is stopped or ends.
    The data is loaded to the target by using the appropriate target API. A conceptual sketch of the two load types follows this list.
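The following Python code is a minimal conceptual sketch of the difference between the two load types. It is not the Mass Ingestion Databases implementation: the connection objects and methods (write_rows, read_change_events, upsert_row, delete_row) are hypothetical placeholders that stand in for the source database API, the change data capture reader, and the target API.

# Conceptual sketch only -- not the Mass Ingestion Databases implementation.
# All object and method names (write_rows, read_change_events, upsert_row,
# delete_row) are hypothetical placeholders.

import time


def run_initial_load(source_conn, target_conn, table):
    # Initial load: read every row of the source table as it exists at a
    # specific point in time. A plain SQL query stands in for the source
    # database API that the ingestion job would use.
    cursor = source_conn.cursor()
    cursor.execute(f"SELECT * FROM {table}")               # point-in-time snapshot
    for batch in iter(lambda: cursor.fetchmany(1000), []):
        target_conn.write_rows(table, batch)               # hypothetical target API call


def run_incremental_load(change_stream, target_conn, stop_requested):
    # Incremental load: continuously capture inserts, updates, and deletes
    # from the source database logs until the job is stopped or ends.
    while not stop_requested():
        for event in change_stream.read_change_events():   # hypothetical CDC reader
            if event.op in ("insert", "update"):
                target_conn.upsert_row(event.table, event.row)
            elif event.op == "delete":
                target_conn.delete_row(event.table, event.key)
        time.sleep(1)                                       # wait for new log records

In practice, the initial load runs once to completion, while the incremental load runs as a long-lived process that keeps the target synchronized with the source.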
