Integrate data from Actian Ingres to Snowflake using Matillion

The Actian Ingres to Snowflake connector streamlines data transfer to Snowflake within minutes, ensuring your data remains current without requiring manual coding or complex ETL scripting.


Extracting data from Actian Ingres to Snowflake

Extracting data from Actian Ingres is a crucial step for organizations seeking to modernize their analytics platforms by leveraging cloud-based solutions such as Snowflake. This article provides a comprehensive guide for data professionals migrating from Actian Ingres, particularly those using Matillion for orchestration. We will begin by outlining how to create an identity within Actian Ingres to ensure secure and authorized data access. Next, the article will guide Matillion users through the process of checking for, or obtaining, the necessary JDBC driver to enable connectivity between source and target systems. We will also address important network connectivity considerations to help you establish a reliable and secure connection from your Actian Ingres environment (the source) to your Snowflake destination (the target). Finally, this article will explain methods for querying and extracting data—covering both initial full loads and ongoing incremental data extraction—to support efficient, up-to-date data integration. By following these steps, you will be prepared to successfully extract and load data from Actian Ingres into Snowflake for modern analytics and reporting.


What is Actian Ingres?

Actian Ingres is a mature, enterprise-grade relational database management system (RDBMS) designed for high-performance OLTP (Online Transaction Processing) and BI (Business Intelligence) workloads. Originating from the University of California, Berkeley, Ingres has evolved over several decades, providing robust support for ANSI SQL, ACID compliance, and a wide range of commercial deployment options across Linux, UNIX, and Windows platforms. Its modular architecture includes features such as MVCC (Multi-Version Concurrency Control), advanced security, role-based access management, and sophisticated query optimization. Ingres integrates natively with modern connectivity standards like ODBC/JDBC, and it is backed by Actian’s enterprise support, making it suitable for mission-critical applications in sectors including finance, telecommunications, and government.


What is Snowflake?

Snowflake is a cloud-native data platform offering scalable, multi-cluster support for data warehousing, lakes, and analytics. Its unique separation of storage and compute lets organizations elastically scale resources for optimal performance and cost, without query disruption. With native ANSI SQL and support for semi-structured formats like JSON, Avro, and Parquet, Snowflake streamlines diverse data integration and analysis. It features secure data sharing, minimal management, and seamless deployment across AWS, Azure, and Google Cloud. This makes Snowflake ideal for enterprises needing flexible, governed data collaboration in complex, distributed environments.

Why Move Data from Actian Ingres into Snowflake

The Benefits of Transferring Data from Actian Ingres to Snowflake for Enhanced Analytics

A data engineer or architect may wish to copy data from Actian Ingres into Snowflake for several compelling reasons. Actian Ingres often holds valuable historical and operational data crucial to enterprise analytics. However, the true value of this data is realized when it is integrated with information from other sources, enabling richer analysis, cross-system reporting, and advanced data-driven insights. By transferring data to Snowflake, organizations can capitalize on Snowflake’s scalable and performant environment for data integration and analysis, all while minimizing any additional workload on the Ingres system itself. This ensures that operational systems remain unaffected by heavy analytical queries, and, simultaneously, that data teams benefit from Snowflake’s robust data-sharing, transformation, and processing capabilities.

Creating a User in Actian Ingres Database

To create a user in Actian Ingres, you typically work with existing operating system user accounts or configure security within a directory service such as LDAP. By default, Ingres authenticates users against their operating system credentials, so a matching OS account must exist before a database login will succeed. Ingres does also provide a CREATE USER statement, run against the iidbdb master database by a suitably privileged user, for registering users and their privileges within the installation.

Below are the recommended steps to enable a new user to connect to an Actian Ingres database and, if required, assign database-specific permissions.

1. Create an Operating System User

The Ingres server relies on the operating system for authentication (unless using external authentication modules like LDAP). Therefore, you must first create a user account at the OS level.

Linux/Unix:

sudo useradd ingres_user
sudo passwd ingres_user

Windows: Create a new user via Control Panel or using the following command in the Command Prompt (as Administrator):

net user ingres_user somePassword /add

2. Grant Database Permissions

Once the operating system user exists, you may grant access to database objects (schemas, tables, etc.) inside the database.

To grant the user database permissions, log on to Ingres as a user with DBA rights and execute SQL GRANT statements.

Example:

```
-- Grant CONNECT privilege to the database
GRANT CONNECT ON DATABASE TO ingres_user;

-- Grant SELECT, INSERT, UPDATE permissions on specific tables
GRANT SELECT, INSERT, UPDATE ON employee_table TO ingres_user;
```

Note: Some installations of Ingres do not require an explicit GRANT CONNECT statement to allow a user to connect to the database. Privileges may be managed at the table or schema level as appropriate.

3. (Optional) Configure Roles

Ingres supports roles for managing user privileges as a group. You can create a role and assign it to users.

Example:

CREATE ROLE reporting_role;
GRANT SELECT ON reports TO reporting_role;
GRANT reporting_role TO ingres_user;

4. Login Verification

The user can now access the database using their corresponding OS credentials:

```
sql ingresdb -U ingres_user
```

or, if already logged into the OS as ingres_user:

```
sql ingresdb
```


References:
- Actian Ingres Security Guide
- GRANT Statement—Actian SQL Reference

Installing the JDBC driver

At the time of writing, the JDBC driver for the Actian Ingres database is not bundled with Matillion Data Productivity Cloud. This is due to licensing or redistribution restrictions around the use and distribution of the Ingres JDBC driver. As a result, you will need to manually obtain and install the driver before you can connect Matillion to your Ingres instance.

To install the driver into Matillion Data Productivity Cloud, follow these steps:

  1. Download the JDBC Driver
     • Navigate to the Actian Ingres product page.
     • Locate the downloads section for the database and select the JDBC driver.
     • Where possible, choose the Type 4 JDBC driver option, as it is pure Java and the most compatible with cloud platforms such as Matillion.

  2. Review Licensing Terms
     • Before downloading and distributing the driver, ensure that you comply with Actian's licensing terms as presented on their website.

  3. Locate the Downloaded Driver File
     • Once downloaded, you should have a .jar file, which is the driver package. This file will be uploaded to your Matillion agent.

  4. Upload the Driver to Matillion Data Productivity Cloud
     • Follow the official installation instructions for uploading external JDBC drivers. In summary, this involves accessing your Matillion agent configuration and uploading the JDBC driver .jar file. Refer to Matillion's documentation for any platform-specific details or prerequisites.

  5. Configure and Use the Driver
     • Once the driver is installed, you can connect to your Ingres database from within Matillion. For guidance on using the driver and configuring database queries, consult the Usage instructions in the official documentation.

By following these steps, you can enable JDBC connectivity to Actian Ingres within Matillion Data Productivity Cloud.
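Once the driver is in place, Matillion will need a JDBC connection URL. The Ingres Type 4 driver accepts URLs of the form jdbc:ingres://host:port/database. The sketch below is a minimal helper for assembling and sanity-checking such a URL; the host name, database name, and port shown are placeholders, and the Data Access Server (DAS) port varies by installation, so confirm yours before use.

```python
# Sketch: assemble an Ingres Type 4 JDBC connection URL for Matillion's
# connection settings. Host, port, and database below are placeholders;
# check your own installation for the correct DAS port.
def ingres_jdbc_url(host: str, port: int, database: str) -> str:
    """Build a Type 4 Ingres JDBC URL: jdbc:ingres://host:port/database."""
    if not host or not database:
        raise ValueError("host and database are required")
    if not (0 < port < 65536):
        raise ValueError(f"port out of range: {port}")
    return f"jdbc:ingres://{host}:{port}/{database}"
```

For example, ingres_jdbc_url("ingres.example.com", 27839, "salesdb") yields a URL you can paste into the component's connection settings alongside the uploaded driver.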

Checking network connectivity

To ensure reliable connectivity between Matillion Data Productivity Cloud and your Actian Ingres database, you must configure the database to allow incoming connections from the appropriate sources. The specific configuration required depends on your deployment type:

Full SaaS Agent Configuration:
If you are using the Full SaaS agent, you must allow incoming connections to your Actian Ingres database from the IP addresses listed in the following documentation: Matillion Data Productivity Cloud Allowed IP Addresses. Make sure these IP addresses are permitted through your database server’s firewall and any network security groups.

Hybrid SaaS Deployment:
For Hybrid SaaS deployments, the Matillion agent running in your environment connects to the Actian Ingres database. In this scenario, you must allow incoming connections from your own Virtual Private Cloud (VPC). You can use network access checking utilities found here: Matillion Network Access Utilities to assist with verifying network reachability.

DNS Resolution:
If you are referencing your Actian Ingres database using a DNS name, ensure that either the Full SaaS agent or your Hybrid SaaS agent can resolve this DNS address to the correct IP address. Failure to resolve the DNS name may result in connectivity issues between Matillion Data Productivity Cloud and your database.
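A quick way to confirm both DNS resolution and port reachability from the agent's environment is a simple TCP probe. This is a minimal sketch; the host and port you pass in should be your own Ingres server and its DAS port, which are environment-specific.

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if DNS resolves and a TCP connection to host:port succeeds."""
    try:
        # create_connection performs the DNS lookup and the TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers gaierror (DNS failure), timeouts, and refusals
        return False
```

Run this from the same network location as the agent; a False result points to a firewall rule, security group, or DNS problem rather than a driver or credential issue.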

Querying Data from Actian Ingres Database

This guide provides technical instructions for querying data from an Actian Ingres database, with an emphasis on typical SQL SELECT statements, data type considerations (e.g., when extracting to Snowflake), and strategies for performing both initial and incremental data loads.


1. Example SQL SELECT Statements in Actian Ingres

Here are typical SQL queries you can run in Actian Ingres:

```
-- Retrieve all columns from the CUSTOMERS table
SELECT * FROM customers;

-- Retrieve specific columns with filtering
SELECT customer_id, name, city
FROM customers
WHERE city = 'New York';

-- Join example: ORDERS and CUSTOMERS
SELECT o.order_id, c.name, o.order_date
FROM orders o
JOIN customers c ON o.customer_id = c.customer_id
WHERE o.order_date >= DATE('2024-06-01');
```

Note: Actian Ingres SQL syntax closely follows ANSI SQL, but consult Actian documentation for advanced features.


2. Data Type Conversion to Snowflake

When moving data from Actian Ingres to Snowflake, datatype conversion is important. Some datatype mappings to consider:

| Actian Ingres | Snowflake Equivalent |
| --- | --- |
| INTEGER | NUMBER |
| FLOAT, FLOAT8 | FLOAT |
| CHAR, VARCHAR | VARCHAR |
| DATE | DATE |
| MONEY | NUMBER or VARCHAR* |
| BYTE | BINARY |

* MONEY: Consider precision and formatting; may require cast/conversion.

Always validate data during ETL to ensure type fidelity.
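The mapping table above can be captured directly in code when generating Snowflake DDL from Ingres column metadata. The sketch below follows the mappings in this article; the NUMBER(14,2) precision chosen for MONEY is an assumption and should be validated against your actual data.

```python
# Sketch of the Ingres -> Snowflake type mapping, for generating Snowflake
# DDL from Ingres column metadata. Extend as needed for your schema.
INGRES_TO_SNOWFLAKE = {
    "INTEGER": "NUMBER",
    "FLOAT": "FLOAT",
    "FLOAT8": "FLOAT",
    "CHAR": "VARCHAR",
    "VARCHAR": "VARCHAR",
    "DATE": "DATE",
    "MONEY": "NUMBER(14,2)",  # assumed precision; review for your data
    "BYTE": "BINARY",
}

def snowflake_type(ingres_type: str) -> str:
    """Map an Ingres column type name to its Snowflake equivalent."""
    try:
        return INGRES_TO_SNOWFLAKE[ingres_type.upper()]
    except KeyError:
        raise ValueError(f"no mapping defined for Ingres type: {ingres_type}")
```

Raising on unknown types, rather than silently defaulting to VARCHAR, surfaces unmapped columns during ETL validation instead of at query time in Snowflake.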


3. Full and Incremental Loads Using Database Query Component

The best practice for bulk loading data utilizes a two-phase approach, outlined below. For overview and detailed patterns, see: Incremental Load and Data Replication Strategy

3.1 Initial (Full) Load

Perform a one-off full data extraction using a query with no filter, such as:

SELECT *
FROM orders;

  • Usage: All records are retrieved; used to populate target tables initially.
  • Database Query component: No WHERE clause.

3.2 Incremental Load

For subsequent data loads, fetch only records added or changed since the last load (e.g., using a modification timestamp or incrementing ID).

SELECT *
FROM orders
WHERE order_date > '2024-06-01 00:00:00';

  • Usage: Only records with order_date newer than the last sync are retrieved.
  • Database Query component: Contains a filter clause (e.g., WHERE order_date > ?).
  • Parameterization: Use job variables to dynamically insert the "last run" timestamp or ID.

Example with a variable for incremental filter:

SELECT *
FROM orders
WHERE modified_at > '${PREVIOUS_RUN_TIMESTAMP}';

Reference for technique adoption: Matillion Exchange Article


Note: Use the same Database Query component object for both initial and incremental loads. Modify only the SQL filter for incremental operation. This streamlines maintenance and configuration.
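To make the single-component pattern concrete, the sketch below derives the component's SQL from a stored high-watermark value, in the spirit of the ${PREVIOUS_RUN_TIMESTAMP} job variable shown earlier. The table and column names are the placeholder examples from this article, not fixed identifiers.

```python
from typing import Optional

def build_extract_query(table: str, watermark_col: str,
                        last_run: Optional[str]) -> str:
    """Return a full-load query when no watermark exists yet,
    otherwise an incremental query filtered on the watermark column."""
    base = f"SELECT * FROM {table}"
    if last_run is None:
        return base  # initial full load: no WHERE clause
    # In production, bind last_run as a query parameter rather than
    # inlining it, to avoid quoting and injection issues.
    return f"{base} WHERE {watermark_col} > '{last_run}'"
```

On the first run (no stored watermark) this produces the unfiltered full-load query; on later runs it produces the incremental variant, so one component definition serves both phases.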

Data Integration Architecture

Loading data into Snowflake in advance of integration offers a practical way to divide and conquer the complexities of data integration by separating it into two distinct phases: first, loading the raw data (Extract and Load), and then transforming it as needed (Transform). This separation is a primary advantage of the ELT (Extract, Load, Transform) architecture, as it enables organizations to handle larger volumes of data more flexibly. Data integration inherently requires transforming disparate datasets into a consistent, usable format—a process best managed through structured data transformation pipelines. With the ELT approach, these transformations and integrations take place directly within the Snowflake database, leveraging its powerful processing engine. As a result, transformations are faster, occur on demand, and are easily scalable according to workload requirements, without the need for investing in separate or expensive data processing infrastructure.

Get started today

Matillion's comprehensive data pipeline platform offers more than point solutions.