Using the Dynamics NAV Query Component in Matillion ETL for Snowflake
Matillion uses the Extract-Load-Transform (ELT) approach to deliver quick results for a wide range of data processing purposes: everything from customer behaviour analytics and financial analysis to reducing the cost of synthesising DNA.
The Dynamics NAV Query component in Matillion ETL for Snowflake presents an easy-to-use graphical interface, enabling you to pull data from Microsoft Dynamics NAV directly into Snowflake. Customers use it to bring their ERP data into Snowflake and combine it with data from other sources for a complete view of their business.
The connector is completely self-contained: no additional software installation is required. It’s within the scope of an ordinary Matillion license, so there is no additional cost for using the features.
The first step in configuring the Dynamics NAV Query component is to give Matillion your Dynamics NAV system’s URL, which allows Matillion to connect to it. Include the port number at the end of the URL:
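As a quick sanity check before pasting the URL into the component, you can confirm it carries an explicit port. This is a minimal sketch; the hostname, service path, and port below are illustrative assumptions (7048 is a commonly seen default for NAV OData web services), not values from your system.

```python
from urllib.parse import urlparse

# Hypothetical NAV endpoint: host, port and service path are illustrative only.
NAV_URL = "http://navserver.example.com:7048/DynamicsNAV/OData"

def has_explicit_port(url: str) -> bool:
    """Return True when the URL includes an explicit port number."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and parsed.port is not None

print(has_explicit_port(NAV_URL))  # → True
```

If this prints False, add the port (e.g. `:7048`) to the end of the host before entering the URL into Matillion.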
Next, Matillion needs credentials to access Dynamics NAV. You provide access by authenticating as a Dynamics NAV user; the selected user must have permission to read the required data. The Username and Password are entered into the component:
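Matillion handles the authentication handshake for you, but it can help to see what credential-based HTTP authentication looks like under the hood. The sketch below builds a standard HTTP Basic Authorization header (RFC 7617); it assumes a NavUserPassword-style setup, and the username and password shown are placeholders, not real credentials.

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build an HTTP Basic Authorization header value (RFC 7617)."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

# Placeholder credentials for illustration only.
header = basic_auth_header("nav_reader", "s3cret")
print(header)
```

Whichever scheme your NAV deployment uses, the principle is the same: the component sends the user’s credentials with each request, so the user’s read permissions determine what data can be extracted.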
This is the NAV Server instance name to connect to:
Next, choose the data you want to load into Snowflake from the Data Source drop-down, which lists the tables in the Dynamics NAV system:
After choosing the data source, the next step is to choose the required columns from the table in the Data Selection. These columns form the new table that is created in Snowflake.
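Conceptually, Data Selection is a projection: only the chosen columns travel into Snowflake. This toy sketch (sample records and column names are invented for illustration) shows the effect:

```python
def select_columns(rows, columns):
    """Keep only the chosen columns from each record, mirroring Data Selection."""
    return [{col: row.get(col) for col in columns} for row in rows]

# Invented sample records standing in for a NAV table.
rows = [
    {"No": "C001", "Name": "Contoso", "Phone": "555-0100", "Balance_LCY": 1200.0},
    {"No": "C002", "Name": "Fabrikam", "Phone": "555-0101", "Balance_LCY": 300.0},
]
projected = select_columns(rows, ["No", "Name", "Balance_LCY"])
```

Selecting only the columns you need keeps the staged data smaller and the resulting Snowflake table tidier.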
Data Source Filter
You can add a filter to the data you are bringing through into Snowflake using the Data Source Filter.
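The filter conditions you define in the component are pushed to the source so that fewer rows are transferred. For an OData-backed service, conditions of this kind correspond roughly to `$filter` expressions; the sketch below builds one from (field, operator, value) triples. Field names and values are hypothetical, and the exact translation Matillion performs may differ.

```python
def odata_filter(conditions):
    """Combine (field, op, value) triples into one OData-style $filter expression.
    Operators follow OData conventions: eq, ne, gt, ge, lt, le."""
    clauses = []
    for field, op, value in conditions:
        if isinstance(value, str):
            value = f"'{value}'"  # OData string literals are single-quoted
        clauses.append(f"{field} {op} {value}")
    return " and ".join(clauses)

expr = odata_filter([("Document_Type", "eq", "Order"), ("Amount", "gt", 1000)])
print(expr)  # → Document_Type eq 'Order' and Amount gt 1000
```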
The driver supports some additional parameters you may want to explore. However, none of the Connection Options are mandatory and the Dynamics NAV driver usually gives you sensible defaults. Should you wish to explore your Connection Options, further details on the options are available here.
Running the Dynamics NAV Query component in Matillion ETL for Snowflake
Before you can run the component, you need to name the Target Table: the new table in Snowflake that the data is written into. Finally, you need to specify an S3 Staging Area, an S3 bucket that temporarily stores the results of the query before the data is loaded into Snowflake.
This component also has a Limit property, which you can use to cap the number of records returned. We recommend using either a limit or a filter to reduce the number of rows returned and improve the speed of your job.
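In OData terms, a row cap maps naturally onto the `$top` query option, alongside `$filter` for conditions. This hedged sketch assembles both into a query string with the standard library; the filter expression and limit are illustrative, and Matillion builds the real request for you.

```python
from urllib.parse import urlencode

def build_query(filter_expr=None, limit=None):
    """Assemble OData query options; $top caps the record count,
    much like the component's Limit property."""
    params = {}
    if filter_expr:
        params["$filter"] = filter_expr
    if limit:
        params["$top"] = str(limit)
    return urlencode(params)

qs = build_query("Amount gt 1000", limit=500)
print(qs)  # → %24filter=Amount+gt+1000&%24top=500
```

Either mechanism cuts down the volume of data staged and loaded, which is where most of the job’s runtime goes.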
You can run the Orchestration job, either manually or using the Scheduler, to query your data and bring it into Snowflake.
The Dynamics NAV Query component offers an “Advanced” mode as an alternative to the default “Basic” mode. In Advanced mode, you can write a SQL-like query over all the available fields in the data model, which is automatically translated into the correct API calls to retrieve the requested data.
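An Advanced-mode query might look something like the string below. The table and column names are hypothetical and depend entirely on your NAV data model; this is only a sketch of the shape of such a query, shown here as a Python string.

```python
# Hypothetical Advanced-mode query; "Customer", "No", "Name" and
# "Balance_LCY" are illustrative names, not guaranteed to exist in your model.
ADVANCED_QUERY = """
SELECT No, Name, Balance_LCY
FROM Customer
WHERE Balance_LCY > 0
ORDER BY Balance_LCY DESC
"""
print(ADVANCED_QUERY.strip())
```

Advanced mode is handy when the combination of Data Selection and Data Source Filter in Basic mode cannot express the rows and ordering you need.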
Transforming the Data
Once you have brought through the required data from Dynamics NAV into Snowflake, you can use it in a Transformation job, perhaps to combine with existing data:
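The join itself runs in Snowflake as part of the Transformation job, but the idea can be sketched in miniature: enrich each NAV record with a matching attribute from another source. All records and field names below are invented for illustration.

```python
# Toy stand-ins for a NAV extract and a second data source.
nav_customers = [
    {"customer_no": "C001", "balance": 1200.0},
    {"customer_no": "C002", "balance": 300.0},
]
crm_regions = {"C001": "EMEA", "C002": "APAC"}

def enrich(customers, regions):
    """Left-join a region attribute onto each customer by customer number."""
    return [dict(c, region=regions.get(c["customer_no"])) for c in customers]

combined = enrich(nav_customers, crm_regions)
```

In practice you would express this as a Join component in the Transformation job, letting Snowflake do the work at scale.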
In this way, you can build out the rest of your downstream transformations and analysis, taking advantage of Snowflake’s power and scalability.