FTP and HTTP to Redshift integration
- High performance data load
- Simple configuration
- Supports standard and custom objects
- Schedule data loads and transformations; combine with other data, processes and services
FTP/HTTP to Amazon Redshift Load Component
The S3 Put Object component can use a number of common network protocols to transfer data up to an Amazon S3 bucket. In all cases, the source data is specified with a URL.
Currently supported protocols are:
- FTP
- HTTP
When run, the component will upload the contents of the source URL to S3, replacing any existing object with the same name.
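The behaviour described above can be sketched in a few lines of Python. This is a minimal illustration, not the component's actual implementation: the function names (`key_from_url`, `put_object_from_url`) and the bucket argument are hypothetical, and the upload step assumes the `boto3` AWS SDK is available with credentials configured in the environment.

```python
from urllib.parse import urlparse
from urllib.request import urlopen


def key_from_url(source_url: str) -> str:
    """Derive an S3 object key from the last path segment of the source URL."""
    path = urlparse(source_url).path
    return path.rsplit("/", 1)[-1] or "object"


def put_object_from_url(source_url: str, bucket: str) -> str:
    """Fetch the contents of the source URL and store them in the bucket,
    replacing any existing object with the same key. Returns the key used."""
    import boto3  # hypothetical dependency: assumes the AWS SDK is installed

    key = key_from_url(source_url)
    body = urlopen(source_url).read()
    # put_object overwrites silently, matching the component's replace behaviour
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)
    return key
```

Because `put_object` overwrites without warning, the sketch mirrors the component's documented behaviour of replacing any existing object with the same name.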
Features and Benefits
The S3 Put Object component in Matillion ETL for Amazon Redshift delivers fast data load performance and simple configuration, whilst being extensible to the most sophisticated data load and transform requirements.
- Fast – load and transform data into Redshift at high speed
- Powerful query engine – query standard and custom objects using SQL. Define filters and conditions
- Basic and Advanced modes – Basic mode provides simple data selection and filtering; Advanced mode provides a powerful SQL-like interface to query the database
- Secure – SSL encryption for in-flight data. Authenticate with a database using username/password and token, or OAuth
- Implements Redshift best-practice – automatically sets column encoding settings on data loaded from a database. Specify distribution style (Key, Even or All) and Compound/Interleaved sort keys
- Standalone data load or sophisticated integration – combine database data with data from other databases and systems. Integrate with other AWS services including RDS, S3, SQS and SNS
- Monitor load status – comprehensive logging, audit and alerting features
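The distribution-style and sort-key options mentioned above map directly onto Redshift DDL. As a hedged illustration of what those settings produce, the sketch below builds a `CREATE TABLE` statement with a KEY distribution style and a compound sort key; the helper name `create_table_ddl` and the example table and column names are hypothetical, not part of the product.

```python
def create_table_ddl(table, columns, dist_key=None, sort_keys=()):
    """Build a Redshift CREATE TABLE statement.

    columns: sequence of (name, type) pairs.
    dist_key: optional column for DISTSTYLE KEY distribution.
    sort_keys: optional columns for a compound sort key.
    """
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    ddl = f"CREATE TABLE {table} ({cols})"
    if dist_key:
        # KEY distribution co-locates rows with the same value on one slice
        ddl += f" DISTSTYLE KEY DISTKEY({dist_key})"
    if sort_keys:
        # Compound sort keys speed up range-restricted scans on the leading columns
        ddl += f" COMPOUND SORTKEY({', '.join(sort_keys)})"
    return ddl
```

For example, distributing a sales table on `customer_id` keeps each customer's rows on the same slice, which benefits joins on that column; EVEN or ALL distribution would be chosen instead when no dominant join key exists.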
- Component documentation