
- 97% reduction in time-to-value
- Unlimited scalability aligned resource usage with business demands
- Optimized allocation of computing resources
Challenge
Facing increased demand for data, Old Mutual had accumulated a data delivery backlog spanning several months.
Existing ETL pipelines coded in PySpark and Python led to lengthy development cycles, delaying the delivery of solutions to the business.
The data team needed a sustainable approach to managing technical debt and an efficient way to ramp up their data talent. They also needed data processing with optimized processing times that could coexist seamlessly within their AWS stack.
Solution
Old Mutual accelerated pipeline development, constructing 18 tables in less than two weeks. Matillion's template capabilities streamlined development further by enabling code reuse, and the platform's extensive library of transformation components, including calculation, pivot, and union, significantly reduced development time and promoted reusability.
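The template-driven reuse can be pictured with a plain-code analogy. Matillion jobs are built with low-code components rather than hand-written scripts, so the sketch below is only an assumption-laden illustration of the pattern: one parameterized template applied across many tables; all table and column names are invented.

```python
# Illustration only: Matillion transformations are configured with low-code
# components, not hand-written SQL. This sketch just shows the reuse idea:
# a single parameterized "template" rendered for many source tables.
from string import Template

# One template captures a pattern that would otherwise be rewritten per table.
AGG_TEMPLATE = Template(
    "SELECT $group_col, COUNT(*) AS row_count, SUM($value_col) AS total\n"
    "FROM $source_table\n"
    "GROUP BY $group_col"
)

def render_job(source_table: str, group_col: str, value_col: str) -> str:
    """Render the shared template for one source table."""
    return AGG_TEMPLATE.substitute(
        source_table=source_table, group_col=group_col, value_col=value_col
    )

# Reuse the same template across several (hypothetical) tables.
jobs = [
    ("policies", "product_code", "premium"),
    ("claims", "claim_type", "amount"),
]
for table, group_col, value_col in jobs:
    print(render_job(table, group_col, value_col), end="\n\n")
```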
Universal connectivity enabled integration with a wide range of databases, applications, and APIs, while integration with AWS services such as Glue, EMR, SQS, and CloudWatch enhanced overall capabilities.
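To make the AWS-side integration concrete, the sketch below shows one way a pipeline run could be triggered and monitored from Python with boto3 using SQS and CloudWatch. This is not Matillion's API; the queue URL, message format, metric namespace, and job names are all hypothetical.

```python
# Hypothetical sketch: queue a pipeline-run request on SQS and publish the
# job's duration as a custom CloudWatch metric. Not Matillion's actual API;
# the queue URL, message shape, and namespace are invented for illustration.
import json
import boto3

sqs = boto3.client("sqs")
cloudwatch = boto3.client("cloudwatch")

QUEUE_URL = "https://sqs.af-south-1.amazonaws.com/123456789012/pipeline-triggers"  # placeholder

def trigger_pipeline(job_name: str) -> str:
    """Send a message that a downstream listener could use to start the named job."""
    response = sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"job": job_name, "action": "run"}),
    )
    return response["MessageId"]

def record_duration(job_name: str, seconds: float) -> None:
    """Publish the job's run time so it can be tracked and alerted on in CloudWatch."""
    cloudwatch.put_metric_data(
        Namespace="ETL/Pipelines",  # illustrative namespace
        MetricData=[{
            "MetricName": "JobDurationSeconds",
            "Dimensions": [{"Name": "JobName", "Value": job_name}],
            "Value": seconds,
            "Unit": "Seconds",
        }],
    )

if __name__ == "__main__":
    print("Queued run:", trigger_pipeline("daily_policy_load"))
    record_duration("daily_policy_load", 1650.0)
```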
Results
- Data processing and solution delivery were significantly optimized, reducing processing time from days to hours.
- 40 tables, including 8 tables with almost 1000 fields, processed in less than 2 hours.
- Solutions that used to take considerable time to develop and deploy were now delivered instantly.
- What used to take 5 hours of processing time could now be completed in less than 30 minutes.
- Optimized allocation of computing resources.
- Unlimited scalability aligned resource usage with business demands while maintaining cost-effectiveness.