Job Description
Johannesburg | R900 per hour | 12-month contract
This is an excellent opportunity for a Data Engineer to design, build, and optimize large-scale data pipelines and cloud-based data platforms that underpin investment decision-making, risk management, and regulatory reporting.
THE COMPANY
This company is one of South Africa’s largest investment managers, responsible for managing more than R600 billion in assets for both institutional and retail clients. Their focus is on delivering long-term, sustainable returns through innovation, discipline, and a client-first approach. With a strong presence locally and across key African markets, they are investing heavily in modern data platforms and engineering capabilities to strengthen their trading, risk, and analytics functions.
THE ROLE
As a Data Engineer, you will build and maintain the data ecosystem that supports the firm’s trading applications. Your focus will be on designing ETL/ELT pipelines that ingest data from multiple trading systems (such as Alchemy, Murex, and market data feeds) into Snowflake and enterprise data lakes, ensuring accuracy, performance, and scalability. You’ll also work on API-led integrations (leveraging MuleSoft and custom Python/Java services) to enable seamless data flow between applications, while embedding the governance and security controls required in a regulated financial environment.
- Design, build, and optimize data pipelines for ingestion, transformation, and delivery of financial datasets.
- Model and manage Snowflake data warehouses and cloud-based data lakes.
- Develop APIs and integration solutions for downstream systems, risk engines, and reporting platforms.
- Monitor and optimize pipeline performance to handle large, complex data volumes for valuations and risk reporting.
- Enforce data governance, lineage, and security policies, including data masking and access controls.
- Collaborate with business stakeholders, analysts, and reporting teams to ensure data meets evolving business and regulatory requirements.
THE REQUIREMENTS
- At least 5 years’ experience in Data Engineering (financial services experience highly advantageous).
- Strong SQL skills, with experience across both relational and NoSQL databases.
- Expertise in ETL/ELT pipeline design using tools such as Airflow, Talend, or custom scripting.
- Proficiency in Python, Java, or Scala for building integrations and transformations.
- Experience with Snowflake or other cloud data warehouses (e.g. Amazon Redshift, Google BigQuery).
- Solid knowledge of data modeling, data lakes, and dimensional warehousing methodologies.
- Exposure to big data frameworks such as Spark or Hadoop.
- Familiarity with MuleSoft or other API integration platforms.