Data Engineer
Unifi
Stellenbosch, Western Cape
Permanent
Posted 26 August 2025 - Closing Date 09 September 2025

Job Description

About Unifi

Unifi is redefining credit in Africa with simple, fast personal loans delivered through online, mobile and branch channels. We make life easy for thousands of clients across Zambia, Kenya, Uganda and South Africa. Unifi has conviction in the African continent and its people, and our products enable our clients to achieve even more. As one of the fastest-growing lenders in East Africa, we combine exceptional client service with the very best tech and data analytics.

Learn more about Unifi at:

www.unifi.credit/about

https://www.youtube.com/watch?v=eUrwaPmzU5E

https://www.youtube.com/watch?v=vl32BzxAHfA&t=26s

https://www.youtube.com/watch?v=_Gf1ZiFpfSc&t=188s 

About the role

Unifi is on the lookout for a talented Data Engineer with strong expertise in Google Cloud Platform (GCP) to join our fast-growing team. In this role, you’ll design, build, and maintain scalable data pipelines and architectures that power our business. You’ll collaborate closely with data scientists and analysts to ensure seamless data flow across the organisation, enabling smarter decisions and impactful solutions.

We’re looking for someone who is analytically sharp, self-motivated, and thrives in an unstructured environment. A genuine passion for African business is a must—along with a healthy sense of adventure and a good sense of humour to match our dynamic culture.

Responsibilities

  • Design and build scalable data pipelines and architectures using GCP technologies such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
  • Develop and manage ETL processes to transform diverse data sources into clean, structured formats for analysis and reporting (a minimal sketch follows this list).
  • Partner with data scientists and analysts to understand their needs and deliver solutions that enable insights and decision-making.
  • Create and maintain documentation for data pipelines, architecture, and data models to ensure clarity and consistency.
  • Troubleshoot and resolve data-related issues quickly to minimise disruption.
  • Continuously optimise data pipelines for performance, scalability, and cost efficiency.
  • Automate workflows and processes through scripts and tools that streamline operations.
  • Safeguard data quality and integrity across all sources, pipelines, and platforms.
  • Stay ahead of the curve by keeping up with new GCP tools, best practices, and data engineering trends.
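
To make the responsibilities above concrete, here is a minimal, illustrative sketch of one everyday ETL step: loading a CSV extract from Cloud Storage into BigQuery using the google-cloud-bigquery Python client. This is not a Unifi implementation; the project, bucket, dataset and table names are hypothetical placeholders.

```python
# Illustrative ETL step: load a CSV extract from Cloud Storage into BigQuery.
# All resource names below are hypothetical placeholders, not Unifi systems.
from google.cloud import bigquery


def load_csv_to_bigquery(
    uri: str = "gs://example-bucket/loans/2025-08-26.csv",   # hypothetical file
    table_id: str = "example-project.lending.loan_events",   # hypothetical table
) -> None:
    # Uses default credentials and project from the environment.
    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the CSV header row
        autodetect=True,      # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Start the load job and block until it completes.
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()

    table = client.get_table(table_id)
    print(f"Loaded table now has {table.num_rows} rows: {table_id}")


if __name__ == "__main__":
    load_csv_to_bigquery()
```

In practice a step like this would sit inside an orchestrated, monitored pipeline (for example behind Dataflow or a scheduler) rather than run as a standalone script, with schema management and data-quality checks layered on top.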

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
  • 5+ years’ experience as a Data Engineer or in a similar role.
  • Strong programming skills in Python and SQL, backed by hands-on experience with BigQuery and the wider GCP stack.
  • Proven expertise in ETL development and data modelling.
  • Familiarity with data lakehouse concepts and techniques.
  • Excellent problem-solving, analytical, and critical-thinking skills.
  • Strong communication and collaboration abilities.
  • Deep experience with Google Cloud Platform (GCP) technologies, especially BigQuery; additional exposure to Dataflow, Pub/Sub, and Cloud Storage is highly beneficial.
  • Background in financial services would be an added advantage.