Data Engineer Vacancies at Unitrans

Full Time

Website Unitrans

Unitrans Africa is a large, multi-disciplinary services provider with more than 100 years of experience serving a variety of industries across sub-Saharan Africa.

Job Advert Summary

  • We are seeking a Data Engineer with expertise in data architecture and modelling to drive the design and optimisation of our enterprise data platform in a dynamic logistics environment. This role blends hands-on engineering with shaping the data platform architecture, ensuring scalability, data quality, and strategic alignment with business objectives.
  • The incumbent will be responsible for building and refining enterprise data models that unify information from diverse sources, including internal systems, ERP platforms, IoT sensor streams, and third-party data, ensuring that our platform delivers high-quality, accessible, and trusted data.
  • The focus will span across data modelling, integration and architecture, ensuring that the platform can support real-time analytics, predictive insights and machine learning at scale. This includes designing efficient schemas, implementing semantic layers, centralising business logic, and building data pipelines that adhere to a medallion architecture within Azure Synapse Analytics.
  • Furthermore, the incumbent will collaborate with analytics and business stakeholders to translate complex requirements into performant, robust, and future-proof data solutions, develop indexing and partitioning strategies, optimise data models for query efficiency, and contribute to the creation of feature stores that power analytics and machine learning applications.

Minimum Requirements

Experience and Qualification

  • Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent proven capability in working with enterprise data systems.
  • Experience with cloud data platforms (Azure preferred).
  • Strong understanding of cloud data solutions and architectures, and data modelling techniques, including dimensional and relational modelling.
  • Azure Data Engineer Associate certification (DP-203) preferred.

Skills and Attributes

  • Highly proficient in SQL, with experience building and optimising complex queries for data modelling.
  • Highly proficient in Python/PySpark and scripting for data operations.
  • Knowledge of open-source orchestration tools such as Apache Airflow.
  • Experience in process design and implementation.
  • Proficiency in designing and maintaining data schemas and keys, including primary, foreign, and composite keys.
  • Familiarity with data modelling best practices and indexing strategies in cloud data platforms.
  • Familiarity with on-premises SQL Server databases and with migration processes to cloud platforms.
  • Familiarity with Qlik Sense or similar reporting and analytics tools.
  • Familiarity with enterprise systems and working with multiple databases.
  • Strong problem-solving skills and a demonstrated ability to learn new technologies independently.
  • Proactive approach to problem solving.
  • Ability to work effectively within various teams and environments, as well as independently.
  • Excellent communication skills.

Duties & Responsibilities

  • Monitor, maintain, and improve data pipelines, data models, and schemas in Azure Synapse Analytics.
  • Design, develop, and optimise data models in a medallion architecture to support analytics and reporting needs.
  • Work with third-party consultants to implement the data lake architecture.
  • Build and maintain data pipelines to facilitate the smooth flow of data between various systems and databases.
  • Ensure consistency in data model design, including proper use of keys, indexing, and semantic layers to improve query performance.
  • Provision datasets for consumption-side users, focusing on consistency and data integrity.
  • Migrate business logic from Qlik Sense to Azure Synapse Analytics.
  • Migrate on-premises data systems to the Azure cloud.
  • Test and maintain new and existing API connections.
  • Contribute to and expand a data taxonomy to ensure consistency and clarity in data organisation.
  • Implement data security protocols across the entire data environment.
  • Drive focused improvement initiatives for existing processes, systems, data, and reports.
  • Drive focused improvement of cross-functional interactions within the organisation.
