Intermediate Data Engineer Job Vacancy at Sanlam Group

Full Time

Sanlam Group is a diversified financial services company based in South Africa, with a strong pan-African and international presence. Founded in 1918, it has grown into one of Africa’s largest non-banking financial services groups, offering a comprehensive range of financial products and services. The group’s activities are managed through several clusters, including Sanlam Life and Savings, Sanlam Emerging Markets (now part of SanlamAllianz), Sanlam Investments, Sanlam Corporate, and Santam.

Role Description

  • Guided by Architecture and a Technical Team Lead, you will be responsible for establishing new technology components and reusable solution patterns that can be leveraged by business-facing development teams in their day-to-day solutions. You will continually develop and set new principles, standards, processes, procedures and guidelines for the wider BI community.
  • You should be able to communicate technical information to technical teams, and be competent in communicating challenges and solutions to project and operational leadership. An understanding of data management solutions and a keen sense of the strategic value of information to an organisation will be important.
  • You will also be responsible for developing data warehousing blueprints, evaluating hardware and software platforms, integrating systems, and translating business needs into long-term architecture solutions.

What will you do?

Data Engineer Role points to include:

  • Working with other data engineers and data modellers, you will design, implement and manage ETL programmes, data transformations and data pipelines that realise source-to-target mappings.
  • Monitor and fine-tune data pipelines, data vaults and data transformations using SQL, Spark and the Cloudera Hadoop stack
  • Manage system access to the data warehouse to ensure data security and integrity
  • Monitor and support the operational processes and propose optimisations

What will make you successful in this role?

Qualifications:

  • Bachelor’s degree in Computer Science, Statistics, Informatics, Information Systems, Engineering or another quantitative field / National Diploma in an Information Technology related discipline preferred

Experience and skills:

  • 3-5 years' related experience
  • Application and data engineering experience with a solid grounding in SQL is required
  • Experience in an operational support role and a passion for improving / optimising processes
  • Data architecture design and delivery experience preferred
  • Experience in three (3) or more of the following areas is required:
      • Database technologies (e.g. SAP HANA, Big Data or similar) and database development (Views, Functions and Stored Procedure development)
      • Hadoop components including HDFS, Hive, Spark, Oozie and Impala
      • ETL tools
      • Data warehousing (Kimball and Data Vault patterns are preferred) and dimensional data modelling
