Nintex
Nintex helps companies unlock the power of endless possibilities. We believe that by putting intelligent automation in the hands of many, organizations can eliminate friction from work, inspiring teams to create a culture of efficiency that accelerates business success. Efficiency creates momentum, driving people, work, and businesses forward.
About the role:
- You are responsible for ensuring the development of high-quality data engineering products and services, delivered on time and to acceptance criteria.
- You are a respected technical leader within your team and demonstrate ownership and technical excellence. Your teammates seek your advice to resolve complex technical problems. Your focus is on meeting your team's requirements for data engineering solutions, and you have a vision for emerging data engineering trends and how to apply them in novel ways to the most complex problems on your team.
- You will be joining a team of highly motivated individuals with a strong growth mindset, both professionally and personally! The team is currently rolling out a new Data Platform for all of Nintex using the latest cloud technologies.
- This role is hybrid in Johannesburg.
Your contribution will be:
- You identify and pursue opportunities to create or improve organizational data standards.
- You design and implement data engineering solutions that adhere to Nintex guidelines in support of security, disaster recovery, scalability, availability, reliability, and durability.
- You design, code, and unit test data engineering components and features that can be released / deployed through automation.
- You make informed recommendations about the data engineering technology appropriate for implementing a solution.
- You improve data quality through testing, tooling and continuous evaluation of the data engineering process and product.
- You work collaboratively with data analysts and data scientists to create valuable insights from complex data.
- You contribute to the support and maintenance of products or services, including liaising with the support team to resolve issues that have been escalated to the product team.
- You apply different data pipeline architectures appropriately to meet product requirements (data warehouse, data lake, data lakehouse, etc.).
- You advise on the implications of different distributed storage and computing technologies when applied to complex data engineering requirements.
- You contribute to the hygiene processes to improve and harden data delivery within the team.
- You are an active part of the incident management process, including participating in the on-call rotation and unblocking technical and operational decisions.
- You work with data providers to document the incoming data stream’s domain boundaries, field datatypes, and term definitions in sufficient detail to enable transformation of the data for end-user consumption.
- You contribute to organizational data standards and architectural patterns and practices for the data domain.
To be successful, we think you need:
- 5+ years of experience in data engineering
- Bachelor’s degree in computer science
- Cloud platform, e.g. AWS / Azure / GCP
- Cloud DBMS, e.g. Databricks / Snowflake / Amazon Redshift / BigQuery
- Distributed data frameworks, e.g. Hadoop / Spark / Flink
- Programming, e.g. Java / Scala / Python
- SQL, any variant is a good foundation, but HiveQL / Spark SQL is preferred
- Database modeling, e.g. Snowflake Schema / Star Schema / Data Vault
Additional skills (advantageous):
- Streaming and event-driven systems, e.g. Apache Kafka / Azure Event Hubs / Spark Structured Streaming
- Data testing, e.g. unit tests and quality enforcement
- DevOps platform, e.g. Azure DevOps / GitHub
- Scripting, e.g. Bash / PowerShell
- Infrastructure as code, e.g. Terraform / ARM templates / AWS CloudFormation