Pioneers in harnessing the power of Artificial Intelligence (AI) to deliver actionable insights and drive transformative change for businesses worldwide, whose vision is to provide next-generation actionable insights to every company, globally, by harnessing the value of data through high-performance, interoperable and simplified solutions, are currently looking for an AWS Data Engineer to join their fast-paced and dynamic team!
Responsibilities:
- Design, build and operationalise large-scale enterprise data solutions and applications
- Analyse, re-architect and re-platform on-premise data warehouses to data platforms on AWS cloud using AWS or 3rd-party services and Kafka CC
- Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, PySpark, Scala and Kafka CC
- Design and implement data engineering, ingestion and curation functions on AWS cloud using AWS-native or custom programming
- Perform detailed assessments of current-state data platforms and create an appropriate transition path to AWS cloud
- Design, implement and support an analytical data infrastructure providing ad-hoc access to large datasets and computing power
- Interface with other technology teams to extract, transform and load data from a wide variety of data sources using SQL, AWS big data technologies and Kafka CC
- Create and support real-time data pipelines built on AWS technologies including Glue, Lambda, Step Functions, PySpark, Athena and Kafka CC
- Continually research the latest big data and visualisation technologies to provide new capabilities and increase efficiency

Qualifications and experience:
- Bachelor's Degree in Computer Science, Information Technology or another relevant field
- 5-8 years' experience
- Experience in any of the following: AWS Athena and Glue, PySpark, DynamoDB, Redshift, Lambda, Step Functions and Kafka CC
- Proficient in AWS Redshift, S3, Glue, Athena, PySpark, Step Functions, Glue Workflows, Kafka CC and Delta.io
- Knowledge of software engineering best practices across the development lifecycle
- Experience with big data tools is a must: Delta.io, PySpark, Kafka, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift

The Reference Number for this position is NG59270. This is a Contract, Hybrid position based in Braamfontein, Johannesburg, offering a salary of R900k to R1.1mil cost to company per annum.