Santam Ltd has a vacancy within the Group Underwriting division for a Data Engineer. The role reports to the Head: Data Analytics and will ideally be based in Bellville, Cape Town.
What will make you successful in this role?

Qualifications & Experience
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or Science/Statistics/Applied Statistics/Applied Mathematics (postgraduate qualification preferable)
- 5-7 years' practical experience in data engineering, BI/BA, ETL development, an analytical/quantitative environment, or another relevant field

Skills
- Strong programming skills (SQL essential; SAS, Python and/or R highly desirable; Spark, Java)
- Data Pipeline Development: Design, build, and maintain efficient, scalable data pipelines using SAS, SQL, and cloud technologies. Ensure smooth extraction, transformation, and loading (ETL) of data from multiple sources into Snowflake or other data repositories.
- Data Modeling: Create and optimize data models to support underwriting analytics and machine learning initiatives using tools such as SAS Viya and SQL.
- Cloud Data Integration: Implement and manage cloud-based data solutions, particularly with Amazon S3 and Snowflake, ensuring seamless integration of cloud services for data storage, processing, and analysis.
- SAS Viya Expertise: Leverage SAS Viya to process and analyze large datasets and support data-driven decision-making across the organization.
- Software Development Lifecycle (SDLC): Apply SDLC best practices to manage data engineering projects, including version control, testing, and continuous integration (CI) and deployment (CD) processes.
- Automation and Optimization: Automate data processing workflows and optimize ETL pipelines for performance, ensuring reliable and timely delivery of data to users and applications.
- Collaboration: Work closely with cross-functional teams, including data scientists, business analysts, and stakeholders, to understand business needs and translate them into technical solutions.
- Data Governance and Quality: Implement data governance frameworks to ensure data integrity, quality, and compliance. Regularly monitor and maintain data pipelines to prevent errors and ensure high availability of data.

Competencies
- Communication
- Problem-solving
- Creativity
- Client Focus
- Drives results
- Flexible and adaptable
- IT Data Analysis
- Data Collection
- Advanced analytics to address business requirements