Job Description
Deloitte is a global leader in audit, assurance, consulting, financial advisory, risk advisory, tax, and related services. Operating in more than 150 countries with over 457,000 professionals, we serve four out of five Fortune Global 500® companies, helping them make an impact that matters.
At Deloitte Consulting, we bring innovation and transformation by combining business strategy and technology expertise. Our AWS-focused team is at the forefront of AI-enabled technologies, data, and cloud solutions—helping clients harness insights and deliver measurable outcomes.
We are seeking a Data Engineer – AWS to join our growing AI & Data practice in Johannesburg.
The Role
As a Data Engineer – AWS, you will be responsible for building and optimising data pipelines, integrating diverse data sources, and enabling scalable data solutions on cloud platforms. You will work closely with technical leads, business stakeholders, and junior team members to design, develop, and implement solutions that transform data into actionable insights.
This role requires a strong background in data engineering, SQL, cloud databases, and ETL tools, with hands-on AWS experience.
Key Responsibilities
Delivery Leadership
- Define high-level solution designs based on client requirements.
- Create reusable design standards and patterns.
- Rapidly prototype potential solutions to support design trade-off discussions.
- Mentor and train junior team members.
- Conduct code reviews and provide technical guidance.
- Estimate tasks accurately and deliver solutions on time.
Engineering & Solution Design
- Build and optimise data pipelines and integrations across disparate systems.
- Design data models and architectures aligned with client standards.
- Translate complex business requirements into scalable solutions.
- Document technical designs for client product owners.
- Apply DataOps approaches to solution architecture.
Technical Execution
- Work with relational and NoSQL databases (SAP HANA, Teradata, SQL Server, MongoDB, DynamoDB, CosmosDB, Hive).
- Develop database objects: views, functions, stored procedures, indexes, OLAP/MDX queries.
- Implement ETL solutions using tools such as SSIS, IBM DataStage, Informatica, SAP Data Services.
- Program with SQL, Python, Java, Spark, Kafka, RabbitMQ.
- Use big data technologies (Hadoop, HiveQL, Impala, Pig, Oozie, NiFi).
- Design streaming and real-time data solutions.
- Apply methodologies such as Agile, PMBOK, DataOps/DevOps.
Requirements
Education
- Bachelor’s degree in Data Science, Engineering, or a related field (minimum).
- Postgraduate degree in Data Science/Engineering preferred.
- Cloud data certifications (AWS, Azure, GCP) preferred.
Experience
- 3–5 years’ professional experience as a data engineer.
- Strong SQL and database development background.
- Experience designing and implementing ETL/ELT processes.
- Hands-on experience with AWS cloud services.
- Client-facing consulting experience preferred.
Behavioural Competencies
- Strong communication skills (written and verbal).
- Problem-solving mindset with structured approaches.
- Ability to build client trust and long-term relationships.
- Team leadership and mentoring ability.
- Curious, adaptable, and eager to learn new technologies.