Data Engineer
Location
HO/ Corporate Building - 2nd Floor
Closing Date
05/31/2025
Description

Are you passionate about data engineering and eager to work in a dynamic, fast-paced environment? Dialog Axiata PLC is looking for a Data Engineer to join our Group Analytics & AI division. If you have a strong background in ETL operations, data modeling, and BI tool administration, this role is perfect for you!

The Job

  • Design, deploy, and manage Kafka clusters for high-throughput data streaming. 
  • Configure Kafka Connect for data ingestion and integration with various sources. 
  • Ensure scalability, fault tolerance, and monitoring of Kafka pipelines using tools like Confluent Control Center.
  • Build and optimize data models in Snowflake for analytics workloads.
  • Manage Snowflake roles, permissions, and resource monitoring for cost efficiency.
  • Implement data sharing, cloning, and time-travel features for operational efficiency.
  • Develop and maintain ETL jobs using AWS Glue to process structured and unstructured data.
  • Catalog data in AWS Glue Data Catalog for metadata management.
  • Query large datasets using Athena for ad-hoc analysis and reporting.
  • Build and maintain a centralized data lake on AWS (S3) for raw and processed data.
  • Implement partitioning, compression, and security best practices (IAM, encryption).
  • Develop and schedule DAGs in Airflow for orchestrating complex ETL workflows.
  • Monitor and troubleshoot Airflow pipelines for failures and performance bottlenecks.
  • Integrate Airflow with AWS services and external systems for seamless operations.
  • Manage data governance, security, and compliance within the organization.
  • Collaborate with business units and Data Analysts/Scientists to optimize data processing and reporting.
  • Research and adopt emerging technologies to enhance data engineering and platform practices.
Entry Requirements

The Person

  • BSc in Computer Science/Engineering/IT or a related field.
  • 2-3 years of hands-on experience in data engineering and platform management.
  • AWS Certified Solutions Architect - Associate certification (mandatory).
  • Proficiency in Confluent Kafka, Snowflake, AWS Glue, Athena, and Apache Airflow.
  • Strong experience with ETL pipeline development and Enterprise Data Lakes (Snowflake, Redshift, or AWS S3).
  • Knowledge of Python, SQL, and shell scripting for automation.
  • Familiarity with data modeling, schema design, and performance optimization.
  • Understanding of cloud security, IAM, and cost management best practices.

Additional Preferred Skills

  • Exposure to additional AWS, Azure, or GCP services.
  • Experience with CI/CD pipelines (Jenkins, Git).
  • Knowledge of containerization (Docker, Kubernetes).

If you’re ready to push boundaries and exceed expectations, apply now and be part of our exceptional team at Dialog Axiata!
