Data Engineer
Location
HO/ Corporate Building - 2nd Floor
Closing Date
10/31/2025
Description

At Dialog Axiata, we push boundaries to deliver exceptional performance. We are on the lookout for a Data Engineer to join the Group Analytics and AI division, where you will play a key role in supporting the enterprise data management platforms and delivering high-quality data to enhance insights and drive business impact.

The Job

  • Support ETL operations on big data platforms (AWS Athena, AWS S3, Snowflake), ensuring high data accuracy and SLA adherence.
  • Troubleshoot, monitor, and resolve data quality issues, while collaborating with source owners to ensure on-time availability of data for business units.
  • Gain comprehensive end-to-end knowledge of the Collibra data management platform.
  • Configure and support users on the data management platform, conduct staff training, and monitor system performance.
  • Understand business requirements and data structures to develop complex SQL-based data quality rules for implementation in the Data Quality platform.
  • Support and troubleshoot data lineage and perform root-cause analysis (RCA) on data issues.
  • Support, maintain, and report on the progress of Business Glossaries and Data Dictionaries using the data management platform.
  • Develop Tableau data sources and dashboards adhering to industry standards.
  • Develop AWS Lambda functions that integrate with various APIs to extract and load data.
  • Develop stored procedures, pipelines, and data extractions to load data from various AWS services into Snowflake.
  • Develop and configure Airflow-based ETL workflows to automate third-party data quality (DQ) job executions.
  • Engage in R&D initiatives to stay aligned with industry best practices and the latest developments, delivering innovative data solutions.
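To illustrate the kind of SQL-based data quality rule this role develops, here is a minimal sketch using Python's built-in sqlite3. The table, columns, and completeness check are hypothetical examples; production rules would run against the enterprise Data Quality platform, not SQLite.

```python
import sqlite3

# Hypothetical subscriber table used only to demonstrate a completeness rule.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE subscribers (msisdn TEXT, activation_date TEXT);
    INSERT INTO subscribers VALUES
        ('94770000001', '2025-01-10'),
        ('94770000002', NULL),
        (NULL,          '2025-02-03');
""")

# Completeness rule: both key fields must be populated.
failed = conn.execute("""
    SELECT COUNT(*) FROM subscribers
    WHERE msisdn IS NULL OR activation_date IS NULL
""").fetchone()[0]

total = conn.execute("SELECT COUNT(*) FROM subscribers").fetchone()[0]
pass_rate = 100.0 * (total - failed) / total
print(f"{failed} of {total} rows failed the rule; pass rate {pass_rate:.1f}%")
```

In a governed DQ setup, a rule like this would be registered with a threshold (e.g. a minimum pass rate) and scheduled, with failures routed back to the source owners mentioned above.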
Entry Requirements

The Person

  • Education: BSc in Computer Engineering, IT, or a related field.
  • Knowledge of Tableau development, data modelling, and analytics.
  • Proficiency in SQL scripting (PL/SQL, T-SQL) and Linux shell scripting.
  • Python programming knowledge is mandatory; AWS/API programming or Java will be an advantage.
  • Working knowledge of Airflow; experience with ETL tools and big data technologies (Hadoop, Impala, Hive) will be an advantage.
  • Exposure to RDBMS, NoSQL, and BI tools (QLIK, SSRS, Power BI).
  • Experience: 8-24 months in a related role; certifications in AWS, DWH, BI, Linux, or data analytics will be an advantage.
  • Good understanding of data quality, metadata, data management systems, and related concepts.
  • Good interpersonal, teamwork, and communication skills.
  • Highly motivated, self-driven, able to perform tasks with minimal supervision, and capable of meeting deadlines and targets.
  • Prepared to take on challenges, passionate about researching and learning new technologies to stay ahead of the curve.