Senior Data Engineer

Job Description

Job Purpose

Drive the design, deployment, and optimization of data pipelines that handle a wide variety of structured and unstructured data sources, with exposure to the latest data platform technologies in the AWS ecosystem. Implement data ingestion, continuous integration, monitoring, and orchestration on the cloud for the Mondia business entity.

Roles and Responsibilities

·  Collaborate with a cross-functional team of data scientists, data engineers, software developers, and other key stakeholders working within an agile environment to create data products and enrich Mondia's data ecosystem.

·  Assist the team in the successful execution and performance optimization of the cloud data warehouse, as well as cost estimation of serverless cloud components.

·  Design, construct, install, and maintain data management systems using Spark/PySpark, AWS Glue, Dataflow, or similar cloud ETL tools.

·  Implement data orchestration, workflows, and ETL scheduling with tools such as Apache Airflow, Luigi, and AWS Step Functions (a minimal illustration follows this list).

·  Recommend different ways to constantly improve data reliability and quality.

·  Employ an array of programming languages and tools to connect systems.

·  Communicate results and ideas clearly within the team.

·  Communicate effectively with all levels of the organization.

·  Comply with Mondia policies and procedures and support Mondia's mission and vision.

·  Perform other job-related duties as assigned by direct manager.
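
Purely as an illustration of the orchestration and scheduling work mentioned above (not a description of Mondia's actual pipelines), here is a minimal sketch of a daily Airflow DAG in Python; the DAG id, task ids, and the extract/load callables are hypothetical placeholders.

```python
# Minimal, illustrative Airflow DAG; dag_id, task ids, and the
# extract/load callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_raw_events(**context):
    # Placeholder: pull raw events from a source system into staging.
    print("extracting raw events for", context["ds"])


def load_to_warehouse(**context):
    # Placeholder: load the curated data into the warehouse.
    print("loading curated data for", context["ds"])


with DAG(
    dag_id="example_daily_ingest",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_raw_events)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> load  # simple linear dependency
```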

Job Benefits

  1. Opportunity to work for a dynamic international company with a flat hierarchical structure, where your voice matters and your impact is seen.
  2. The company will contribute up to EUR 25 per month towards staff perks.
  3. A company bonus scheme, applicable as per the bonus scheme rules.
  4. EUR-equivalent salaries paid in EGP.

Job Requirements

Behavioral Skills:

Accountability and Ownership

Communication

Analytical Thinking

Attention to Detail

Result Focus (Delivering Results)

Problem Solving

Relationship Building

Organizational Commitment

Technical Competencies/Skills 

·  Hands-on experience with Glue, Lambda, Step Functions, Redshift, DynamoDB, CloudWatch, and IAM; strong understanding of data lakes, warehouses, and cloud-native architecture on AWS

·  Proficient in building and managing ETL pipelines using Airflow, deploying scalable data services on EC2 and ECS, and leveraging serverless architectures

·  Advanced proficiency in Python, with solid skills in SQL and Shell scripting for automation, data transformation, and workflow management (see the Python sketch after this list)

·  Fluent in English with excellent reporting skills; proven ability to track analytics, monitor KPIs, and translate data into actionable insights
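
As with the earlier DAG, the following is only an illustrative sketch of the Python/PySpark transformation skills listed above; the bucket names, paths, and column names are assumed placeholders, not real Mondia resources.

```python
# Illustrative PySpark job: read raw JSON from S3, apply a basic quality
# filter, and write partitioned Parquet. Bucket, paths, and columns are
# hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_curate_events").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/")        # assumed path

curated = (
    raw
    .filter(F.col("event_id").isNotNull())                      # drop incomplete rows
    .withColumn("event_date", F.to_date("event_timestamp"))     # derive partition column
)

(
    curated.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/events/")             # assumed path
)
```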

Education

Bachelor’s degree in Computer Science/Engineering or Statistics.

Experience

3+ years of professional experience in Data Engineering or Data Warehousing with an integrative perspective, spanning management and operations, and hands-on experience with cloud architecture and cloud technologies such as AWS, Azure, or Google Cloud Platform (GCP).