· Create and enhance data solutions that enable seamless delivery of data; collect, parse, manage, and analyze large data sets across different domains.
· Apply data warehousing concepts to build a data warehouse for the organization's internal departments.
· Design and develop data pipelines, data ingestion, and ETL processes that are scalable, repeatable, secure, and aligned with stakeholder needs.
· Build data architecture that supports data management strategies and business intelligence efforts for various stakeholders.
· Develop near-real-time and batch ETL processes aligned with business needs; manage and augment data pipelines from raw OLTP databases to data solution structures.
· Document data flow diagrams, security access, data quality, and data availability across all business systems.
• Bachelor's degree in Computer Science, Information Technology, or equivalent.
• Knowledge of fintech and e-payment business functions.
• Proven experience with Linux and Windows Server operating systems and with Oracle, MySQL, MS SQL Server, and MongoDB environments.
• Strong experience in Python.
• Proven experience in ETL development (Apache Spark, Kafka, Airflow, Apache Impala, Hadoop, HDFS).
• Ability to work with ambitious timelines in a dynamic, high-growth environment.
• Strong understanding of ODS, DDS, and data marts.
• Good communication skills with both technical and business audiences.