Job Description
- Collect, clean, and transform structured and unstructured datasets from multiple sources (databases, APIs, data warehouses, and flat files) for analysis.
- Develop, maintain, and optimize SQL queries, stored procedures, and ETL pipelines to ensure reliable data flows.
- Perform statistical analysis, hypothesis testing, and predictive modeling to extract actionable insights and support decision-making.
- Create advanced dashboards and reports using tools such as Power BI, Tableau, or Looker, ensuring KPIs are tracked and visualized effectively.
- Collaborate with data engineers and business stakeholders to define data requirements and ensure alignment between technical outputs and business needs.
- Apply data mining, clustering, and regression techniques to detect patterns, trends, and anomalies across large datasets.
- Document methodologies, maintain reproducibility of analysis, and adhere to best practices for version control and code management (e.g., Git).
Job Requirements
- A degree in computer science, data science, or another relevant field; a master's degree is a plus.
- 4 years of experience in a relevant field.
- Data Engineering & Querying: Strong proficiency in SQL, including writing and optimizing queries and stored procedures.
- Visualization & Reporting: Advanced skills in BI tools (Power BI, Tableau, Looker, or equivalent) and the ability to design performance-optimized dashboards.
- Data Wrangling: Ability to handle raw, messy data, including cleaning, normalization, feature engineering, and managing large datasets with attention to performance.
- Cloud & Analytics Tools: Familiarity with cloud platforms (GCP BigQuery, AWS Redshift, Azure Synapse) and distributed data systems (Spark, Hadoop) is a plus; hands-on experience with Dataiku is also a plus.