Mohamed K.
Data Engineer
Mohamed is a skilled Data Engineer with expertise in building data pipelines, real-time systems, and cloud orchestration. Proficient in technologies such as BigQuery, Snowflake, dbt, Kubernetes, Kafka, and Airflow, he consistently delivers efficient, scalable, and high-performing solutions.
At Weight Watchers, Mohamed improved ETL reliability by implementing CI/CD pipelines and automated monitoring systems. At Swvl, he developed a fraud detection system that resulted in significant cost savings and automated the migration of databases into BigQuery. Earlier in his career at Eventum IT Solutions, he worked as a full-stack developer, where he enhanced system performance and improved user insights.
With a strong foundation in modern data technologies and a proven track record of innovative problem-solving, Mohamed is a valuable asset for any organization seeking expertise in complex data engineering initiatives.
Main expertise
- Docker 3 years
- Java 2 years
- SQL 4 years
Other skills
- JavaScript 2 years
- React.js 2 years
- MySQL 2 years
Selected experience
Employment
Data Warehouse Engineer
Weight Watchers - 3 years 2 months
- Improved data integration, monitoring, and automation processes by implementing CI/CD pipelines, upgrading service environments, and optimizing ETL workflows.
- Utilized cloud orchestration tools to manage the full data life cycle, from extraction to normalization and ingestion into Snowflake and Kafka.
- Developed Kubernetes-based monitoring jobs using Datadog, Snowflake, Python, Prefect Dataflow Automation, and PagerDuty to track and resolve data lag and missing records in ETL jobs, significantly improving data reliability and system stability.
- Set up and automated CI/CD pipelines for multiple services using GitHub Actions and Python scripts, enhancing deployment efficiency and reducing downtime.
- Led the upgrade of Python versions across services from 3.9 to 3.11, ensuring compatibility, performance improvements, and security enhancements.
- Migrated from dbt Cloud to dbt Open Source, designing and implementing a scalable, modular architecture for Snowflake’s data warehouse, which included:
  - Automated metadata-driven pipelines and robust CI/CD workflows using GitHub Actions and dbt tests.
  - Optimized query performance through incremental models, materializations, and strategic partitioning.
  - Reduced onboarding time for new data models, improved execution times, conducted team training sessions, and enabled the analytics team to deliver insights faster and more efficiently.
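The data-lag monitoring described above can be sketched as a simple freshness check; this is an illustrative stand-in only (the actual job used Datadog, Prefect, and PagerDuty), and the table names and thresholds are assumptions:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-table freshness thresholds (assumed for illustration).
LAG_THRESHOLDS = {
    "orders": timedelta(minutes=30),
    "events": timedelta(hours=2),
}

def find_stale_tables(latest_loaded_at: dict, now: datetime) -> list:
    """Return tables whose newest record is older than the allowed lag,
    or which have no loaded records at all."""
    stale = []
    for table, threshold in LAG_THRESHOLDS.items():
        loaded_at = latest_loaded_at.get(table)
        if loaded_at is None or now - loaded_at > threshold:
            stale.append(table)
    return sorted(stale)
```

In a real deployment, the returned list would be emitted as metrics and routed to an alerting tool rather than inspected directly.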
Technologies: MySQL, Docker, Python, Apache Kafka, Kubernetes, SQL, Datadog, Snowflake, dbt
Data Engineer
Swvl - 1 year 3 months
- Worked on a real-time engine that captured live changes in data and enabled automated business decision-making.
- Developed ETL data pipeline projects focused on efficiently moving data into the data warehouse.
- Built a real-time fraud detection system that identified and blocked fraudulent users creating multiple fake accounts to exploit promo codes, saving approximately half a million pounds per month. Technologies used included Debezium, Kafka, Flink, and Java.
- Implemented automated data migration pipelines to transfer data from backend databases (MongoDB, PostgreSQL, MySQL) into BigQuery, handled schema changes seamlessly, and ensured data consistency. Technologies used included Debezium, Kafka, Flink, Airflow, Java, Python, and React.js.
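The core idea behind the fraud detection system, flagging devices that spawn many fake accounts, can be sketched in a few lines. This is a minimal batch stand-in, not the production design (which ran as a streaming job on Debezium, Kafka, and Flink in Java); the event shape and threshold are assumptions:

```python
from collections import defaultdict

# Assumed limit; the real system's rules are not public.
MAX_ACCOUNTS_PER_DEVICE = 3

def flag_fraudulent_devices(signup_events):
    """Given (device_id, account_id) signup events, return the set of
    devices that created more distinct accounts than allowed."""
    accounts = defaultdict(set)
    flagged = set()
    for device_id, account_id in signup_events:
        accounts[device_id].add(account_id)
        if len(accounts[device_id]) > MAX_ACCOUNTS_PER_DEVICE:
            flagged.add(device_id)
    return flagged
```

A streaming engine applies the same per-key aggregation continuously over the change-data-capture feed instead of over a finished list.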
Technologies: JavaScript, React.js, MongoDB, Docker, PostgreSQL, Redis, Java, Python, Apache Kafka, Kubernetes, SQL, Google Cloud, BigQuery, Apache Airflow, Apache Flink, dbt
Software Engineer
Eventum IT Solutions - 1 year 4 months
- Developed a Network Management System that delivered services such as network monitoring, performance analysis, configuration management, and remote device control based on specified trigger events.
- Implemented priority-based job scheduling to enhance system throughput and efficiency.
- Developed user activity tracking features to monitor system issues and analyze user behavior and preferences for improved insights.
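Priority-based job scheduling of the kind described above is commonly built on a heap; the sketch below is illustrative (the class and job shapes are assumptions, not the NMS implementation), using a counter as a tie-breaker so equal-priority jobs run in submission order:

```python
import heapq
import itertools

class PriorityScheduler:
    """Minimal priority scheduler: lower number = higher priority."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-breaker

    def submit(self, priority, job):
        """Queue a zero-argument callable at the given priority."""
        heapq.heappush(self._heap, (priority, next(self._counter), job))

    def run_next(self):
        """Pop and run the highest-priority job; None when idle."""
        if not self._heap:
            return None
        _, _, job = heapq.heappop(self._heap)
        return job()
```

Serving urgent jobs first keeps high-value work (e.g. trigger-event device commands) from queuing behind bulk polling, which is the throughput gain the bullet refers to.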
Technologies: MySQL, JavaScript, Redis, Java, Python, Apache Kafka, Spring Boot, Spring
Education
BSc. Computer and Systems Engineering
Alexandria University, Faculty of Engineering · 2015 - 2020
Find your next developer within days, not months
In a short 25-minute call, we would like to:
- Understand your development needs
- Explain our process to match you with qualified, vetted developers from our network
- Present you with the right candidates, on average within two days of our call
