Ali E.
Data Engineer
Ali is a talented Data Engineer with seven years of experience. He has worked in various fields, such as insurance, government projects, and cloud systems.
He has extensive experience with SQL, ETL, Apache Spark, Python, and Airflow. Having worked on both international teams and local government projects, Ali brings the best of both worlds to his versatile skill set.
Main expertise
- Apache Spark 4 years
- Data Analytics 7 years
- Data Engineering 7 years
Other skills
- Python 3 years
- Apache Airflow 3 years
- Google Cloud 3 years
Selected experience
Employment
Data Engineer
GfK (UK) - 4 years
- Used Airflow, SQL, Python, BigQuery, Cloud Storage, and Cloud Run
- Worked on third-party datasets using GCP
- Created and maintained DAGs in Airflow 2.3.0 and built custom operators
- Collected data from third-party APIs (weather, COVID-19, OECD, etc.)
- Created dashboards using Streamlit
- Built a self-service pipeline with GitLab CI/CD to schedule Jupyter notebooks on GCP
Technologies:
- BigQuery
- Streamlit
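The API-collection step above (pulling third-party data such as weather feeds before loading it to BigQuery) can be sketched with the standard library alone. The endpoint, field names, and flattening rules below are illustrative assumptions, not the original pipeline:

```python
import json
from urllib.request import urlopen

def fetch_json(url: str) -> dict:
    """Download a JSON payload from a third-party API (e.g. a weather service)."""
    with urlopen(url) as resp:
        return json.load(resp)

def flatten_observations(payload: dict) -> list[dict]:
    """Flatten a nested API payload into flat rows, one dict per row,
    suitable for a BigQuery load job."""
    station = payload.get("station", "unknown")
    return [
        {"station": station, "ts": obs["ts"], "temp_c": obs["temp_c"]}
        for obs in payload.get("observations", [])
    ]

# Example with a stubbed payload (no network call; shape is hypothetical):
sample = {
    "station": "LHR",
    "observations": [
        {"ts": "2022-01-01T00:00", "temp_c": 4.5},
        {"ts": "2022-01-01T01:00", "temp_c": 4.1},
    ],
}
rows = flatten_observations(sample)
```

The flat rows could then be handed to a BigQuery client load job; the fetch and load halves are kept separate so the transform stays unit-testable.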
Data Engineer
Coca Cola CCI - 1 year 7 months
- Supported supply chain projects
- Imported raw data from legacy systems and files to BigQuery
- Created data pipelines with Cloud Composer (Airflow)
- Wrote APIs to serve data with Flask and deployed them to GCP
- Exported or dumped data to Cloud Storage buckets or Cloud SQL
- Trained AI models using Vertex AI (custom training)
Technologies:
- BigQuery
- Data Analytics
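Dumping data to a bucket, as in the role above, typically means serializing rows as newline-delimited JSON, the format BigQuery exports and Cloud Storage loads commonly use. A minimal stdlib sketch; the row contents are illustrative:

```python
import json

def rows_to_ndjson(rows: list[dict]) -> str:
    """Serialize rows as newline-delimited JSON (one JSON object per line),
    the interchange format commonly used for BigQuery/Cloud Storage dumps."""
    return "\n".join(json.dumps(row, sort_keys=True) for row in rows)

# Hypothetical rows, e.g. the output of a BigQuery query:
rows = [{"sku": "A1", "qty": 3}, {"sku": "B2", "qty": 7}]
payload = rows_to_ndjson(rows)
```

The resulting string could then be uploaded to a bucket (for example via the `google-cloud-storage` client's `Blob.upload_from_string`).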
Data Engineer
Tarentum AI - 5 months
- Hepsiburada is one of the largest e-commerce companies in Turkey
- Created reports for Hepsiburada's CFO and finance team
- Imported raw data from RDBMSs to Hadoop with Sqoop scripts
- Created ETL scripts with Hive Query Language
- Read raw data from an external API (OMS) and wrote it to Hive tables with Python 3
- Exported final tables to MySQL with Sqoop
- Scheduled all jobs with Oozie and Hue
- Borusan predictive maintenance project
- Restored sensor data from .bak files to Azure SQL Server
- Cleaned, transformed, and combined data for data scientists with T-SQL
- Stored final data in Blob Storage (Gen2)
- Mounted Gen2 to the Databricks File System
- Generated quick insights using PySpark (Databricks notebooks)
Technologies:
- Hadoop
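The import-then-transform pattern above (Sqoop lands raw tables, Hive QL aggregates them into final reporting tables, Sqoop exports the result) can be sketched in plain SQL. Here `sqlite3` stands in for Hive, and the table and column names are illustrative, not the original schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: what a Sqoop import from the source RDBMS would land.
cur.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL, status TEXT)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 100.0, "PAID"), (2, 50.0, "CANCELLED"), (3, 75.0, "PAID")],
)

# ETL step: the kind of aggregation a Hive QL script would perform,
# producing a final table later exported to MySQL with Sqoop.
cur.execute(
    """
    CREATE TABLE finance_report AS
    SELECT status, COUNT(*) AS n_orders, SUM(amount) AS total
    FROM raw_orders
    GROUP BY status
    """
)
report = cur.execute(
    "SELECT status, n_orders, total FROM finance_report ORDER BY status"
).fetchall()
```

In the real stack each stage runs as a separate Oozie-scheduled job; the staging/final split keeps raw data replayable when a transform needs rerunning.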
Big Data Engineer
TFKB - Turkish Finance Participation Bank - 1 year 6 months
- Set up Hadoop ecosystems (HDP 3.1.0)
- Installed an HDF 3.3 cluster on the existing HDP cluster
- Managed the cluster (enabled HA and Kerberos)
- Transferred data to Hadoop
- Developed and tuned Hive tables
- Implemented data encryption and masking (Ranger policies)
- Reviewed database scripts of over 40 agile development teams
- Defined and maintained naming standards for database objects
Technologies:
- Apache NiFi
- Hadoop
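The Ranger masking policies mentioned above apply column transformations such as "show last 4" to sensitive fields. A minimal Python sketch of that masking rule; the function is an illustration of the transformation, not Ranger's implementation:

```python
def mask_show_last_4(value: str, mask_char: str = "*") -> str:
    """Mask all but the last four characters of a sensitive value,
    mirroring a 'partial mask: show last 4' column-masking policy."""
    if len(value) <= 4:
        return value
    return mask_char * (len(value) - 4) + value[-4:]

masked = mask_show_last_4("4111111111111111")
# e.g. a card number becomes "************1111"
```

In Ranger the mask is enforced at query time per user or group, so analysts see the masked column while privileged roles see the raw value.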
Data Engineer
Insurance Information and Monitoring Centre - 7 months
- Transferred data from RDBMSs to HDFS
- Developed new ETL jobs
- Developed and tuned Hive tables
- Secured data and the cluster with Ranger
Technologies:
- Hadoop
DWH and BI Specialist
Insurance Information and Monitoring Centre - 9 months
- Set up Hadoop ecosystems (HDP 2.6.3)
- Maintained additional services such as Solr and Banana
- Managed the cluster
Technologies:
- Hadoop
DWH and BI Specialist
Insurance Information and Monitoring Centre - 2 years 3 months
- Designed, developed, and maintained the DWH architecture
- Created and improved ETL processes with ODI 11g and ODI 12c
- Developed business intelligence reports with Cognos and QlikView
- Produced ad-hoc reports
Technologies:
- QlikView
- Oracle
- ELT
- Data Analytics
- PL/SQL
- Cognos
Education
Standalone course, Industrial Engineering
Mondragón Unibertsitatea · 2013 - 2014
BSc., Industrial Engineering
Gaziantep University · 2009 - 2014