About Me
- Engineering leader with 9+ years of experience across R&D, ML/AI, and Data Platform engineering, including 4+ years leading high-performing teams across data, ML, and AI domains.
- Experienced in building and scaling engineering organizations that deliver robust Data Platforms, ML infrastructure, and production-grade systems.
- Adept at leading technical strategy, growing high-performing teams, and partnering with product and research to deliver data-driven solutions at scale. Passionate about fostering a culture of excellence, innovation, and continuous improvement in fast-paced, evolving environments.
Qualifications
- People & Teams Leadership: Cross-Functional Collaboration, Performance Management, Technical Leadership, Goal Setting & Alignment, Engineering Culture Development
- Technical Leadership: System Design, High-Load Architecture, Data Platform Engineering, Data Infrastructure, ML Platforms, MLOps, Reliability Engineering
- Tech Stack & Tools: Python, SQL, Airflow, FastAPI, AWS Athena, dbt, Kafka, Flink, Kubernetes, Terraform, TensorFlow, PyTorch, Kubeflow, Gitlab CI/CD, AWS, GCP
Experience
- Leading 5 engineering teams comprising 30+ engineers across Data Platform, ML and DevOps disciplines.
- Setting strategic direction for engineering and infrastructure projects aligned with company-wide objectives.
- Driving architectural decisions and operational excellence for high-load, real-time data systems.
- Managing team leads, fostering cross-team collaboration, and scaling engineering practices to support high-growth environments.
- Collaborating closely with product, analytics, and executive leadership to prioritize initiatives and deliver business value. Own the technical solution for the entire data platform and the services built on top of it.
Technologies: AWS, Tableau, Airflow, Spark, Flink, dbt, Python, Java/Scala, FastAPI, Kubernetes, Terraform, Redis, Kafka, Generative AI.
Engineering Manager
May 2022 — Dec 2024
- Led two teams (12 engineers).
- Designed and built a high-load, distributed recommendation platform, increasing GGR by 8%.
- Deployed ML models to production as web services, real-time jobs, and batch jobs.
- Designed, built and deployed multiple generative AI products (text & image generation) using fine-tuned and diffusion models.
- Served as a course creator and lecturer in the company's Data Engineering Academy for 30+ students; more than half of the students became interns within the company.
- Owned the technical solution for the core data platform team.
- Designed and built a Terraform-based DSL for managing Athena table permissions, accelerating delivery of role/permission changes by up to 3x.
- Designed and built a DSL for implementing Row-Level Security on AWS.
Technologies: AWS, Airflow, Python, FastAPI, Kubernetes, Terraform, Redis, Kafka, Tableau.
Senior Data Engineer
June 2021 — April 2022
- Designed and implemented a data processing platform based on Airflow and AWS, replacing a legacy DWH built on NiFi and PostgreSQL.
- Implemented a DSL for defining ETL transforms on AWS Athena.
- Implemented an everything-as-code approach across the platform.
- Completely reworked the existing architecture with the team in response to rapid business changes.
Technologies: AWS, Airflow, Python, Kubernetes, Terraform, Kafka, Tableau.
SoftServe
https://www.softserveinc.com/
Lead Data Engineer
June 2021 — October 2021
Senior Data Engineer
October 2020 — May 2021
Client: Dyson
- Designed and implemented scalable data pipelines to ingest and process advertising data from Facebook, Google, Twitter, Pinterest, and other sources.
- Built reliable aggregates and analytical datasets in BigQuery, enabling marketing and business teams to analyze performance across channels.
- Developed and maintained batch and streaming workflows using Airflow, DataFlow, and Cloud Functions.
- Worked across functional teams (data analysts, marketing, product) to align delivery with business KPIs.
Technologies: Python, SQL, GCP, BigQuery, Airflow, DataFlow, Terraform, Jenkins.
Worked (outstaff/outsource) on client engagements across Data, DS, and ML domains.
- Client/Project name: KeplerFI
- Designed and implemented a distributed forecasting service to evaluate and serve multiple time series prediction algorithms (ARIMA, Prophet, custom models).
- Worked closely with data scientists and academic collaborators to translate research prototypes into scalable backend components.
- Created a lightweight visualization dashboard to inspect forecast outputs, model comparisons, and performance metrics.
- Ensured reproducibility, fault tolerance, and parallel processing via asynchronous task queues and containerized services.
Technologies: Python, FastAPI, Kubernetes, Redis, Pandas, NumPy, sklearn, ARIMA, Plotly/Dash.
Worked (outstaff/outsource) on multiple client engagements across Data, DS, and ML domains.
- Client/Project name: Kunai, Ernst & Young
- Implemented a DSL for defining and executing DAG-based workflows with support for parallel task execution.
- Developed scalable REST API for document image processing and text extraction using OCR.
- Developed the service to handle asynchronous requests with support for batch uploads and result polling.
- Contributed to frontend features and UI fixes using Angular.
- Client/Project name: CreatorIQ
- Designed and built multiple applications for automated image generation.
- Developed and maintained data aggregation pipelines using Apache Airflow and Amazon Redshift, enabling internal reporting.
Technologies: Python, C++, Flask, AWS (EKS, Redshift, Lambda), Postgres, MongoDB, Airflow, Kubernetes, Celery, TensorFlow, PyTorch.
- Designed and implemented backend services for key social network features, including user profiles, DMs, group chats, feeds, friend relations, comments, posts, and photo/audio/video content.
- Modeled complex interrelations between data entities using advanced data structures and custom algorithms.
- Optimized database queries and indexing strategies to meet strict non-functional requirements (NFRs) for latency, scalability, and consistency.
- Developed secure APIs for payments, content moderation, and user authentication.
- Worked closely with frontend developers and product stakeholders to ship new features in a fast-paced startup environment.
Technologies: Python, Django, REST API, PostgreSQL, Redis, Celery, Docker, AWS.
- Developed robust data pipelines to collect, clean, and normalize heterogeneous data from public APIs, web scraping, and file-based datasets (CSV, JSON, images).
- Integrated external datasets on crime rates, pollution, transport, education, and more into a geospatial search interface.
- Applied image recognition techniques to extract features (e.g., bus stations, crosswalks) from satellite or street-level imagery.
Technologies: Python, BeautifulSoup/Scrapy, OpenCV, NumPy, Pandas, matplotlib, Postgres/PostGIS, Docker.
Education
Lviv Polytechnic National University
Master’s degree, Applied Mathematics
2014 - 2019
Certifications
Google Cloud Certified Professional Data Engineer
2020
Languages
Ukrainian: native, English: fluent, Russian: fluent