With 9 years of experience, I architect and scale robust backend systems, data pipelines, and complex system integrations. My expertise covers high-volume data processing (sustained throughput of 3-5 GB/s) and large-scale API workloads (150,000+ requests per day) across multi-cloud environments (AWS, Google Cloud, Azure).
- Backend & Data Engineering: Node.js (JavaScript/TypeScript), Python, Go, Data Pipeline & ETL Architecture, API Design, Microservices, System Integration
- Cloud & DevOps: AWS, Google Cloud, Azure, Docker, CI/CD, Bash Scripting, Multi-Region Architecture
- Databases: MySQL, PostgreSQL, MongoDB, Redis, Elasticsearch
- Frontend: React, AngularJS, HTML/CSS
- High-Throughput Cloud Backend Architecture: Designed and scaled a multi-region backend on AWS to proxy and manage high-volume network traffic, handling sustained loads of 3-5 GB/s by combining Network Load Balancers with proxy fleets in private subnets (see the first sketch after this list).
- Automated ETL System for Investment Analysis: Engineered an automated ETL system in Node.js to track over 8,000 GitHub repositories, processing 150,000+ API requests per day to surface high-growth investment opportunities for a venture capital client (illustrated in the second sketch below).
- Multi-Source Data Integration Pipeline: Architected a Node.js data integration pipeline to consolidate marketing leads from Facebook Ads, Google Ads, and multiple websites into a central MySQL database, enabling comprehensive reporting and analytics in Google Data Studio (third sketch below).
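
To give a flavor of the first project, here is a minimal AWS CDK sketch in TypeScript of the core pattern: an internet-facing Network Load Balancer fronting a proxy tier in private subnets, replicated per region. The stack name, ports, and CDK itself are illustrative assumptions, not the production infrastructure code.

```typescript
import { App, Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as elbv2 from 'aws-cdk-lib/aws-elasticloadbalancingv2';

class TrafficProxyStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // VPC with public subnets for the load balancer and private subnets
    // (with egress) for the proxy instances, spread across availability zones.
    const vpc = new ec2.Vpc(this, 'ProxyVpc', {
      maxAzs: 3,
      subnetConfiguration: [
        { name: 'public', subnetType: ec2.SubnetType.PUBLIC, cidrMask: 24 },
        { name: 'private', subnetType: ec2.SubnetType.PRIVATE_WITH_EGRESS, cidrMask: 24 },
      ],
    });

    // Layer-4 Network Load Balancer: suited to raw TCP proxying at
    // gigabyte-per-second scale, as described in the project above.
    const nlb = new elbv2.NetworkLoadBalancer(this, 'ProxyNlb', {
      vpc,
      internetFacing: true,
    });

    const listener = nlb.addListener('TcpListener', { port: 443 });
    listener.addTargets('ProxyTargets', {
      port: 443,
      // The proxy fleet (e.g. an AutoScalingGroup in the private subnets)
      // would be registered here; omitted to keep the sketch short.
    });
  }
}

// One stack per region; multi-region coverage comes from instantiating
// the same stack in each target region.
new TrafficProxyStack(new App(), 'TrafficProxyStack-us-east-1', {
  env: { region: 'us-east-1' },
});
```
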
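For the GitHub ETL project, the sketch below shows only the shape of the extract step against the GitHub REST API, assuming Node 18+ (built-in fetch). The repo list, token handling, and load target are placeholders; at 8,000+ repositories and 150,000+ requests per day, batching and rate-limit handling (a single token allows 5,000 authenticated REST requests per hour) are where the real engineering work lies.

```typescript
interface RepoSnapshot {
  fullName: string;
  stars: number;
  forks: number;
  openIssues: number;
  fetchedAt: string;
}

// Fetch one repository's headline metrics from the GitHub REST API.
async function fetchRepoSnapshot(fullName: string, token: string): Promise<RepoSnapshot> {
  const res = await fetch(`https://api.github.com/repos/${fullName}`, {
    headers: {
      Accept: 'application/vnd.github+json',
      Authorization: `Bearer ${token}`,
    },
  });
  if (!res.ok) {
    throw new Error(`GitHub API returned ${res.status} for ${fullName}`);
  }
  const repo = (await res.json()) as {
    stargazers_count: number;
    forks_count: number;
    open_issues_count: number;
  };
  return {
    fullName,
    stars: repo.stargazers_count,
    forks: repo.forks_count,
    openIssues: repo.open_issues_count,
    fetchedAt: new Date().toISOString(),
  };
}

// Sequential loop for clarity only; the production pipeline would need
// concurrency control and rate-limit-aware scheduling.
async function runExtract(repos: string[], token: string): Promise<RepoSnapshot[]> {
  const snapshots: RepoSnapshot[] = [];
  for (const name of repos) {
    snapshots.push(await fetchRepoSnapshot(name, token));
  }
  return snapshots;
}
```
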
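For the lead integration pipeline, the sketch below illustrates the normalize-and-load step using the mysql2 driver. The unified `Lead` shape, the `leads` table, and the Facebook Ads field mapping are hypothetical examples, not the client's actual schemas; the point is that mapping every source into one record shape is what makes the downstream Google Data Studio reporting possible.

```typescript
import mysql from 'mysql2/promise';

type LeadSource = 'facebook_ads' | 'google_ads' | 'website';

interface Lead {
  source: LeadSource;
  externalId: string;
  email: string;
  campaign: string | null;
  createdAt: Date;
}

// Example normalizer: map a (simplified) Facebook Ads lead payload into the
// unified Lead shape. Google Ads and website forms get their own mappers.
function fromFacebookAds(raw: {
  id: string;
  email: string;
  campaign_name?: string;
  created_time: string;
}): Lead {
  return {
    source: 'facebook_ads',
    externalId: raw.id,
    email: raw.email,
    campaign: raw.campaign_name ?? null,
    createdAt: new Date(raw.created_time),
  };
}

// Load step: idempotent upsert keyed on (source, external_id) so re-running
// the pipeline never duplicates a lead.
async function loadLeads(leads: Lead[]): Promise<void> {
  const pool = mysql.createPool({
    host: process.env.DB_HOST,
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    database: 'marketing',
  });
  const sql = `INSERT INTO leads (source, external_id, email, campaign, created_at)
               VALUES (?, ?, ?, ?, ?)
               ON DUPLICATE KEY UPDATE email = VALUES(email), campaign = VALUES(campaign)`;
  for (const lead of leads) {
    await pool.execute(sql, [lead.source, lead.externalId, lead.email, lead.campaign, lead.createdAt]);
  }
  await pool.end();
}
```
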
I'm currently available for freelance consulting and full-time remote opportunities. Feel free to reach out via LinkedIn or email.