✦ about
data engineer working at the infrastructure layer,
designing and operating:
- distributed data pipelines
- api → ingestion → warehouse systems
- real-time + batch processing
- dimensional models + transformation layers
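the api → ingestion → warehouse shape above can be sketched in a few lines; this is a minimal illustration, not production code — the api is stubbed with a local paginated function and sqlite stands in for the warehouse, and every name here (`fetch_page`, `ingest`, `load`, `raw_orders`) is hypothetical:

```python
import sqlite3

def fetch_page(cursor=0):
    # stand-in for a paginated api: returns a page of records
    # plus the next cursor, or None when pages are exhausted
    data = [{"id": i, "amount": i * 10} for i in range(cursor, cursor + 2)]
    return data, (cursor + 2 if cursor < 2 else None)

def ingest():
    # walk the cursor until the api signals the end
    records, cursor = [], 0
    while cursor is not None:
        page, cursor = fetch_page(cursor)
        records.extend(page)
    return records

def load(records):
    # land raw records in the warehouse (sqlite as a stand-in)
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_orders (id INTEGER, amount INTEGER)")
    con.executemany("INSERT INTO raw_orders VALUES (:id, :amount)", records)
    return con.execute("SELECT count(*), sum(amount) FROM raw_orders").fetchone()

print(load(ingest()))  # → (4, 60)
```

real pipelines add retries, incremental cursors persisted between runs, and a raw/staging split — the extract → ingest → load separation is the part that carries over.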
✦ recent work
- api-driven ingestion into snowflake using azure + python
- change data capture systems at scale
- airbyte deployment for continuous replication
- ai-ready semantic model design and optimization
- high-performance analytics with duckdb + parquet
✦ approach
abstraction is useful
until it breaks
understand the system beneath it
✦ tools
python · sql · snowflake · duckdb
fastapi · airflow · airbyte
azure · aws · docker