- Develop and maintain ETL workflows for efficient data processing and integration.
- Design and implement data pipelines using Python, Databricks, and Snowflake.
- Manage and optimize ETL processes to ensure data accuracy and reliability.
- Collaborate with cross-functional teams to identify data needs and deliver effective solutions.
- Monitor and troubleshoot data pipeline performance, proactively addressing any issues.
- Document data integration processes and workflows for clear communication across teams.
- Stay updated on industry trends and emerging technologies to enhance our data solutions.
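The ETL responsibilities above follow a standard extract → transform → load flow, which can be sketched minimally as below. All function, field, and table names here are illustrative; a real pipeline would pull from production sources and load into Snowflake or Databricks via their connectors rather than using in-memory lists:

```python
# Toy sketch of an extract -> transform -> load (ETL) pipeline.
# Names and data are hypothetical; production code would use real
# source/target connectors (e.g. Snowflake, Databricks) instead.

def extract():
    """Pull raw records from a source system (here, hard-coded rows)."""
    return [
        {"id": 1, "amount": "10.50", "region": "us"},
        {"id": 2, "amount": "3.25", "region": "eu"},
        {"id": 3, "amount": None, "region": "us"},  # malformed record
    ]

def transform(rows):
    """Validate and normalize: drop rows missing amounts, cast types."""
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # in practice, route to a dead-letter table instead
        clean.append({
            "id": row["id"],
            "amount": float(row["amount"]),
            "region": row["region"].upper(),
        })
    return clean

def load(rows, target):
    """Append transformed rows to the target store (here, a list)."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded_count = load(transform(extract()), warehouse)
```

Separating the three stages this way keeps each step independently testable, which also makes the monitoring and troubleshooting duties above easier to carry out.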