Role Overview
We are seeking a Senior Data Engineer to build and operate enterprise-grade data pipelines using Microsoft Azure as the orchestration layer and Snowflake as the enterprise data lake/warehouse. The role focuses on reliable ETL/ELT, push/pull data integration patterns, and scalable platform engineering in a GxP-regulated, validated environment.
Key Responsibilities
- Design and build batch and incremental data pipelines using Azure-native orchestration services to move data into and out of Snowflake.
- Support push and pull architectures across files, APIs, enterprise applications, and external partners.
- Develop curated, consumption-ready datasets in Snowflake using standardized, reusable patterns.
- Implement data quality and reconciliation controls within pipelines.
- Follow CI/CD and controlled deployment practices across Dev/QA/Prod environments.
- Ensure solutions meet GxP expectations for data integrity, traceability, auditability, and change control.
- Partner with Quality, Validation, and platform teams to support compliance and audit readiness.
- Enable data catalog, metadata, and governance: publish and maintain data assets, schemas, and lineage; support certified datasets, data ownership, and usage transparency.
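For illustration only (this posting does not prescribe an implementation), the data quality and reconciliation controls described above might look like the following minimal sketch; the `LoadStats` type, `reconcile` function, and checksum approach are hypothetical names chosen for this example, not part of any specific platform:

```python
# Hypothetical reconciliation control: after a pipeline load, compare
# source-side and target-side row counts and a simple checksum
# (e.g. the sum of a numeric business key) and report mismatches.
from dataclasses import dataclass


@dataclass
class LoadStats:
    row_count: int
    checksum: int  # illustrative: sum of a numeric business key


def reconcile(source: LoadStats, target: LoadStats) -> list[str]:
    """Return a list of reconciliation failures; an empty list means pass."""
    failures: list[str] = []
    if source.row_count != target.row_count:
        failures.append(
            f"row count mismatch: source={source.row_count}, "
            f"target={target.row_count}"
        )
    if source.checksum != target.checksum:
        failures.append(
            f"checksum mismatch: source={source.checksum}, "
            f"target={target.checksum}"
        )
    return failures
```

In a validated environment, the failure list would typically be persisted to an audit log and block downstream promotion rather than being silently discarded.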
Requirements
- 10+ years of hands-on data engineering experience.
- Strong experience with Microsoft Azure orchestration/integration services.
- Strong experience with Snowflake (data lake/warehouse).
- Proven ETL/ELT and push/pull integration pattern experience.
- Experience in regulated/validated (GxP) environments.
- Strong SQL and data engineering fundamentals.
- Experience with enterprise data platforms and multi-domain data architectures.
- Familiarity with CI/CD, role-based access control, and secure data sharing.
- Experience implementing and working with data catalog tools.
- Experience integrating ERP, manufacturing, or other regulated systems.