Role Summary
Senior Data Product Engineer for Allergan Aesthetics Tech. You will collaborate with cross-functional partners to build data products, develop and optimize ETL processes, and implement data governance and quality practices. You will expose data products via APIs and microservices and contribute to architecture decisions to support scalable, secure data platforms.
Responsibilities
- Collaborate with cross-functional partners (Product Managers, Data Scientists, Machine Learning Engineers, Software Engineers, Business teams) to build data products.
- Develop, optimize, & maintain complex ETL processes for moving & transforming data.
- Champion code quality, reusability, scalability, & security, & help make strategic architecture decisions.
- Implement processes & tools to ensure data quality, enforce data governance policies & engineering best practices.
- Develop APIs & microservices to expose & integrate data products with software systems.
Qualifications
- Required: BS in Computer Science, Mathematics, Statistics, Engineering, Electronics, or other quantitative field or foreign education equivalent; 5 years of experience as a Data Engineer.
- Required: 5 years of experience programming in Python and/or Java using object-oriented principles, and building batch and/or streaming pipelines with complex SQL transformations, DataFrame manipulations, and big data technologies.
- Required: experience applying data quality checks and building data monitoring solutions.
- Required: experience leveraging Git to manage complex codebases.
- Required: experience building CI/CD pipelines using at least one of the following tools: GitHub Actions, CircleCI, Jenkins, GitLab CI.
- Required: experience working in a matrixed organization and preparing written and oral presentations to managers, peers, and business stakeholders.
- Required: 4 years of experience applying relational and dimensional data modeling concepts to build data products, and architecting solutions on AWS or equivalent public cloud platforms.
- Required: 2 years of experience orchestrating complex workflows and data pipelines using Airflow, Step Functions, or dbt; developing & deploying solutions using Docker & Kubernetes; and developing data APIs, microservices, or event-driven systems to integrate data products with other software systems.
Education
- BS in Computer Science, Mathematics, Statistics, Engineering, Electronics, or other quantitative field or foreign education equivalent.
Skills
- Data engineering
- ETL design and optimization
- SQL, DataFrames, big data technologies
- Python and/or Java
- Git
- CI/CD tooling (GitHub Actions, CircleCI, Jenkins, GitLab CI)
- AWS or equivalent cloud platforms
- Data modeling (relational and dimensional)
- Workflow orchestration (Airflow, Step Functions, dbt)
- Docker & Kubernetes
- API and microservice development