Data Engineer

Evolus
Full-time
Remote friendly (Newport Beach, CA)
United States
$114,000 – $142,000 USD per year
IT

Role Summary

The Data Engineer integrates data from multiple sources into a common warehouse data model, builds and maintains ELT data pipelines, and contributes to data models in a DataOps environment. The role collaborates with business and analytics teams to drive value and support data and analytics initiatives across Evolus's global functions.

Responsibilities

  • Collaborate with team members to collect business requirements, define successful analytics outcomes, and design data models
  • Design and develop the Snowflake data warehouse using dbt or other ELT tools to extend the Enterprise Dimensional Model (illustrated in the sketch following this list)
  • Contribute to planning and prioritization discussions
  • Break down and architect complex data engineering problems to deliver insights that meet or exceed business needs
  • Own and deliver solutions end to end, from ingestion of sources to data products for end-user consumption, and from conceptual iteration through production support
  • Deliver and ensure sustained performance of all data engineering pipelines, remediating issues as required
  • Own source code management, documentation (technical and end user), and release planning for data engineering products; participate in DataOps, DevOps, and CI/CD to deliver reliable, tested, and scalable functionality through automation
  • Identify and proactively manage risks to the data engineering platform
  • Office location: Newport Beach, CA. Hybrid schedule: Monday and Friday remote; Tuesday through Thursday onsite
  • Other duties as assigned
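
To make the dbt and pipeline bullets above concrete, here is a minimal sketch of an Apache Airflow DAG that orchestrates one daily ELT run: ingest raw sources, then run dbt to rebuild Snowflake models. Everything here is hypothetical; the dag_id, the ingest_sources.py script, and the /opt/dbt project path are illustrative placeholders, not Evolus's actual setup.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical daily ELT pipeline: extract/load first, then transform.
    with DAG(
        dag_id="daily_elt_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest = BashOperator(
            task_id="ingest_sources",
            # Placeholder; in practice this might trigger a Fivetran or
            # Stitch sync rather than a custom script.
            bash_command="python /opt/pipelines/ingest_sources.py",
        )

        transform = BashOperator(
            task_id="run_dbt_models",
            # dbt rebuilds the Snowflake models of the dimensional model.
            bash_command="dbt run --project-dir /opt/dbt",
        )

        ingest >> transform  # transform runs only after ingestion succeeds

Running dbt as a shell command keeps the sketch self-contained; a real deployment might use a dedicated dbt operator or dbt Cloud's API instead.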

Qualifications

  • Bachelor’s degree required
  • 6+ years of experience in enterprise data solutions
  • 4+ years in cloud-based data warehousing with strong SQL and Snowflake experience
  • Experience building data pipelines using Python and data orchestration tools like Apache Airflow
  • Experience with data extraction/transformation/orchestration tools such as Fivetran, dbt, Datafold, Prefect, Kafka, Stitch, and Matillion
  • Deep understanding of data analysis, data modeling for visualization, and reporting
  • Experience with DataOps practices, version control in Git or Azure DevOps, and CI/CD pipelines
  • Demonstrated experience in domains such as healthcare, marketing, finance, sales, product, customer success, or engineering
  • Experience performing root cause analysis on production issues and identifying opportunities for improvement (see the sketch following this list)
  • Strong coding practices with clean, well-documented code and ability to perform code reviews
  • Keen attention to detail in planning, organization, and execution while understanding the big picture
  • Excellent communication and interpersonal skills
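
As a hypothetical illustration of the SQL/Snowflake and root-cause-analysis items above, a quick diagnostic query run through the snowflake-connector-python package can surface silent load failures. The connection parameters and the fct_orders table below are placeholders, not real Evolus objects.

    import snowflake.connector  # pip install snowflake-connector-python

    # Placeholder credentials; a real pipeline would pull these from a
    # secrets manager, never from hard-coded strings.
    conn = snowflake.connector.connect(
        account="myorg-myaccount",
        user="ETL_SERVICE",
        password="...",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="MARTS",
    )
    try:
        cur = conn.cursor()
        # Row counts by load date: a sudden drop to zero is often the
        # first visible symptom of an upstream pipeline failure.
        cur.execute("""
            SELECT loaded_at::date AS load_date, COUNT(*) AS row_count
            FROM fct_orders  -- hypothetical fact table
            GROUP BY load_date
            ORDER BY load_date DESC
            LIMIT 7
        """)
        for load_date, row_count in cur.fetchall():
            print(load_date, row_count)
    finally:
        conn.close()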

Skills

  • Snowflake data warehousing
  • dbt and ELT tooling
  • Python and data orchestration (Airflow)
  • Data modeling, analytics, and visualization
  • DataOps, Git/Azure DevOps, CI/CD
  • Strong collaboration and stakeholder management

Education

  • Bachelor’s degree required